
WO2025038432A2 - Automatic verification of field-installed systems - Google Patents

Automatic verification of field-installed systems

Info

Publication number
WO2025038432A2
WO2025038432A2 (PCT/US2024/041671)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor data
sensors
verification result
verification
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/041671
Other languages
English (en)
Other versions
WO2025038432A3 (fr)
Inventor
Travis M. KUSTER
Alexander F. Brown
Kevin J. Clark
Nic S. TEJADA
Current Assignee
Frio Controls Inc
Original Assignee
Frio Controls Inc
Priority date
Filing date
Publication date
Application filed by Frio Controls Inc filed Critical Frio Controls Inc
Publication of WO2025038432A2
Publication of WO2025038432A3
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Definitions

  • the present invention relates generally to improving the commissioning and start-up of systems in the field; and more particularly where operational data from the system is analyzed to produce a result that can be used as a verifiable check to confirm proper system installation and operation.
  • the present disclosure provides a system and method to improve commissioning and start-up of systems in the field, where operational data from the system is analyzed to produce a result that can be used as a verifiable check to confirm the system is installed and operating correctly.
  • the disclosure further describes enhancements to methods to ensure proper function of the system over its lifetime.
  • the disclosure describes a system that includes a controller with sensors that operate the system and that capture data from the sensors during a test run.
  • a microprocessor on the controller or a connected cloud platform (the data processing engine), processes the data from the test run and compares the data to the desired operating parameters for the specific type and model of system and controller to provide a verification report or other output indicating whether the system is operating correctly.
  • a system to verify satisfactory installation and operation of a field-installed system where the system includes sensors and a controller, and where the system is configured to operate the system for a set period of time while capturing sensor data; process the sensor data; compare the processed sensor data to desired operating parameters; and produce a verification result.
  • the system can send sensor data to a cloud platform where the sensor data is processed in the cloud, and where the verification result is either sent back to the controller or disseminated via the cloud platform.
  • a user can initiate the verification process from the cloud platform, allowing for a completely remote set-up of the system.
  • the system can be configured so that one or more of the external sensors are configured to sense temperature, where the controller uses weather data from a cloud platform as a comparison to verify correct operation of the external sensors sensing temperature.
  • one or more of the sensors are configured to sense temperature. Sensor data is recorded with the heating system off; the heating system is then turned on for a set period, and the temperature response is recorded and processed to verify the effect of the heating system on the one or more temperature sensors. The result is used to determine whether the one or more sensors are installed in the correct location in the system, or whether the heating system meets a specified heat-up time period.
  • a method to verify satisfactory installation and operation of a field installed system where a controller runs the system and the system includes sensors, and where the method steps comprise operating the system for a set period of time while capturing sensor data; processing the sensor data; comparing the processed sensor data to desired operating parameters; and producing a verification result.
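The claimed steps — operate for a set period while capturing sensor data, process the data, compare it to desired operating parameters, and produce a verification result — can be sketched as a simple verification pass. All function names, data structures, and sample values below are illustrative assumptions, not the patented implementation:

```python
def run_verification(samples, expected_ranges):
    """Compare averaged sensor readings against expected operating ranges.

    samples: dict mapping sensor name -> list of readings captured
             during the test run.
    expected_ranges: dict mapping sensor name -> (low, high) tuple of
             desired operating parameters for this system type/model.
    Returns a verification result with per-criterion status and an
    overall pass/fail (sketch only; structure is an assumption).
    """
    result = {"checks": {}, "passed": True}
    for name, readings in samples.items():
        mean = sum(readings) / len(readings)   # minimal "processing" step
        low, high = expected_ranges[name]
        in_range = low <= mean <= high
        result["checks"][name] = {"value": mean, "in_range": in_range}
        if not in_range:
            result["passed"] = False
    return result

# Hypothetical heat-trace test run with two sensors.
samples = {"pipe_temp_C": [4.8, 5.1, 5.0], "current_A": [12.2, 12.4, 12.3]}
expected = {"pipe_temp_C": (0.0, 10.0), "current_A": (10.0, 14.0)}
report = run_verification(samples, expected)
```

A real system would substitute richer processing (filtering, rate-of-change analysis) and model-specific parameter tables for the simple mean and static ranges shown here.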
  • FIG. 1 is an illustrated system diagram with controller, gateway, mobile tester, and sensors, in accordance with an embodiment of this disclosure.
  • FIG. 2 illustrates an exemplary result report including test result, controller information, test conditions, contextualized test values, expected operating ranges, highlighted issues, troubleshooting recommendations and test identifier, all in accordance with an embodiment of this disclosure.
  • This disclosure provides a system that includes a controller with sensors that operates the system and captures data from the sensors during a test run.
  • a microprocessor on the controller, or on a connected cloud platform processes the data from the test run and compares the data to the desired operating parameters for a specific type and model of system and controller to provide a verification report or other output indicating whether the system is operating correctly.
  • the data processing engine can provide an expected operating range based on the captured data and the typical operating range for the type of system. For example, consider a specific model of air handling unit with a known fan motor and flow rate. For this system, prior testing could be used to establish parameters that indicate proper airflow, such as fan speed, fan power draw, airflow measurement, temperature measurements, etc. The data collected during the verification check can then be compared to these known values to determine if there are any issues that may be causing a deviation from the expected values.
  • the expected operating range may be determined as follows. Temperature is a principal factor in heat trace load. The verification check can report both the measured temperature and the expected range. It does this by first determining an expected operating temperature range, possibly taking into account the type of system and location. Then it looks at the typical range for self-regulating heater cable current and inrush current which is based on operating temperature. Since the temperature at the time of test is known, the result can be placed in the range and the maximum and minimum expected values calculated and presented to the user. If the type of heating cable is known, this process can be done more accurately.
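The temperature-dependent expected range described above can be sketched by interpolating a current-versus-temperature characteristic for a self-regulating cable. The curve points and tolerance below are hypothetical, not values from the disclosure:

```python
def expected_current_range(ambient_c, cable_curve, tolerance=0.15):
    """Estimate the expected steady-state current band for a
    self-regulating heating cable at the measured test temperature.

    cable_curve: list of (temperature_C, nominal_current_A) points,
                 sorted by temperature, taken from prior testing or a
                 cable datasheet (values here are made up).
    Returns (min_expected, max_expected): the interpolated nominal
    current with a symmetric tolerance band (assumed 15%).
    """
    if ambient_c <= cable_curve[0][0]:          # clamp below range
        nominal = cable_curve[0][1]
    elif ambient_c >= cable_curve[-1][0]:       # clamp above range
        nominal = cable_curve[-1][1]
    else:
        for (t0, i0), (t1, i1) in zip(cable_curve, cable_curve[1:]):
            if t0 <= ambient_c <= t1:
                frac = (ambient_c - t0) / (t1 - t0)
                nominal = i0 + frac * (i1 - i0)
                break
    return nominal * (1 - tolerance), nominal * (1 + tolerance)

# Hypothetical curve: a self-regulating cable draws less current when warm.
curve = [(-20.0, 16.0), (0.0, 12.0), (20.0, 8.0)]
lo, hi = expected_current_range(10.0, curve)   # test run at 10 °C
```

If the exact cable type is unknown, a wider generic curve could be used, matching the disclosure's point that a known cable type allows a more accurate range.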
  • the method can be used to ensure sensors are placed in the proper location and configured correctly. For example, consider a pipe heating system in which the temperature sensor must be placed in contact with the pipe that needs to be heated. In this case, the check could turn on the heater once or multiple times to elicit a temperature response in the target sensor. Data analysis would then show whether the temperature sensor registered an increase in temperature during the heating cycle, indicating that the sensor was, in fact, placed on the pipe. This feature could be extended for systems with known expected behavior. In this case, the rate of temperature increase measured by the sensor could be compared to an expected value to further ensure proper sensor placement.
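The heater-response placement check above can be sketched as follows. The minimum-rise threshold and all values are assumptions for illustration:

```python
def sensor_placement_check(before_c, after_c, min_rise_c=2.0):
    """Heater-response placement check (illustrative sketch).

    before_c: readings from the target temperature sensor with the
              heater off; after_c: readings after a timed heating cycle.
    A sensor in contact with the heated pipe should show at least
    min_rise_c of temperature rise (the 2.0 °C default is an assumed
    threshold, not a value from the disclosure).
    """
    baseline = sum(before_c) / len(before_c)
    heated = sum(after_c) / len(after_c)
    rise = heated - baseline
    return {"rise_c": rise, "sensor_on_pipe": rise >= min_rise_c}

# Hypothetical readings before and after one heating cycle.
check = sensor_placement_check([5.0, 5.1, 4.9], [11.8, 12.2, 12.0])
```

For systems with known expected behavior, the same data could instead be fitted for a rate of rise and compared to an expected heat-up rate, as the passage suggests.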
  • the system could also take into account data from external sources such as weather data from either a connected weather station or an information service or API.
  • the weather data could be used in the verification check to establish test conditions and flag instances where operation conditions may differ from test conditions.
  • the weather information could also be used in individual verification checks. For example, if a system measures ambient temperature using a local sensor, the temperature from the local sensor and the temperature from the weather data could be compared, and if the difference is out of an established range this issue could be flagged.
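The local-sensor-versus-weather-data comparison can be sketched in a few lines. The allowed difference is an assumed threshold:

```python
def ambient_sensor_check(local_temp_c, weather_temp_c, max_diff_c=5.0):
    """Flag a local ambient sensor whose reading disagrees with
    weather-service data by more than an allowed margin.
    max_diff_c (5 °C) is an illustrative assumption, not a value
    from the disclosure."""
    diff = abs(local_temp_c - weather_temp_c)
    return {"difference_c": diff, "flagged": diff > max_diff_c}

# A sensor mounted in direct sun might read well above the reported
# ambient temperature (hypothetical values).
flag_result = ambient_sensor_check(local_temp_c=31.0, weather_temp_c=22.0)
```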
  • the result could include a range of expected operating values based on the test value (measured during the check), the test conditions (measured during the test or taken from an external data source), and known operating characteristics of the system.
  • the verification check may also be conducted again at any time after the installation of the system. If the results of the initial check and all subsequent checks are recorded and stored in a database, they may be used as a baseline for the system to identify changes or faults in the system. Verification checks can be automatically run at some set interval or according to a pre-determined schedule to ensure proper function over the lifetime of the system. For example, a system could be set to run a check once a day. In the case of a freeze protection system which must function through the winter months, the check could be run each fall to ensure the system is ready for the winter heating season. In the case of a snow-melting system that is designed to run during a winter storm, the system could be programmed to run prior to an expected winter storm event. The result could then be provided to the user along with either an assurance that the system is ready, or with a warning that there are issues that need to be resolved prior to the storm event.
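The baseline comparison across stored check results can be sketched as a drift test against the historical mean. The 10% tolerance and the sample numbers are assumptions:

```python
def drift_check(history, latest, rel_tolerance=0.10):
    """Compare the latest verification value for one criterion against
    the baseline formed by earlier stored check results (sketch only).

    history: list of previous test values for the same criterion.
    Flags the criterion when the latest value drifts more than
    rel_tolerance (assumed 10%) from the historical mean.
    """
    baseline = sum(history) / len(history)
    drift = abs(latest - baseline) / baseline
    return {"baseline": baseline, "drift": drift,
            "flagged": drift > rel_tolerance}

# A fall readiness check for a freeze-protection circuit: measured
# current has dropped noticeably versus prior runs (hypothetical numbers).
status = drift_check([12.0, 12.2, 11.8], latest=10.0)
```

A scheduler (daily interval, each fall, or triggered by a storm forecast) would simply call the check and attach this drift analysis to the result delivered to the user.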
  • the verification check result can include additional information that may be helpful to the various users of the system. For example, a check could help identify installation conditions that do not match the design drawings. To do this, an engineer or system designer would upload the expected system design including verifiable parameters such as expected heater load, in the case of a heating system, or number of nodes, in the case of a battery system. If the check does not align with the expected design values, this would be flagged in the result. In the case of a heat trace system where multiple pieces of heat trace are connected to a single control circuit, this method could be used to determine that the correct number of pieces of the correct length are connected to each circuit. A knowledgeable user could use the information from a number of circuits to correct the case where a piece of heat trace was connected to the wrong circuit.
  • the result could include recommendations to fix or improve the issues. For example, if the check identifies abnormally high ground-fault current in a heat tracing system, it could provide a recommendation to check the heat trace connections, which are a common source of ground-fault current issues.
  • the check could be completed by an external device that gathers information from the controller and either processes the data locally or passes the data to a cloud-platform or local data processing engine.
  • an external tester could be a mobile device that a technician brings to the site during installation or start up to record and process a test run. The mobile tester could record the data and store it. The data would then be uploaded to the processing engine at some later time and processed to create a test result. This method would allow systems in high security areas to be tested without any cloud connection.
  • the mobile tester could include the processing engine and the result could be generated locally.
  • the data from the verification run can be used to suggest certain control or alarm parameters. For example, a low-current alarm on a heat-trace system could be automatically programmed based on the expected current range established during the verification run.
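Deriving an alarm setpoint from the verification run might look like the following. The 20% margin is an illustrative assumption:

```python
def suggest_low_current_alarm(steady_state_a, margin=0.20):
    """Suggest a low-current alarm threshold from the steady-state
    current measured during the verification run. The 20% margin
    below steady state is an assumed value, not one specified in
    the disclosure."""
    return round(steady_state_a * (1 - margin), 2)

# Steady-state current of 12.3 A measured during the test run.
alarm = suggest_low_current_alarm(12.3)
```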
  • the system could flag troublesome control configurations either during start-up or at any other point during the system’s lifetime. This would help reduce programming errors, problems due to tampering, and improper control configurations set by unknowledgeable users.
  • the verification result can include an identifier such as an identification code.
  • a user could enter the identification code via a web portal to access the result and ensure that it matches the result that they have. This method would deter miscreants from falsifying results and would allow users to ensure they have a correct result.
  • the test identifier or some other test ID could also be used to locate test results. For example, a log of test runs could be kept on the controller including a test identifier. This would allow someone with access to the controller to determine if any tests had been run and provide them with a way to track down the results of any tests.
  • FIG. 1 illustrates one example of a system where this type of check could be used.
  • FIG. 1 shows a heat trace system with a controller and a system with nodes.
  • the system could also include external sensors, internal sensors, an internet connection, a gateway connection, and an optional mobile tester, as shown in FIG. 1.
  • FIG. 1 illustrates a system under test 1; a controller 2; a wireless (or wired) connection to a cloud platform 3; a wired external sensor (whether permanent or temporary) 4; an internal sensor located in the controller 5; a wireless external sensor (whether permanent or temporary) 6; a power feed to the system 7; wiring to the wired external sensor 8; a node within the system 9; a wireless connection to a wireless sensor 10; a controller display and human machine interface (HMI) 11; a gateway 12; a mobile tester 13; a permanently wired connection to the gateway 14; and a temporary connection to the mobile tester 15.
  • HMI human machine interface
  • FIG. 2 illustrates an example result check report 21 for a check run on a system.
  • the result could include the overall result summary with recommendations for any issues found during the test, device information including system and model identifiers, test condition information including a summary of how test conditions may affect results, specific check criteria including contextualized test values, and a result identifier.
  • the result could also include all the settings as programmed at the time of the check and recommended settings such as a recommended low current alarm threshold based on the steady-state operating current and the expected operating range.
  • the result could also include other information about the system or system design such as the type of heater or heat trace, the length of heat trace, the power output of the heater or heat trace, the number of pieces of heat trace, the installation method, the insulation type, whether the system is located outside, and any other relevant system information.
  • the result could include the reason for the failure and partial result of any information that is available on other criteria.
  • the exemplary result check report 21 shown in FIG. 2 could be hosted on a web platform, presented as a PDF, emailed to a user, viewed on a mobile device, or otherwise presented to users in a useful way.
  • the specific embodiment of the result check report 21 of FIG. 2 provides an overall test result 22; including a test result 22a; showing that issues were identified indicating that these issues should be addressed prior to system operation; a description of result including recommendation 22b; troubleshooting recommendations for issues identified during the test 22c; specific issues identified during the test 22d; and recommendation(s) for a specific issue identified during the test 22e.
  • the result check report 21 also notes the system information for the device under test 23; the device under test identifier 23a; the device under test model identifier 23b; the device under test name 23c; and the device under test group 23d.
  • the specific embodiment of the result check report 21 of FIG. 2 further provides test conditions 24; temperature and weather conditions 24a; test start time 24b; test location information 24c; and description of how test conditions may affect test results 24d. Also included are individual checks 25; with check criteria 25a; test value 25b; a result for specific check criteria 25c, showing “in range,” indicating the specific check is within the recommended range and the check has passed; a result for specific check criteria 25d, showing “higher than normal,” indicating the specific check is outside of the recommended range, but the system may be functional; a result for specific check criteria 25e, showing “out of range,” indicating the specific check is well out of range and the system is likely not functional; a context bar showing the test value within a context range 25f; a context bar marker showing the limits of the recommended, higher/lower, and out of range sections of the context bar 25g; a recommended range shown in green to indicate good 25h; a higher-than-recommended range shown in yellow to indicate caution 25i; and an out-of-range section shown in red to indicate a likely failure.
  • the individual checks for a heat trace system could include the following:
  • Ambient temperature sensor reading with values compared to an ambient temperature reading from weather data. Values that differ significantly from the ambient temperature would be flagged to indicate the sensor is likely in a location with external heating or cooling such as a position in the sun that could negatively affect operation.
  • Heating system temperature sensor reading with a rate of temperature increase. If the rate of temperature increase is lower than expected, this criterion would be flagged to indicate that the sensor may not be located in the correct position.
  • a heat tracing system check designed to run on a longer test cycle could also include the following check criteria:
  • Efficiency metric indicating how efficient the system is compared to similar systems including recommendations on how to improve system efficiency.
  • Fields of use for embodiments of the systems and methods of the above presented disclosure include industrial control, building system control, heating systems, process heating systems, heat trace systems, freeze protection systems, grease waste systems, cooling towers, snow melting and deicing systems, cooling systems, HVAC systems, air-conditioning systems, heat-pump systems, mini-split systems, residential systems, plumbing systems, electrical systems, battery systems, energy systems, solar power systems, and charging stations.


Abstract

Disclosed herein are a method and a system for improving the commissioning and start-up of systems in the field, in which operating data from the system is analyzed to produce a result that can be used as a verifiable check to confirm that the system is installed and operating correctly. Also disclosed are enhancements to the system and method for ensuring continued proper function of the system. A system for verifying satisfactory installation and operation of a field-installed system may include sensors and a controller, and the system may be configured to operate the system for a set period of time while capturing sensor data; process the sensor data; compare the processed sensor data to desired operating parameters; and produce a verification result.
PCT/US2024/041671 2023-08-11 2024-08-09 Automatic verification of field-installed systems Pending WO2025038432A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363532185P 2023-08-11 2023-08-11
US63/532,185 2023-08-11

Publications (2)

Publication Number Publication Date
WO2025038432A2 true WO2025038432A2 (fr) 2025-02-20
WO2025038432A3 WO2025038432A3 (fr) 2025-04-10

Family

ID=94633081

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/041671 Pending WO2025038432A2 (fr) 2023-08-11 2024-08-09 Automatic verification of field-installed systems

Country Status (1)

Country Link
WO (1) WO2025038432A2 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103597292B (zh) * 2011-02-28 2016-05-18 Emerson Electric Co. Monitoring system and monitoring method for heating, ventilation and air conditioning (HVAC) systems of buildings
US10272014B2 (en) * 2016-01-22 2019-04-30 Hayward Industries, Inc. Systems and methods for providing network connectivity and remote monitoring, optimization, and control of pool/spa equipment
US11549302B2 (en) * 2019-03-14 2023-01-10 ASSA ABLOY Accessories and Door Controls Group, Inc. Door system with improved installation, set-up, and cloning

Also Published As

Publication number Publication date
WO2025038432A3 (fr) 2025-04-10


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24854695

Country of ref document: EP

Kind code of ref document: A2