A capacitance measurement and control instrument data processing method and device

Info

Publication number: CN120104429B
Application number: CN202510585062.1A
Authority: CN (China)
Prior art keywords: event, data, log record, event log, crc check
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN120104429A
Inventors: 刘荣清, 徐会宏, 陈金领, 黄金龙, 张婷婷, 叶卉, 吴斌
Current Assignee: Zhejiang Chuangyi Electrical Technology Co ltd
Original Assignee: Zhejiang Chuangyi Electrical Technology Co ltd
Application filed by Zhejiang Chuangyi Electrical Technology Co ltd

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/07 Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F 11/08 Error detection or correction by redundancy in data representation, e.g. by using checking codes
    • G06F 11/10 Adding special bits or symbols to the coded information, e.g. parity check, casting out 9's or 11's
    • G06F 11/1004 Adding special bits or symbols to the coded information to protect a block of data words, e.g. CRC or checksum
    • G06F 11/30 Monitoring
    • G06F 11/3065 Monitoring arrangements determined by the means or processing involved in reporting the monitored data
    • G06F 11/3089 Monitoring arrangements determined by the means or processing involved in sensing the monitored data, e.g. interfaces, connectors, sensors, probes, agents
Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Debugging And Monitoring (AREA)
  • Remote Monitoring And Control Of Power-Distribution Networks (AREA)

Abstract

The invention provides a data processing method and device for a capacitance measurement and control instrument, and relates to the technical field of data processing of capacitance measurement and control instruments. The method comprises the steps of monitoring key nodes of a data processing flow according to capacitor operation data, generating corresponding event log records according to preset log record levels when the key nodes are monitored, storing the event log records in a local nonvolatile memory in time sequence, providing a standard communication interface to support remote access and reading of the event log records, and allowing operation and maintenance personnel to remotely configure the log record levels. The method addresses the problems that traditional intelligent capacitor measurement and control instruments lack traceability in the data processing process, are easily affected by electromagnetic interference, and operate under resource constraints. It achieves effective traceability of the data processing process while guaranteeing real-time data processing performance, and improves data integrity and reliability in complex electromagnetic environments.

Description

Data processing method and device for capacitance measurement and control instrument
Technical Field
The invention relates to the technical field of data processing of capacitance measurement and control instruments, in particular to a data processing method and device of a capacitance measurement and control instrument.
Background
In a complex operation environment of a modern intelligent substation, an intelligent capacitor measurement and control instrument is used as key edge side equipment, and the data processing capacity of the intelligent capacitor measurement and control instrument directly influences the safety and stability of a power grid. Especially when high frequency transient faults occur, the measurement and control instrument needs to record and process relevant data rapidly and reliably so as to conduct fault analysis and responsibility tracing. However, conventional intelligent capacitor measurement and control instruments face many challenges when dealing with such scenarios.
First, the data processing process of existing measurement and control instruments often lacks transparency. Once a fault occurs, it is difficult for operation and maintenance personnel to accurately trace back the details of data processing before and after the fault, which complicates the analysis of fault causes and the assignment of responsibility. To improve the reliability and maintainability of substation operation, a technical means for effectively tracking the data processing process of the measurement and control instrument is urgently needed.
Secondly, the internal electromagnetic environment of the intelligent substation is extremely complex, and high-frequency electromagnetic interference is ubiquitous. These disturbances tend to affect the reliability of data transmission and storage, resulting in data errors or loss. Particularly during high frequency transient faults, the electromagnetic environment is more severe and the threat to data integrity is further exacerbated. Therefore, how to ensure the reliability of the data processing process in a severe electromagnetic environment is a problem to be solved.
Furthermore, computing resources of edge-side intelligent electronic devices are often limited. Complex logging mechanisms may unduly consume computing and memory resources of the device, affect its normal data processing functions, and may even reduce the real-time nature of the data processing. How to realize effective data processing process tracing on the edge side equipment with limited resources and avoid excessive resource consumption is a key factor which must be considered when designing the log recording function of the intelligent capacitor measurement and control instrument.
In addition, there is still room for improvement in the intelligent operation and maintenance of substations. When performing fault analysis, operation and maintenance personnel often lack fine-grained data support and find it difficult to quickly locate the fault source. A mechanism that records the measurement and control instrument's data processing in detail at the moment a fault occurs would greatly improve operation and maintenance efficiency and fault handling capability, and would provide strong support for intelligent substation operation and maintenance.
Finally, existing data verification methods may not provide adequate assurance in high-frequency transient fault scenarios. Simple verification methods may not be effective against high-frequency electromagnetic interference, while overly complex verification methods incur excessive resource overhead and time delay. Therefore, a data verification mechanism that is efficient and strongly interference-resistant needs to be designed for the characteristics of high-frequency transient faults in intelligent substations, so as to ensure the traceability and reliability of the data processing process.
Disclosure of Invention
The invention aims to provide a data processing method and device for a capacitance measurement and control instrument that solve the problems of the traditional intelligent capacitor measurement and control instrument: lack of traceability in the data processing process, susceptibility to electromagnetic interference, and resource constraints. The method achieves effective traceability of the data processing process while ensuring real-time data processing performance, and improves data integrity and reliability in complex electromagnetic environments.
In a first aspect, the invention provides a data processing method of a capacitance measurement and control instrument, which comprises the following steps:
acquiring capacitor operation data;
monitoring key nodes of a data processing flow according to the capacitor operation data, wherein the key nodes comprise a data acquisition starting node, a filtering processing completion node and a fault judging completion node;
when the key node is monitored, generating a corresponding event log record according to a preset log record level, wherein the event log record comprises a time stamp, an event type, an event description and data check information, the time stamp precision is at the microsecond level, the event type is used for identifying the data processing stage, the event description is used for recording details of the processing steps, the data check information comprises a CRC check code obtained by cyclic redundancy check, and different log record levels correspond to CRC check codes of different lengths;
storing the event log records in a local nonvolatile memory according to time sequence, providing a standard communication interface to support remote access and reading of the event log records, and allowing operation and maintenance personnel to remotely configure log record levels according to requirements.
The data processing method of the capacitance measurement and control instrument provided by the invention records event logs inside the intelligent capacitance measurement and control instrument and presets a plurality of log record levels, such as standard, enhanced and simplified logs. The event log recording module monitors key nodes of the data processing flow. At each key node, the module automatically generates a log record containing a time stamp, an event type, an event description and data verification information according to the preset log record level. Different log levels correspond to CRC check codes of different lengths: a longer CRC check code is employed for the "enhanced log" level to improve data reliability, a shorter CRC check code is employed for the "simplified log" level to reduce resource consumption, and the "standard log" level uses an intermediate CRC check code length. The method meets the traceability requirement of the data processing process in high-frequency transient fault scenarios, achieves effective traceability of the data processing process, and improves data integrity and reliability in complex electromagnetic environments.
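As an illustration only, the record structure and the level-to-CRC-length mapping described above might be sketched in Python as follows. The field names, the level names and the particular CRC routines (zlib.crc32 for 32 bits, binascii.crc_hqx for a 16-bit CRC-CCITT, and a simple polynomial-0x07 CRC-8) are assumptions of this sketch, not choices specified by the patent.

    import zlib
    import binascii
    from dataclasses import dataclass

    # Assumed mapping of log record levels to CRC lengths (bits).
    CRC_BITS = {"enhanced": 32, "standard": 16, "simplified": 8}

    def crc8(data: bytes, poly: int = 0x07) -> int:
        """Bitwise CRC-8 with an illustrative polynomial (0x07)."""
        crc = 0
        for byte in data:
            crc ^= byte
            for _ in range(8):
                crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
        return crc

    def compute_crc(data: bytes, level: str) -> int:
        """Select the CRC length according to the configured log record level."""
        bits = CRC_BITS[level]
        if bits == 32:
            return zlib.crc32(data)
        if bits == 16:
            return binascii.crc_hqx(data, 0)  # 16-bit CRC-CCITT
        return crc8(data)

    @dataclass
    class EventLogRecord:
        timestamp_us: int   # microsecond-precision time stamp
        event_type: int     # identifies the data processing stage
        description: str    # details of the processing step
        crc: int            # CRC computed over the three fields above

    def make_record(timestamp_us: int, event_type: int, description: str, level: str) -> EventLogRecord:
        payload = timestamp_us.to_bytes(8, "big") + bytes([event_type]) + description.encode("utf-8")
        return EventLogRecord(timestamp_us, event_type, description, compute_crc(payload, level))

For example, make_record(ts, 0x01, "data acquisition start of capacitor bank A", "standard") would attach a 16-bit check code, while the "enhanced" level would attach a 32-bit one.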
Further, when the key node is monitored, the step of generating the corresponding event log record according to the preset log record level includes:
A. Determining the current key node that has been monitored; specifically:
A1. If it is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. If it is the filtering processing completion node, calculating the mean square error of the data before and after filtering, and adding an anomaly flag to the event type when the mean square error exceeds a first preset threshold;
A3. If it is the fault judgment completion node, analyzing the sensitivity of the input data to electromagnetic interference using a preset fault judgment algorithm, and, according to the analysis result, marking the input data whose sensitivity exceeds a second preset threshold as easily disturbed key parameters in the event description, wherein the input data is the portion of the capacitor operation data fed to the fault judgment algorithm;
B. Selecting the length of the CRC check code according to the log record level, wherein the log record level comprises a high level, a medium level and a low level; the high level corresponds to a 32-bit CRC check code, the medium level to a 16-bit CRC check code, and the low level to an 8-bit CRC check code;
C. Obtaining the event log record from the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code.
Selecting CRC check codes of different lengths for different log record levels ensures adequate data check strength while keeping resource consumption in check, striking a balance between data reliability and resource utilization.
Further, the step of obtaining the event log record according to the calibrated timestamp, the event type containing the anomaly flag, the event description containing the key parameter and the selected CRC check code includes:
A pre-defined binary format is adopted, and a preliminary event log record is constructed by combining the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code;
And performing anti-interference coding on the preliminary event log record based on the RS code to obtain a final event log record.
The dual-check mechanism combines the error detection capability of the cyclic redundancy check code and the error correction capability of the RS code, can more effectively cope with the data interference problem in the complex electromagnetic environment, and improves the reliability of data processing.
Further, the step of constructing a preliminary event log record by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameter and the selected CRC check code using a predefined binary format includes:
respectively loading the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code into a corresponding data buffer area;
In the predefined binary format, and in the order time stamp, event type, event description, CRC check code, the data in each data buffer is read, a preset separator is inserted between fields, and the result is written into a target storage area, finally forming the preliminary event log record.
The log record constructed by the method has clear structure and easy analysis, and provides powerful support for subsequent log analysis and fault tracing.
Further, based on the RS code, the step of performing anti-interference encoding on the preliminary event log record to obtain a final event log record includes:
Determining coding parameters of an RS code, wherein the coding parameters comprise a code word length, an information bit length and an error correction capability coefficient, the code word length is determined according to the length of the preliminary event log record, the information bit length is determined according to the sum of a time stamp, an event type and a data length of an event description, and the error correction capability coefficient is calculated according to the code word length and the information bit length;
generating a generator polynomial of the RS code according to the determined RS code coding parameters;
Taking the time stamp, the event type and the event description in the preliminary event log record as information bits, and filling a plurality of zeros after the information bits to form a data sequence to be encoded;
Performing RS encoding on the data sequence to be encoded by using the generating polynomial to obtain a plurality of check bits, wherein the number of the check bits is the same as the number of zeros filled with the information bits;
After all the check bits are appended to the information bits, the final event log record is formed containing a timestamp, an event type, an event description and check bits.
In a second aspect, the present invention provides a data processing device of a capacitance measurement and control instrument, including:
The acquisition module is used for acquiring capacitor operation data;
the monitoring module is used for monitoring key nodes of a data processing flow according to the capacitor operation data, wherein the key nodes comprise a data acquisition starting node, a filtering processing completion node and a fault judging completion node;
The generation module is used for generating a corresponding event log record according to a preset log record level when the key node is monitored, wherein the event log record comprises a time stamp, an event type, an event description and data check information, the time stamp precision is at the microsecond level, the event type is used for identifying the data processing stage, the event description is used for recording details of the processing steps, the data check information comprises CRC check codes obtained by cyclic redundancy check, and different log record levels correspond to CRC check codes of different lengths;
The storage module is used for storing the event log records in the local nonvolatile memory according to the time sequence, providing a standard communication interface to support remote access and reading of the event log records, and allowing operation and maintenance personnel to remotely configure log record levels according to requirements.
The data processing device of the capacitance measurement and control instrument provided by the invention provides a stable and traceable record for data processing in the capacitor control instrument by recording key events comprising detailed information and data integrity check, and solves the problems of transparency, data reliability in a complex electromagnetic environment and resource constraint.
Further, when the key node is monitored and a corresponding event log record is to be generated according to the preset log record level, the generation module is configured to perform:
A. Determining the current key node that has been monitored; specifically:
A1. If it is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. If it is the filtering processing completion node, calculating the mean square error of the data before and after filtering, and adding an anomaly flag to the event type when the mean square error exceeds a first preset threshold;
A3. If it is the fault judgment completion node, analyzing the sensitivity of the input data to electromagnetic interference using a preset fault judgment algorithm, and, according to the analysis result, marking the input data whose sensitivity exceeds a second preset threshold as easily disturbed key parameters in the event description, wherein the input data is the portion of the capacitor operation data fed to the fault judgment algorithm;
B. Selecting the length of the CRC check code according to the log record level, wherein the log record level comprises a high level, a medium level and a low level; the high level corresponds to a 32-bit CRC check code, the medium level to a 16-bit CRC check code, and the low level to an 8-bit CRC check code;
C. Obtaining the event log record from the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code.
Further, when obtaining the event log record from the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code, the generation module is configured to perform:
A pre-defined binary format is adopted, and a preliminary event log record is constructed by combining the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code;
And performing anti-interference coding on the preliminary event log record based on the RS code to obtain a final event log record.
Further, when constructing the preliminary event log record in the predefined binary format by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code, the generation module is configured to perform:
respectively loading the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code into a corresponding data buffer area;
In the predefined binary format, and in the order time stamp, event type, event description, CRC check code, the data in each data buffer is read, a preset separator is inserted between fields, and the result is written into a target storage area, finally forming the preliminary event log record.
Further, when performing anti-interference encoding on the preliminary event log record based on the RS code to obtain the final event log record, the generation module is configured to perform:
Determining coding parameters of an RS code, wherein the coding parameters comprise a code word length, an information bit length and an error correction capability coefficient, the code word length is determined according to the length of the preliminary event log record, the information bit length is determined according to the sum of a time stamp, an event type and a data length of an event description, and the error correction capability coefficient is calculated according to the code word length and the information bit length;
generating a generator polynomial of the RS code according to the determined RS code coding parameters;
Taking the time stamp, the event type and the event description in the preliminary event log record as information bits, and filling a plurality of zeros after the information bits to form a data sequence to be encoded;
Performing RS encoding on the data sequence to be encoded by using the generating polynomial to obtain a plurality of check bits, wherein the number of the check bits is the same as the number of zeros filled with the information bits;
After all the check bits are appended to the information bits, the final event log record is formed containing a timestamp, an event type, an event description and check bits.
According to the data processing method of the capacitance measurement and control instrument, detailed event logs are automatically generated at key nodes of the intelligent capacitance measurement and control instrument, so that every link of data processing is completely recorded. This provides a reliable data basis for fault analysis, responsibility tracing and compliance auditing, and significantly improves the operational reliability of the intelligent substation. The event log recording process is designed as a lightweight operation that occupies few resources and has minimal impact on the normal data processing flow, while microsecond-level time stamps ensure the time precision of event records and meet the real-time requirement of rapidly handling high-frequency transient faults. In addition, introducing data check information (CRC check codes) into the log records effectively improves the log data's resistance to electromagnetic interference and guarantees the integrity and reliability of the log data, so that the accuracy of the log records can be ensured even in the complex electromagnetic environment of a substation. The hierarchical log recording mechanism allows the data check strength to be flexibly adjusted as required, achieving a balance between data reliability and resource consumption. Finally, the event log recording method is designed with full consideration of the limited resources of edge-side intelligent electronic devices, so module deployment is simple and resource consumption is controllable. By selecting an appropriate log record level, resource consumption can be minimized while data traceability is preserved, adapting to the resource constraints of edge-side equipment.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the embodiments of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
Fig. 1 is a flowchart of a data processing method of a capacitance measurement and control instrument according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a data processing device of a capacitance measurement and control instrument according to an embodiment of the present invention.
Description of the reference numerals:
100. Acquisition module; monitoring module; generation module; storage module.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present invention.
It should be noted that like reference numerals and letters refer to like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, the invention provides a data processing method of a capacitance measuring and controlling instrument, which comprises the following steps:
acquiring capacitor operation data;
monitoring key nodes of a data processing flow according to the capacitor operation data, wherein the key nodes comprise a data acquisition starting node, a filtering processing completion node and a fault judging completion node;
when a key node is monitored, a corresponding event log record is generated according to a preset log record level, wherein the event log record comprises a time stamp, an event type, an event description and data check information, the time stamp precision is microsecond, the event type is used for identifying a data processing stage, the event description is used for recording details of processing steps, the data check information comprises CRC check codes adopting cyclic redundancy check, and the different log record levels correspond to the CRC check codes with different lengths;
The event log records are stored in a local non-volatile memory in time sequence and a standard communication interface is provided to support remote access and reading of event log records and to allow the operator to remotely configure log record levels as required.
The core of the method is the generation of event logs at key nodes. Acquiring capacitor operating data is the initial step that provides data input for subsequent processing and monitoring; it may be accomplished by a sensor connected to the capacitor that periodically sends readings to the control system. A key node is a specific point in the data processing flow, such as the start of data acquisition, the completion of filtering, or the completion of fault judgment, and represents an important stage in the flow. When a key node is monitored, an event log is generated according to the preset log level. The log content comprises a time stamp that records the exact time at which the event occurred; microsecond precision is critical for high-frequency transient fault analysis, and a high-precision real-time clock can provide microsecond time stamps using a phase-locked loop synchronization mechanism. The event type identifies the data processing stage and can be represented by a numeric code or a descriptive string so that logs can be classified and filtered. The event description provides detailed information about the processing step, improving process transparency; it can be generated dynamically from the event context and can include relevant parameters or intermediate results. The data check information uses a CRC check code obtained by cyclic redundancy check, with different log levels corresponding to different CRC lengths to balance reliability against resource use; the CRC check code is calculated over the log data before the log is stored, and a length of 8, 16 or 32 bits is selected according to the preset log level. The event log is stored in time sequence in a nonvolatile memory, such as flash memory, to ensure persistence so that data is not lost in the event of a power outage, and a standard communication interface, such as Ethernet or a serial port, is provided for remote access and reading, with the Modbus or TCP/IP protocol employed for remote access. Operation and maintenance personnel can remotely adjust the log level as required; remote configuration is implemented through a network interface, allowing them to send a command that changes the log level setting.
In particular, the data processing method aims to improve the transparency and reliability of the capacitor controller, especially in scenarios with a complex electromagnetic environment. First, capacitor operation data is acquired. Then, predefined key nodes in the data processing flow are monitored, including data acquisition start, filtering completion, and fault determination completion; these nodes are selected because they represent important phases of data processing. When a key node is detected, an event log is automatically generated, so that every link of data processing is completely recorded. This provides a reliable data basis for fault analysis, responsibility tracing and compliance auditing, and significantly improves the operational reliability of the intelligent substation. The event log recording process is designed as a lightweight operation with small resource occupation and minimal influence on the normal data processing flow. The log content depends on the preset log level, such as "standard log", "enhanced log" and "simplified log". Each log entry contains a time stamp with microsecond precision, ensuring accurate time-based event tracking, which is crucial for analyzing fast transient faults and meets the real-time requirement of rapid handling of high-frequency transient faults. The event type clearly identifies the stage of data processing and provides context for the log. The event description provides detailed information about the specific processing steps performed at the node, increasing the transparency of the data processing process. In addition, introducing data check information (a CRC check code) into the log record effectively improves the log data's resistance to electromagnetic interference and ensures its integrity and reliability, so that the accuracy of the log record can be guaranteed even in the complex electromagnetic environment of a substation. The hierarchical log recording mechanism allows the data verification strength to be flexibly adjusted as required, striking a balance between data reliability and resource consumption: higher log levels employ longer CRC codes for higher reliability, while lower levels use shorter codes to save resources. The generated event logs are stored in nonvolatile memory in time sequence, ensuring log durability even in the case of power failure. A standard communication interface is provided so that maintenance personnel can remotely access these logs for diagnosis and analysis. The log level can also be configured remotely, allowing an operator to adjust the level of detail of the log records based on current demand and resource availability; this dynamic adjustment capability ensures that sufficient log information is available when needed without unnecessarily burdening the system during normal operation. The event log recording method fully considers the limited resources of edge-side intelligent electronic devices, so module deployment is simple and resource consumption is controllable. By selecting an appropriate log record level, resource consumption can be minimized while data traceability is preserved, adapting to the resource constraints of edge-side equipment.
By recording key events including detailed information and data integrity checks, the method provides a robust and traceable record for data processing in the capacitor controller, solving the problems of transparency, data reliability in complex electromagnetic environments, and resource constraints.
In some embodiments, consider a logging system configured with three levels: "low", "medium" and "high". When the data acquisition process starts, a "data acquisition start" node is triggered. If the log level is set to "medium", an event log is generated. This log includes a time stamp obtained from the PLL-synchronized real-time clock, an event type set to "data acquisition start", an event description such as "data acquisition start of capacitor bank A", and a 16-bit CRC check code calculated over the time stamp, event type and event description data. This log entry is then appended to the event log file stored in flash memory. Subsequently, when the filtering process completes, a "filter complete" node is triggered. If the log level is still "medium", another log entry is created. This entry contains a new time stamp, the event type "filter complete", a description such as "apply moving average filter, standard deviation before filtering: 2.5, after filtering: 0.5", and another 16-bit CRC check code. This entry is also stored in the flash memory, chronologically after the previous entry. If a fault is detected during the fault determination phase, a "fault determination complete" node is reached, and the log level is "high", a more detailed log is generated. This log includes a time stamp, the event type "fault determination complete", a detailed event description such as "detect overvoltage fault, voltage reading: 1.2 kV, threshold: 1.1 kV, sensitivity of incoming data to EMI: high (parameters: voltage, current)", and a 32-bit CRC check code for enhanced data integrity. The detailed log and longer CRC provide maximum information and reliability for critical events when the log level is set to "high". Throughout the process, a remote operator may read the stored logs using the standard Modbus TCP interface, or may change the log level to "low" to reduce logging overhead during normal operation and set it back to "high" during troubleshooting for detailed diagnosis.
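The remote-configuration path above is described only as going through a standard communication interface; the following sketch therefore uses an invented two-byte command frame over a plain TCP socket purely to illustrate the idea. The command code, level codes, address and port are all hypothetical and are not defined by the patent.

    import socket
    import struct

    CMD_SET_LOG_LEVEL = 0x01                        # hypothetical function code
    LEVEL_CODES = {"low": 0, "medium": 1, "high": 2}

    def set_log_level(host: str, port: int, level: str) -> None:
        """Send a hypothetical command frame asking the instrument to change its log record level."""
        frame = struct.pack(">BB", CMD_SET_LOG_LEVEL, LEVEL_CODES[level])
        with socket.create_connection((host, port), timeout=2.0) as sock:
            sock.sendall(frame)

    # e.g. set_log_level("192.0.2.10", 502, "low") during normal operation,
    # then set_log_level("192.0.2.10", 502, "high") while troubleshooting.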
In some embodiments, when the key node is detected, the step of generating the corresponding event log record according to the preset log record level includes:
A. Determining the current key node that has been monitored; specifically:
A1. If it is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. If it is the filtering processing completion node, calculating the mean square error of the data before and after filtering, and adding an anomaly flag to the event type when the mean square error exceeds a first preset threshold;
A3. If it is the fault judgment completion node, analyzing the sensitivity of the input data to electromagnetic interference using a preset fault judgment algorithm, and, according to the analysis result, marking the input data whose sensitivity exceeds a second preset threshold as easily disturbed key parameters in the event description;
B. Selecting the length of the CRC check code according to the log record level, wherein the log record level comprises a high level, a medium level and a low level; the high level corresponds to a 32-bit CRC check code, the medium level to a 16-bit CRC check code, and the low level to an 8-bit CRC check code;
C. Obtaining the event log record from the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code.
In step A, for the data acquisition start node, a phase-locked loop synchronization mechanism is started to calibrate the time stamp so that its precision reaches the microsecond level. For the filtering processing completion node, the mean square error of the data before and after filtering is computed and compared with the first preset threshold; when the mean square error exceeds the first preset threshold, an anomaly flag is added to the event type. For the fault judgment completion node, a preset fault judgment algorithm is used to analyze the sensitivity of the input data to electromagnetic interference; if the analysis shows that the sensitivity of certain input data exceeds the second preset threshold, that input data is marked in the event description as an easily disturbed key parameter.
In step B, the log record level is used as a basis for selecting the length of the CRC check code, and is classified into three levels of high level, medium level and low level, the high level log record level corresponds to the 32-bit CRC check code, the medium level log record level corresponds to the 16-bit CRC check code, and the low level log record level corresponds to the 8-bit CRC check code.
In step C, the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code are combined to finally form an event log record.
Specifically, aiming at the problem that the traceability of the data processing process is reduced due to electromagnetic interference in a high-frequency transient fault scene of the intelligent substation, the event log record generation method provided by the application realizes the traceability enhancement of the data processing process by generating detailed event log records at key nodes of the data processing process. And when data acquisition starts, a phase-locked loop synchronization mechanism is started, microsecond precision of a time stamp is ensured, and an accurate time reference is provided for subsequent data analysis. After the filtering processing is finished, data anomalies possibly introduced in the filtering process can be effectively detected through calculation of mean square error and comparison of a threshold value, and the anomalies are recorded in an event log, so that the problems of a data preprocessing stage can be found and solved in time. And when the fault judgment is completed, analyzing the sensitivity of the input data to electromagnetic interference, marking the sensitive parameters in the event description, and providing reference information of electromagnetic interference influence for fault analysis. In addition, CRC check codes with different lengths are selected according to different log record levels, so that the data check strength is ensured, the resource consumption is considered, and the balance of the data reliability and the resource utilization rate is realized. The finally generated event log record contains key information such as time stamp, event type, event description, CRC check code and the like, and provides comprehensive and reliable data support for operation and maintenance personnel to perform fault analysis and responsibility tracing.
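A minimal sketch of the mean-square-error check described above (step A2, at the filtering processing completion node) could look like the following; the threshold value and the function names are illustrative assumptions, since the patent only requires "a first preset threshold".

    FIRST_THRESHOLD = 0.8  # illustrative value only

    def mean_square_error(before, after):
        """Mean square error between the pre-filter and post-filter samples."""
        assert len(before) == len(after) and len(before) > 0
        return sum((b - a) ** 2 for b, a in zip(before, after)) / len(before)

    def filter_complete_event_type(before, after):
        """Return the event type, appending an anomaly flag when the MSE exceeds the threshold."""
        mse = mean_square_error(before, after)
        event_type = "filter_complete"
        if mse > FIRST_THRESHOLD:
            event_type += "-anomaly"  # anomaly flag added to the event type
        return event_type, mse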
In some embodiments, during the data processing process of the capacitance measurement and control instrument, when the data acquisition start node is monitored, the phase-locked loop synchronization mechanism is immediately started, microsecond calibration is performed on the system time, an event log record containing an accurate timestamp is generated, the event type is set to be "data acquisition start", the event description can contain information such as "start data acquisition", and the like, and the corresponding CRC check code length is selected according to the preset log record level. When the filtering processing completion node is monitored, the system calculates the mean square error of the data before and after filtering, if the mean square error exceeds a preset first threshold value, an 'abnormal' mark is added in the event type, for example, the event type can be set as 'filtering completion-abnormal', the event description can contain information such as 'mean square error exceeding threshold value', specific mean square error value and the like, and the CRC check code length is selected according to the log record level. When the fault judging finishing node is monitored, the fault judging algorithm analyzes the sensitivity of the input data to electromagnetic interference, if the sensitivity of some input data is found to exceed a second preset threshold value, the data are marked as 'key parameters easy to be interfered' in event description, for example, the event description can contain 'parameter X, parameter Y is key parameter easy to be interfered', and the like, the event type can be set as 'fault judging finishing', and the CRC check code length is selected according to the log record level. Finally, these event log records, containing time stamps, event types, event descriptions and corresponding length CRC check codes, are stored in a time sequence in a local non-volatile memory and can be accessed and read remotely through a standard communication interface.
In some embodiments, the step of obtaining an event log record based on the calibrated timestamp, the event type including the exception flag, the event description including the key parameter, and the selected CRC check code includes:
A pre-defined binary format is adopted, and a preliminary event log record is constructed by combining the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code;
And performing anti-interference coding on the preliminary event log record based on the RS code to obtain a final event log record.
For constructing a preliminary event log record, a binary format may be predefined that specifies the ordering and data types of the time stamp, the event type, the event description and the cyclic redundancy check code in the data structure. For example, the time stamp may be defined as a 64-bit unsigned integer, the event type as an 8-bit enumeration, the event description as a variable-length string, and the cyclic redundancy check code as 8, 16 or 32 bits depending on the logging level. When combining this information, the fields may be arranged in the order time stamp, event type, event description, cyclic redundancy check code and written into a pre-allocated data buffer. To distinguish the different data fields, a preset separator, such as a particular byte sequence, may be inserted between fields so that each field can be accurately identified and extracted when the log record is later parsed.
Further, after the preliminary event log record is obtained, a Reed-Solomon (RS) code is introduced to strengthen its resistance to interference. Specifically, the coding parameters of the RS code are first determined, including the codeword length, the information bit length and the error correction capability coefficient. The codeword length may be determined from the length of the preliminary event log record, while the information bit length depends on the sum of the data lengths of the time stamp, event type and event description. The error correction capability coefficient is calculated from the codeword length and the information bit length and determines the number of errors the RS code can correct. After the coding parameters are determined, the generator polynomial of the RS code is produced, typically through arithmetic over a Galois field. For encoding, the time stamp, event type and event description in the preliminary event log record are treated as information bits, and zeros are appended after the information bits to form the data sequence to be encoded. The generator polynomial is then used to perform RS encoding on this sequence, and a certain number of check bits are obtained through a linear feedback shift register and modulo-2 division. The number of check bits equals the number of padded zeros. Finally, after all check bits are appended to the information bits, the final event log record is formed, containing the time stamp, event type, event description and check bits.
Specifically, by combining the components of the event log record in a predefined binary format, structured data integration is achieved, facilitating subsequent data processing and storage. And the RS code is introduced to perform anti-interference coding, so that higher-level data protection is provided on the basis of the cyclic redundancy check code. The cyclic redundancy check code is mainly used for detecting errors occurring in the data transmission or storage process, and the RS code can not only detect errors, but also correct errors in a certain range. Therefore, even in a severe electromagnetic interference environment, the final event log record has stronger anti-interference capability, the error rate can be effectively reduced, and the integrity and the accuracy of the event log record are ensured. The dual-check mechanism combines the error detection capability of the cyclic redundancy check code and the error correction capability of the RS code, can more effectively cope with the data interference problem in the complex electromagnetic environment, and improves the reliability of data processing.
In some embodiments, it is assumed that in the predefined binary format the time stamp occupies 8 bytes, the event type occupies 1 byte, the event description is variable-length with a maximum of 255 bytes, and the cyclic redundancy check code is 16 bits, occupying 2 bytes. When the filtering processing completion node is monitored, the event type is marked as 0x02, the event description records "post-filtering mean square error exceeds threshold", the calibrated time stamp is the current microsecond time stamp, and the 16-bit cyclic redundancy check code is computed. These data are combined in the predefined order to form the preliminary event log record. The RS(255, 239) code is then selected for anti-interference coding, where the codeword length is 255 bytes and the information bit length is 239 bytes, allowing errors of at most 8 bytes to be corrected. The preliminary event log record is taken as the information bits, padded with 16 zero bytes, and RS-encoded to produce 16 check bytes, which are appended to the information bits to obtain the final event log record. In this way, the event log record maintains high reliability and integrity even in a severe electromagnetic environment.
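As a sketch of how the RS(255, 239) arrangement of this embodiment could be reproduced, a third-party library such as reedsolo can append the 16 parity bytes directly, which is equivalent in result to padding with zeros and replacing them with check bits; the library choice is an assumption of this sketch, not something the patent specifies.

    # pip install reedsolo   (third-party library; assumed for this sketch)
    from reedsolo import RSCodec

    rsc = RSCodec(16)  # 16 parity bytes -> RS(255, 239) over GF(2^8), corrects up to 8 byte errors

    def protect_record(preliminary_record: bytes) -> bytes:
        """Append RS parity bytes to the preliminary event log record."""
        if len(preliminary_record) > 239:
            raise ValueError("record longer than one RS(255, 239) codeword")
        return bytes(rsc.encode(preliminary_record))

    # rsc.decode(...) can later recover a record with up to 8 corrupted bytes per codeword
    # (the exact return format of decode varies between library versions).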
In some embodiments, the step of constructing a preliminary event log record by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameter, and the selected CRC check code using a predefined binary format comprises:
respectively loading the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code into a corresponding data buffer area;
In the predefined binary format, and in the order time stamp, event type, event description, CRC check code, the data in each data buffer is read, a preset separator is inserted between fields, and the result is written into a target storage area, finally forming the preliminary event log record.
The construction process of the preliminary event log record achieves structured and efficient data assembly. The log components, namely the time stamp, event type, event description and CRC check code, are loaded into separate data buffers; this block-wise data management simplifies subsequent combination and improves processing efficiency and maintainability. A predefined binary format is employed, and the time stamp, event type, event description and CRC check code are arranged in a fixed order. During reading, data is taken from each buffer and a preset separator is inserted between adjacent blocks. The separator clearly delimits the different data fields in the resulting binary stream, which is essential for subsequent parsing and extraction. The complete data stream with separators is written to the target storage area, forming the preliminary event log record. Through data buffers, a predefined format, a fixed order and separators, the construction of preliminary event log records is structured, standardized and easy to parse, laying the foundation for the subsequent anti-interference coding and for effective use of the log data.
In particular, for the construction of preliminary event log records, a predefined binary format is employed, ensuring standardization of log data structures. The key information, timestamp, event type, event description and CRC check code, is arranged in a predetermined order, e.g. timestamp first, event type, event description, and finally CRC check code. This fixed order makes the log records predictable, facilitating subsequent data parsing operations. In the data combining process, the individual components, namely the time stamp, event type, event description and CRC check code, are first loaded into the respective data buffers. The data buffer may be understood as a storage area reserved in the memory for temporarily storing the data segments to be processed. By using data buffers, management of individual data segments becomes more independent and efficient. After the data loading is completed, the data is read from the respective data buffers in a predefined binary format and in a fixed order. A preset separator is inserted between reading adjacent data blocks. The function of the separator is to mark the boundaries of the different data fields in the continuous binary data stream. For example, a specific byte sequence, such as "0xFF 0xFE", may be used as the delimiter. Finally, the complete data stream with the delimiter is written to a target storage area, which may be a non-volatile storage medium such as a flash memory or a hard disk. Thus, the preliminary event log record construction is completed. The log record constructed by the method has clear structure and easy analysis, and provides powerful support for subsequent log analysis and fault tracing.
In some embodiments, the construction process of the preliminary event log record for capacitance measurement and control instrument data processing may be implemented as follows. First, assuming that the time stamp is 64-bit integer data, the event type is 8-bit enumeration type, the event is described as a character string of 256 bytes of maximum length, and the CRC check code is 16-bit integer data. For the four parts of data, four data buffers, namely a time stamp buffer, an event type buffer, an event description buffer and a CRC check code buffer are respectively allocated in the memory. When the event log record needs to be constructed, the calibrated time stamp data is written into the time stamp buffer, the event type data is written into the event type buffer, the event description character string is written into the event description buffer, and the calculated CRC check code is written into the CRC check code buffer. The predefined binary format is set to timestamp (8 bytes) +separator (2 bytes) +event type (1 byte) +separator (2 bytes) +event description (variable length, maximum 256 bytes) +separator (2 bytes) +crc check code (2 bytes). The delimiter selects the byte sequence "0xFF 0xFE". According to this format, data is sequentially read from each buffer and a separator is inserted. For example, 8 bytes of data are read from the timestamp buffer, then "0xFF 0xFE" is inserted, then 1 byte of data is read from the event type buffer, then "0xFF 0xFE" is inserted, and so on until the data of the CRC check code buffer is read and the last delimiter is inserted. Finally, the resulting binary data stream is written to a pre-allocated target storage area, such as a block of flash memory. This flash area is used to store event log records. Through the steps, the structured preliminary event log record is constructed, and a data base is provided for subsequent anti-interference coding and remote access. With the data buffers, predefined formats and separators, the structuring and parsing efficiency of log records is guaranteed.
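A minimal sketch of the byte layout spelled out in this example (8-byte time stamp, the 0xFF 0xFE separator, 1-byte event type, variable-length description, 16-bit CRC) is given below; the in-memory byte strings stand in for the per-field data buffers and a file append stands in for the flash write, which are simplifications of this sketch.

    import struct

    SEPARATOR = b"\xff\xfe"  # preset separator from this example

    def build_preliminary_record(timestamp_us: int, event_type: int,
                                 description: bytes, crc16: int) -> bytes:
        """Assemble time stamp + event type + description + CRC with separators, per the predefined format."""
        if len(description) > 256:
            raise ValueError("event description exceeds the maximum length")
        return (struct.pack(">Q", timestamp_us) + SEPARATOR +
                struct.pack(">B", event_type) + SEPARATOR +
                description + SEPARATOR +
                struct.pack(">H", crc16))

    def append_to_log(record: bytes, path: str = "event_log.bin") -> None:
        """Append the record to the target storage area (a file stands in for the flash block here)."""
        with open(path, "ab") as f:
            f.write(record)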
In some embodiments, the step of performing anti-interference encoding on the preliminary event log record based on the RS code to obtain a final event log record includes:
Determining coding parameters of an RS code, wherein the coding parameters comprise a codeword length, an information bit length and an error correction capability coefficient; the codeword length is determined according to the length of the preliminary event log record, the information bit length is determined according to the sum of the data lengths of the time stamp, the event type and the event description, and the error correction capability coefficient is calculated according to the following formula: t = (n - k) / 2, wherein t is the error correction capability coefficient, n is the codeword length, n = 2^m - 1, m is the order of the Galois field, and k is the information bit length;
generating a generator polynomial of the RS code according to the determined RS code coding parameters;
regarding the time stamp, the event type and the event description in the preliminary event log record as information bits, and padding n - k zeros after the information bits to form a data sequence to be encoded;
RS encoding the data sequence to be encoded by using the generator polynomial to obtain n - k check bits, the number of check bits being the same as the number of zeros padded after the information bits;
after all check bits are appended to the information bits, a final event log record is formed containing the timestamp, event type, event description and check bits.
Determining the coding parameters lays the foundation for the subsequent RS coding and ensures that the RS code can be configured according to the actual condition of the event log record. The codeword length is determined according to the length of the preliminary event log record, ensuring that the RS code can handle event log records of various lengths; the information bit length is determined according to the sum of the data lengths of the time stamp, the event type and the event description, defining the effective data range of the RS encoding; and the error correction capability coefficient is calculated from the codeword length and the information bit length, providing a theoretical basis for the error correction capability of the RS code. Generating the generator polynomial of the RS code is the core step of RS coding: the generator polynomial is the mathematical basis of the RS code and determines its encoding and decoding characteristics. Regarding the time stamp, the event type and the event description in the preliminary event log record as information bits and padding a number of zeros after the information bits to form the data sequence to be encoded is the data preparation stage of RS encoding, ensuring that the RS encoding can protect the key information in the event log record. Performing RS encoding on the data sequence to be encoded by using the generator polynomial to obtain a plurality of check bits is the key step of RS encoding: through RS encoding, the original information bits are converted into a codeword containing check bits, and the check bits are used for subsequent error detection and correction. The number of check bits is the same as the number of zeros padded after the information bits, which guarantees the error correction capability of the RS code. After all check bits are appended to the information bits, a final event log record containing the time stamp, the event type, the event description and the check bits is formed; the final event log record contains both the original event information and the check information for interference resistance, thereby improving the reliability of the event log record in a complex electromagnetic environment.
Specifically, by determining coding parameters such as the codeword length, the information bit length and the error correction capability coefficient, the RS code can be flexibly configured for preliminary event log records of different lengths. The generator polynomial serves as the mathematical basis of RS encoding, providing theoretical support for the subsequent encoding and decoding processes. The time stamp, the event type and the event description are protected as information bits, so that the key information of the event log record is not easily lost or damaged in a severe electromagnetic environment. The RS encoding process processes the data sequence to be encoded using the generator polynomial to generate check bits for error detection and correction. The addition of the check bits enhances the anti-interference capability of the event log record and reduces the probability of errors occurring during transmission or storage of the data. The finally formed event log record contains the check information, so that it can maintain high reliability and integrity even in an electromagnetic interference environment, ensuring the traceability of the data processing process. Therefore, the anti-interference coding of the RS code can be realized efficiently and reliably on resource-constrained edge-side equipment in a complex electromagnetic environment, reducing the high error rates that would otherwise result from electromagnetic interference corrupting the data.
In some embodiments, assuming a preliminary event log record length of 255 bytes, where the timestamp occupies 8 bytes, the event type occupies 2 bytes, and the event description occupies 200 bytes, the information bit length is 210 bytes. The codeword length is determined to be 255 bytes, and the error correction capability coefficient is calculated by the codeword length and the information bit length. A generator polynomial of the RS (255, 210) code is generated based on the determined coding parameters. The time stamp, event type and event description are considered as information bits and 45 zero bytes are padded after the information bits, constituting a 255 byte sequence of data to be encoded. And performing RS encoding on the data sequence to be encoded by using the generated RS code generating polynomial to obtain 45 check bytes. These 45 check bytes are appended to the 210 byte information bits to form a 255 byte final event log record. The final event log record contains a time stamp, an event type, an event description and a 45-byte RS check code, so that the reliability of the data in a complex electromagnetic environment is improved.
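The parameter determination described above can be written down directly. The following C sketch derives n, k, the number of check symbols and the correctable-error count from the field lengths of the RS(255, 210) example; the struct and function names are illustrative only and not taken from the patent.

#include <stddef.h>

/* Coding parameters for an RS code over GF(2^m). */
typedef struct {
    size_t n;        /* codeword length, n = 2^m - 1            */
    size_t k;        /* information length in symbols (bytes)    */
    size_t parity;   /* n - k check symbols                      */
    size_t t;        /* correctable symbol errors, t = (n-k)/2   */
} rs_params;

rs_params rs_derive_params(size_t m,
                           size_t ts_len, size_t type_len, size_t descr_len) {
    rs_params p;
    p.n = ((size_t)1 << m) - 1;          /* 255 for m = 8                    */
    p.k = ts_len + type_len + descr_len; /* 8 + 2 + 200 = 210 in the example */
    p.parity = p.n - p.k;                /* 45 check bytes                   */
    p.t = p.parity / 2;                  /* up to 22 byte errors correctable */
    return p;
}

Calling rs_derive_params(8, 8, 2, 200) reproduces the embodiment's RS(255, 210) configuration with 45 check bytes.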
In some embodiments, the step of generating the generator polynomial of the RS code according to the determined RS code encoding parameters comprises:
initializing a Galois field according to the determined RS code coding parameters;
based on the Galois field, a generator polynomial of the RS code is generated.
In this embodiment, initializing the Galois field may be implemented as follows. First, the order of the Galois field is determined based on the RS code coding parameters; it is typically a power of 2, e.g., 2 to the power of 8, i.e., 256. Then, an irreducible polynomial over the binary field is selected whose degree matches the determined order; for example, when the order is 256, an irreducible polynomial of degree 8 may be selected. The choice of irreducible polynomial is preset and can be made according to the application requirements and the computing resources. The elements and operation rules of the Galois field, including the addition and multiplication operations, are then constructed based on the selected irreducible polynomial. Generating the generator polynomial of the RS code may be implemented as follows: after the Galois field initialization is finished, the roots of the generator polynomial are determined according to the error correction capability coefficient of the RS code. The roots of the generator polynomial are elements of the Galois field, and consecutive power elements of the Galois field are generally selected as the roots. For example, assume that the error correction capability coefficient is t; then α^b, α^(b+1), ..., α^(b+2t-1) may be selected as the roots, where α is the primitive element of the Galois field and b is a preset offset. The generator polynomial may then be obtained by multiplying together the first-order polynomials having these roots. The polynomial multiplication is performed in the Galois field, ensuring that the coefficients of the generator polynomial are also elements of the Galois field.
Specifically, based on the Galois field, the generator polynomial of the RS code is generated by:
D1. selecting a primitive polynomial p(x); the primitive polynomial is a polynomial of a specified degree (for example, degree m) and satisfies the condition that all non-zero elements of the Galois field GF(2^m) can be represented as powers of a root of the primitive polynomial;
D2. determining the consecutive roots based on the primitive polynomial (i.e., α^b, α^(b+1), ..., α^(b+2t-1) as described above); the roots are obtained from the primitive element of the Galois field and the error correction capability coefficient;
D3. obtaining the generator polynomial based on the determined consecutive roots, expressed in particular as:
g(x) = (x - α^b)(x - α^(b+1)) ⋯ (x - α^(b+2t-1));
wherein g(x) is the generator polynomial in x, x is a formal variable, α is the primitive element of the Galois field, i.e. a root of the primitive polynomial, b is a preset offset, and t is the error correction capability coefficient.
In the generator polynomial of the RS code, x is a formal variable; it has no specific value itself but acts as a placeholder in the polynomial to represent the different orders of the polynomial. In the RS encoding process, the coefficients of the polynomial are the values actually involved in the operations, and these coefficients are taken from the Galois field.
When calculating the generator polynomial, the product needs to be expanded to obtain: g(x) = x^(2t) + g_(2t-1) x^(2t-1) + ... + g_1 x + g_0, wherein g_0, g_1, ..., g_(2t-1) are the coefficients of the polynomial, which are elements of the Galois field.
In the RS encoding process, the data sequence to be encoded is represented as an information polynomial; the information polynomial is then divided by the generator polynomial, and the remainder obtained is the check bits. Throughout this process, x remains only a formal variable; it is the coefficients of the polynomials that actually participate in the operations.
Specifically, the step of generating the generator polynomial of the RS code addresses how to obtain the generator polynomial accurately and efficiently. By initializing the Galois field first, a mathematical basis is provided for the subsequent generator polynomial, ensuring that all operations are performed within a predefined finite field and that the RS encoding is mathematically correct. Then, on the basis of the Galois field, suitable roots are selected and polynomial multiplication is carried out, so that a generator polynomial meeting the RS code coding requirements is finally obtained. Splitting the process into these sub-steps makes the generation of the polynomial clearer and more standardized and improves the reliability of the generator polynomial. Accurate and effective generation of the polynomial provides the necessary technical support for the subsequent RS coding links and further improves the anti-interference capability of the event log records.
In some embodiments, for example, for an RS code with an error correction capability coefficient of 8, a Galois field GF(2^8) of order 256 is first initialized. The Galois field is initialized by selecting the irreducible polynomial p(x) = x^8 + x^4 + x^3 + x^2 + 1 (in this polynomial expression, the highest power of x is 8 and its coefficient is 1; the coefficients of the 4th, 3rd and 2nd powers of x are 1; the constant term coefficient is 1; the remaining terms, i.e. the coefficients of the 7th, 6th, 5th and 1st powers of x, are 0 and are not explicitly written in the polynomial). Based on this polynomial, a finite field containing 256 elements is constructed and the rules of addition and multiplication within the field are determined. Then, assuming that the offset is 0, the roots of the generator polynomial are determined to be α^0, α^1, ..., α^15. The generator polynomial g(x) = (x - α^0)(x - α^1) ⋯ (x - α^15) is obtained through calculation. The polynomial multiplications are completed within GF(2^8), finally yielding a generator polynomial of degree 16. In this way, the process of generating the polynomial is made concrete and can be used directly in the subsequent RS encoding process, providing anti-interference capability for the data check information.
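The following C sketch shows one common, table-based way to realise the GF(2^8) arithmetic and the degree-16 generator polynomial of this embodiment. The log/antilog tables, the function names and the in-place multiplication loop are implementation choices made for this illustration; the text only requires that the operations be carried out in the Galois field defined by x^8 + x^4 + x^3 + x^2 + 1.

#include <stdint.h>
#include <string.h>

static uint8_t gf_exp[512];   /* antilog table: gf_exp[i] = alpha^i                 */
static uint8_t gf_log[256];   /* log table: gf_log[alpha^i] = i, entry for 0 unused */

/* Build GF(2^8) from the primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11D). */
void gf_init(void) {
    uint16_t x = 1;
    for (int i = 0; i < 255; ++i) {
        gf_exp[i] = (uint8_t)x;
        gf_log[x] = (uint8_t)i;
        x <<= 1;
        if (x & 0x100u) x ^= 0x11Du;        /* reduce modulo the field polynomial   */
    }
    for (int i = 255; i < 512; ++i)         /* duplicate so index sums need no wrap */
        gf_exp[i] = gf_exp[i - 255];
}

uint8_t gf_mul(uint8_t a, uint8_t b) {
    if (a == 0 || b == 0) return 0;
    return gf_exp[gf_log[a] + gf_log[b]];
}

/* g(x) = (x - alpha^b)(x - alpha^(b+1)) ... (x - alpha^(b+2t-1)); subtraction in
 * GF(2^m) equals XOR. 'gen' receives 2t+1 coefficients in descending powers of x
 * (gen[0] = 1 is the leading coefficient). */
void rs_generator_poly(uint8_t *gen, int two_t, int offset) {
    memset(gen, 0, (size_t)two_t + 1);
    gen[0] = 1;
    for (int i = 0; i < two_t; ++i) {
        uint8_t root = gf_exp[(offset + i) % 255];
        for (int j = i + 1; j >= 1; --j)    /* multiply current g(x) by (x + root) */
            gen[j] ^= gf_mul(gen[j - 1], root);
    }
}

After gf_init(), calling rs_generator_poly(gen, 16, 0) with a 17-byte gen array yields the degree-16 generator polynomial of the example above.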
In some embodiments, the step of RS encoding the data sequence to be encoded using a generator polynomial to obtain a plurality of check bits comprises:
inputting the generating polynomial and the data sequence to be coded into a linear feedback shift register, and obtaining a remainder sequence through modulo-2 division operation;
And carrying out bit inversion on the remainder sequence to obtain a plurality of check bits.
The linear feedback shift register is configured to operate based on the galois field, which ensures the correctness of the mathematical basis of the RS code. The selection and configuration of the Galois field is accomplished based on predetermined RS code encoding parameters. The generator polynomial and the data sequence to be encoded are provided as inputs to a linear feedback shift register. The initial state of the linear feedback shift register is set to be all zero, so that the deterministic initial state of the coding process is ensured. The modulo-2 division operation is performed in a linear feedback shift register to produce a remainder sequence. This remainder sequence directly reflects the RS encoded parity information. To adapt to a particular RS code standard or to optimize hardware implementation, the remainder sequence is then subjected to a bit reversal process, resulting in a final plurality of parity bits.
Specifically, in the RS encoding process, the generator polynomial and the data sequence to be encoded are first loaded into a linear feedback shift register. The registers and feedback paths within the linear feedback shift register are configured according to the operation rules of the Galois field. When the data sequence moves into the linear feedback shift register bit by bit, the register performs the modulo-2 division operation at the same time. The result of the operation, i.e., the remainder sequence, is output after all the data bits have been processed. As a preferred embodiment, the bit reversal operation is performed after the remainder sequence is output to generate parity bits that meet the requirements of a particular RS code format. For example, in one specific implementation of a linear feedback shift register, it may be constructed using exclusive-or gates and shift register cells, the connection of which and the number of shift register cells are determined by a generator polynomial. The data sequence to be encoded may be input into the linear feedback shift register in serial or parallel fashion.
In some embodiments, consider an example of RS (255, 239) encoding applied to a capacitance measurement and control instrument. The codeword length is set to 255 bytes and the information bit length to 239 bytes, whereby 16 bytes of check bits are obtained. The generator polynomial is generated based on the Galois field GF(2^8). The linear feedback shift register consists of 16 8-bit registers and a number of exclusive-OR gates, and the specific connection of the exclusive-OR gates is determined by the coefficients of the generator polynomial. The 239 bytes of data to be encoded are serially input byte by byte into the linear feedback shift register. After all data have been input, the 16-byte remainder sequence stored in the linear feedback shift register is bit-reversed and output as the check bits of the RS encoding; after the check bits are appended to the original information bits, a final event log record containing the check information is formed. Through this RS encoding process, the anti-interference capability of the event log record is enhanced, and the reliability of data transmission in a complex electromagnetic environment is ensured.
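A software equivalent of the shift-register division is sketched below in C. It reuses gf_mul() and rs_generator_poly() from the previous sketch and computes the n - k check symbols as the remainder of the shifted information polynomial divided by g(x), which is exactly what the linear feedback shift register realises in hardware. The optional bit-reversal step mentioned above would be applied to the resulting parity bytes afterwards; it is omitted here, and the function names are again only illustrative.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

uint8_t gf_mul(uint8_t a, uint8_t b);                        /* from the previous sketch */
void rs_generator_poly(uint8_t *gen, int two_t, int offset); /* from the previous sketch */

/* msg: k information bytes; gen: generator polynomial in descending powers
 * (monic, nparity+1 coefficients); parity: receives the nparity check bytes.
 * The loop mirrors the LFSR: feedback = incoming byte XOR top register,
 * shift by one symbol, then add feedback times the generator coefficients. */
void rs_encode_parity(const uint8_t *msg, size_t k,
                      const uint8_t *gen, size_t nparity,
                      uint8_t *parity) {
    memset(parity, 0, nparity);                      /* all-zero initial state         */
    for (size_t i = 0; i < k; ++i) {
        uint8_t feedback = msg[i] ^ parity[0];
        memmove(parity, parity + 1, nparity - 1);    /* shift register by one symbol   */
        parity[nparity - 1] = 0;
        if (feedback != 0)
            for (size_t j = 0; j < nparity; ++j)
                parity[j] ^= gf_mul(gen[j + 1], feedback);
    }
}

For the RS(255, 239) example, rs_generator_poly(gen, 16, 0) followed by rs_encode_parity(msg, 239, gen, 16, parity) yields the 16 check bytes that are appended to the information bytes.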
Referring to fig. 2, fig. 2 is a schematic diagram of a capacitance measurement and control apparatus according to some embodiments of the present invention, where the capacitance measurement and control apparatus is integrated in a back-end control device in the form of a computer program, and includes:
An acquisition module 100 for acquiring capacitor operation data;
the monitoring module 200 is used for monitoring key nodes of the data processing flow according to the capacitor operation data, wherein the key nodes comprise a data acquisition starting node, a filtering processing completion node and a fault judging completion node;
The generation module 300 is used for generating a corresponding event log record according to a preset log record level when the key node is monitored, wherein the event log record comprises a time stamp, an event type, an event description and data check information, the time stamp precision is microsecond, the event type is used for identifying a data processing stage, the event description is used for recording details of a processing step, the data check information comprises CRC check codes adopting cyclic redundancy check, and the different log record levels correspond to the CRC check codes with different lengths;
the storage module 400 is configured to store the event log records in the local nonvolatile memory according to a time sequence, and provide a standard communication interface to support remote access and reading of the event log records, and allow the operation and maintenance personnel to remotely configure the log record level according to the requirement.
In some embodiments, the generation module 300, when the key node is monitored and the corresponding event log record is to be generated according to the preset log record level, performs:
A. determining the monitored current key node, specifically performing:
A1. if the node is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. if the node is the filtering processing completion node, the mean square error of the data before and after filtering is counted, and when the mean square error exceeds a first preset threshold value, an abnormality-containing mark is added in the event type;
A3. if the node is the fault judgment completion node, analyzing the sensitivity of the input data to electromagnetic interference by using a preset fault judgment algorithm, and marking the input data with the sensitivity exceeding a second preset threshold value as a key parameter which is easy to be interfered in event description according to an analysis result;
B. The length of the CRC check code is selected according to the log record level, wherein the log record level comprises a high level, a medium level and a low level; the high log record level corresponds to a 32-bit CRC check code, the medium log record level corresponds to a 16-bit CRC check code, and the low log record level corresponds to an 8-bit CRC check code (see the sketch following this list);
C. Obtaining the event log record according to the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code.
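The width selection in step B can be captured in a few lines of C. The sketch below maps the log record level to the CRC width and computes a plain bitwise CRC of that width; the generator polynomials used here (0x04C11DB7, 0x1021, 0x07) and the zero initial value are placeholder choices for illustration, since the text fixes only the CRC lengths, not the polynomials.

#include <stdint.h>
#include <stddef.h>

typedef enum { LOG_LEVEL_LOW, LOG_LEVEL_MEDIUM, LOG_LEVEL_HIGH } log_level_t;

/* High -> 32-bit, medium -> 16-bit, low -> 8-bit, as in step B above. */
unsigned crc_width_for_level(log_level_t lvl) {
    switch (lvl) {
    case LOG_LEVEL_HIGH:   return 32;
    case LOG_LEVEL_MEDIUM: return 16;
    default:               return 8;
    }
}

/* Plain (non-reflected) bitwise CRC over 'len' bytes for the given width. */
uint32_t crc_compute(const uint8_t *data, size_t len, unsigned width) {
    uint32_t poly = (width == 32) ? 0x04C11DB7u
                  : (width == 16) ? 0x1021u
                  :                 0x07u;
    uint32_t topbit = 1u << (width - 1);
    uint32_t mask = (width == 32) ? 0xFFFFFFFFu : ((1u << width) - 1u);
    uint32_t crc = 0;
    for (size_t i = 0; i < len; ++i) {
        crc ^= (uint32_t)data[i] << (width - 8);     /* align next byte with the top */
        for (int b = 0; b < 8; ++b)
            crc = (crc & topbit) ? ((crc << 1) ^ poly) : (crc << 1);
        crc &= mask;                                 /* keep only 'width' bits       */
    }
    return crc;
}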
In some embodiments, the generation module 300 performs when obtaining the event log record based on the calibrated timestamp, the event type including the exception flag, the event description including the key parameter, and the selected CRC check code:
A pre-defined binary format is adopted, and a preliminary event log record is constructed by combining the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code;
And performing anti-interference coding on the preliminary event log record based on the RS code to obtain a final event log record.
In some embodiments, the generation module 300 performs when used to construct a preliminary event log record in a predefined binary format by combining the calibrated timestamp, the event type containing the anomaly flag, the event description containing the key parameters, and the selected CRC check code:
respectively loading the calibrated time stamp, the event type containing the abnormal mark, the event description containing the key parameter and the selected CRC check code into a corresponding data buffer area;
According to a predefined binary format, according to the sequence of the time stamp, the event type, the event description and the CRC check code, the data of each data buffer area is read, a preset separator is inserted, then the data are written into a target storage area, and finally a preliminary event log record is formed.
In some embodiments, the generating module 300 performs when configured to perform anti-interference encoding on the preliminary event log record based on the RS code to obtain the final event log record:
Determining coding parameters of an RS code, wherein the coding parameters comprise codeword length, information bit length and error correction capability coefficients, the codeword length is determined according to the length of a preliminary event log record, the information bit length is determined according to the sum of a time stamp, an event type and the data length of an event description, and the error correction capability coefficients are calculated according to the codeword length and the information bit length;
generating a generator polynomial of the RS code according to the determined RS code coding parameters;
Taking the time stamp, the event type and the event description in the preliminary event log record as information bits, and filling a plurality of zeros after the information bits to form a data sequence to be encoded;
performing RS encoding on a data sequence to be encoded by using a generating polynomial to obtain a plurality of check bits, wherein the number of the check bits is the same as the number of zeros filled with information bits;
after all check bits are appended to the information bits, a final event log record is formed containing the timestamp, event type, event description and check bits.
In some embodiments, the generating module 300 performs when generating the generator polynomial for the RS code according to the determined RS code encoding parameters:
initializing a Galois field according to the determined RS code coding parameters;
based on the Galois field, a generator polynomial of the RS code is generated.
In some embodiments, the generating module 300 performs when performing RS encoding on a data sequence to be encoded using a generating polynomial to obtain a plurality of check bits:
inputting the generating polynomial and the data sequence to be coded into a linear feedback shift register, and obtaining a remainder sequence through modulo-2 division operation;
And carrying out bit inversion on the remainder sequence to obtain a plurality of check bits.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present invention and is not intended to limit the scope of the present invention, and various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (4)

1. A capacitance measurement and control instrument data processing method, characterized in that it comprises the following steps:
acquiring capacitor operation data;
monitoring key nodes of a data processing flow according to the capacitor operation data; the key nodes include a data acquisition start node, a filtering processing completion node and a fault judgment completion node;
when a key node is monitored, generating a corresponding event log record according to a preset log record level; the event log record contains a time stamp, an event type, an event description and data check information; the time stamp precision is at the microsecond level; the event type is used to identify the data processing stage; the event description is used to record details of the processing steps; the data check information includes a CRC check code using cyclic redundancy check, and different log record levels correspond to CRC check codes of different lengths;
storing the event log records in a local non-volatile memory in chronological order, and providing a standard communication interface to support remote access to and reading of the event log records, and allowing operation and maintenance personnel to remotely configure the log record level as required;
wherein the step of generating the corresponding event log record according to the preset log record level when the key node is monitored comprises:
A. determining the monitored current key node, specifically performing:
A1. if the node is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. if the node is the filtering processing completion node, calculating the mean square error of the data before and after filtering, and when the mean square error exceeds a first preset threshold, adding an anomaly flag to the event type;
A3. if the node is the fault judgment completion node, analyzing the sensitivity of input data to electromagnetic interference by using a preset fault judgment algorithm, and, according to the analysis result, marking in the event description the input data whose sensitivity exceeds a second preset threshold as key parameters susceptible to interference; the input data is the data in the capacitor operation data that is input to the fault judgment algorithm;
B. selecting the length of the CRC check code according to the log record level; the log record level includes a high level, a medium level and a low level, the high log record level corresponding to a 32-bit CRC check code, the medium log record level corresponding to a 16-bit CRC check code, and the low log record level corresponding to an 8-bit CRC check code;
C. obtaining the event log record according to the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code;
wherein the step of obtaining the event log record according to the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code comprises:
constructing a preliminary event log record in a predefined binary format by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code;
performing anti-interference encoding on the preliminary event log record based on an RS code to obtain a final event log record.
2. The capacitance measurement and control instrument data processing method according to claim 1, characterized in that the step of constructing a preliminary event log record in a predefined binary format by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code comprises:
loading the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code into corresponding data buffers respectively;
reading the data of each data buffer according to the predefined binary format, in the order of time stamp, event type, event description and CRC check code, inserting a preset separator, and then writing the data to a target storage area, finally forming the preliminary event log record.
3. A capacitance measurement and control instrument data processing device, characterized in that it comprises:
an acquisition module for acquiring capacitor operation data;
a monitoring module for monitoring key nodes of a data processing flow according to the capacitor operation data; the key nodes include a data acquisition start node, a filtering processing completion node and a fault judgment completion node;
a generation module for generating, when a key node is monitored, a corresponding event log record according to a preset log record level; the event log record contains a time stamp, an event type, an event description and data check information; the time stamp precision is at the microsecond level; the event type is used to identify the data processing stage; the event description is used to record details of the processing steps; the data check information includes a CRC check code using cyclic redundancy check, and different log record levels correspond to CRC check codes of different lengths;
a storage module for storing the event log records in a local non-volatile memory in chronological order, and providing a standard communication interface to support remote access to and reading of the event log records, and allowing operation and maintenance personnel to remotely configure the log record level as required;
wherein the generation module, when generating the corresponding event log record according to the preset log record level upon monitoring the key node, performs:
A. determining the monitored current key node, specifically performing:
A1. if the node is the data acquisition start node, starting a preset phase-locked loop synchronization mechanism to calibrate the time stamp so as to ensure microsecond precision;
A2. if the node is the filtering processing completion node, calculating the mean square error of the data before and after filtering, and when the mean square error exceeds a first preset threshold, adding an anomaly flag to the event type;
A3. if the node is the fault judgment completion node, analyzing the sensitivity of input data to electromagnetic interference by using a preset fault judgment algorithm, and, according to the analysis result, marking in the event description the input data whose sensitivity exceeds a second preset threshold as key parameters susceptible to interference; the input data is the data in the capacitor operation data that is input to the fault judgment algorithm;
B. selecting the length of the CRC check code according to the log record level; the log record level includes a high level, a medium level and a low level, the high log record level corresponding to a 32-bit CRC check code, the medium log record level corresponding to a 16-bit CRC check code, and the low log record level corresponding to an 8-bit CRC check code;
C. obtaining the event log record according to the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code;
and wherein the generation module, when obtaining the event log record according to the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code, performs:
constructing a preliminary event log record in a predefined binary format by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code;
performing anti-interference encoding on the preliminary event log record based on an RS code to obtain a final event log record.
4. The capacitance measurement and control instrument data processing device according to claim 3, characterized in that the generation module, when constructing the preliminary event log record in the predefined binary format by combining the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code, performs:
loading the calibrated time stamp, the event type containing the anomaly flag, the event description containing the key parameters and the selected CRC check code into corresponding data buffers respectively;
reading the data of each data buffer according to the predefined binary format, in the order of time stamp, event type, event description and CRC check code, inserting a preset separator, and then writing the data to a target storage area, finally forming the preliminary event log record.
CN202510585062.1A 2025-05-08 2025-05-08 A capacitance measurement and control instrument data processing method and device Active CN120104429B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510585062.1A CN120104429B (en) 2025-05-08 2025-05-08 A capacitance measurement and control instrument data processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202510585062.1A CN120104429B (en) 2025-05-08 2025-05-08 A capacitance measurement and control instrument data processing method and device

Publications (2)

Publication Number Publication Date
CN120104429A CN120104429A (en) 2025-06-06
CN120104429B true CN120104429B (en) 2025-09-02

Family

ID=95874314

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510585062.1A Active CN120104429B (en) 2025-05-08 2025-05-08 A capacitance measurement and control instrument data processing method and device

Country Status (1)

Country Link
CN (1) CN120104429B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562768A (en) * 2016-09-14 2018-01-09 彩讯科技股份有限公司 A kind of data handling procedure dynamic back jump tracking method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10019471B2 (en) * 2013-01-31 2018-07-10 Hewlett Packard Enterprise Development Lp Event log system
US20150347599A1 (en) * 2014-05-28 2015-12-03 Arcadia Solutions, LLC Systems and methods for electronic health records
WO2018145743A1 (en) * 2017-02-08 2018-08-16 Huawei Technologies Co., Ltd. System and method for failure management using distributed execution traces
CN108038049B (en) * 2017-12-13 2021-11-09 西安电子科技大学 Real-time log control system and control method, cloud computing system and server
GB2631117A (en) * 2023-06-20 2024-12-25 Brozzoni Stefano System and method for tracking checkpoint transition times for races
CN119512859A (en) * 2024-10-29 2025-02-25 重庆川仪自动化股份有限公司 A method, device, equipment and medium for monitoring operation log of intelligent instrument
CN119689922A (en) * 2024-11-13 2025-03-25 北京中电科电子装备有限公司 Industrial equipment log data processing method, device, equipment and medium
CN119917390A (en) * 2025-01-22 2025-05-02 深圳市怡珑科技有限公司 A software application performance optimization method based on big data analysis

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562768A (en) * 2016-09-14 2018-01-09 彩讯科技股份有限公司 A kind of data handling procedure dynamic back jump tracking method

Also Published As

Publication number Publication date
CN120104429A (en) 2025-06-06

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant