
WO2025176272A1 - A method of determining an event of an industrial process at an industrial site and a system thereof - Google Patents

A method of determining an event of an industrial process at an industrial site and a system thereof

Info

Publication number
WO2025176272A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processing apparatus
data processing
industrial
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/DK2025/050028
Other languages
French (fr)
Inventor
Kasper Koops Kratmann
Daniel Aron GUDNASON
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Claviate Aps
Original Assignee
Claviate Aps
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Claviate Aps
Publication of WO2025176272A1
Priority to DKPA202530631A (published as DK202530631A1)
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G06F 21/6254 Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/44 Event detection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/04 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L 63/0428 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload

Definitions

  • the present invention relates to a method of determining at least one event of an industrial process at an industrial site, comprising acts of collecting sensor data representing conditions at or related to the industrial site, inputting the collected data to a data processing apparatus, performing data analysis on one or more of the inputted data using feature extraction to determine at least one event, logging a timestamp and a position of the at least one determined event, synchronising the at least one event with the data inputted to the data processing apparatus to form a structured dataset, and storing said structured dataset in a database.
  • the time stamp is synchronized to an external clock operating a globally unique time system.
  • the present invention also relates to a system configured to perform the method.
  • the present invention further relates to a computer program configured to execute the method and a computer-readable medium configured to store the computer program.
  • the construction or installation process is governed by a contract between the parties, which specifies terms and conditions for payments, project milestones, quality of work, breach of contract, deadline for completion of work and other information.
  • the construction process normally consists of a plurality of various subtasks that must be completed before the construction is completed.
  • the contractor and sub-contractors must submit a payment claim and supporting documentation for the completed work in accordance with the contract.
  • the client assesses the completed work to verify the payment claim. Any discrepancies between the claimed work and the assessed work are settled in accordance with the terms laid out in the contract.
  • objects of interest may be tracked automatically by processing the captured image data using object detection algorithms or machine learning algorithms implemented in a remote server or local data processing unit.
  • Such algorithms can be trained and evaluated via various available programming platforms provided by software providers using either open-source datasets or custom datasets.
  • WO 2021/110226 A1 discloses a method and system for monitoring an industrial site, comprising a plurality of cameras connected to a data processing apparatus.
  • the raw image data is processed in the data processing apparatus by an object detection algorithm, where the detected objects are anonymised in the image data and the altered image data is stored in a database for later inspection.
  • This solution focuses on anonymising image data so it may be stored for later analysis.
  • WO 2018/191555 A1 discloses a method of recognising repeating actions of an overall assembly line process using deep learning, wherein a neural network in communication with a long-short term memory is trained to detect actions and compare them to a benchmark process. Abnormalities in the detected actions are determined by the neural network based on the benchmark process, and video snippets of these abnormalities may be stored for later analysis.
  • This solution is designed specifically for monitoring assembly lines for automotive parts or computer parts, and uses only two-dimensional data in the data processing.
  • US 11175650 B2 discloses another system for monitoring an assembly line process using sensors arranged at working stations and workers interacting with the working stations.
  • the neural network and long-short term memory are trained to detect objects and actions from the sensor streams using deep learning, which relate to assembly steps of an overall assembly process.
  • the workers or robots may interact with the system at the working stations, e.g., via user terminals.
  • the neural network is configured to generate a set of performance data, which can be compared with a set of reference data. During optimization, operators may access portions of the set of data for optimising the assembly process. This solution is designed specifically for real-time monitoring of the assembly of components.
  • US 11321944 B2 discloses a system for monitoring repeating actions in an overall process, where a neural network and long-short term memory are trained to detect actions in video data.
  • the neural network is trained to detect the start, the end, and other details of each action based on the image frames, and to generate statistical data based on the details of the actions. This solution is designed specifically for monitoring assembly lines in a factory.
  • US2021/0117684 discloses a system for detecting cycle data using object and motion properties of a single object with the aim of optimizing a repetitive production at a stationary production line.
  • the abovementioned systems are all limited to monitoring repeating actions of an overall process, particularly of assembly lines in automotive and computer part manufacturing factories. These systems are limited to use at stationary sites, where the actions are performed at fixed stations or tables.
  • One object of the present invention is to solve the problems of the abovementioned prior art.
  • Another object of the present invention is to provide a method and system that allows for determining an event of an industrial process at an industrial site, including dynamic sites.
  • One object of the present invention is achieved by a method of determining at least one event of an industrial process at an industrial site, comprising acts of:
  • a data processing apparatus comprising a computer unit
  • An event is defined by a specific time and may be defined by a combination of conditions or features in the data or a relation between data at the specific time.
  • a process segment consists of a minimum of two events, e.g. a start event and an end event.
  • the timestamp must have an absolute reference to an external clock running a globally unique time system such as coordinated universal time (UTC), international atomic time (TAI), GPS time or similar. This guarantees consistency, accuracy, and resistance to timing discrepancies caused by drift, system latency, or local clock variations.
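  • As a minimal sketch of how such a UTC-referenced timestamp may be logged, assuming a Python implementation and a host clock disciplined to UTC (e.g. via NTP or GPS); the helper name is illustrative:

```python
from datetime import datetime, timezone

def log_event_timestamp() -> str:
    """Return the current time as an ISO 8601 string with an absolute UTC reference."""
    # A timezone-aware datetime avoids ambiguity from local clock settings;
    # in practice the host clock would be disciplined via NTP/GPS to stay on UTC.
    return datetime.now(timezone.utc).isoformat()

start_timestamp = log_event_timestamp()   # e.g. '2025-02-17T09:31:05.123456+00:00'
```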
  • the method comprises further acts of:
  • the position is synchronised to a globally unique geospatial positioning reference system.
  • the position may have a unique and well-defined relation to a globally unique geospatial positioning reference system ensuring absolute and unambiguous location data. This may be achieved by integrating satellite-based navigation systems such as GPS or GLONASS, or geodetic reference frames such as WGS84 or ITRF, and additional augmentation sources such as an IMU (inertial measurement unit). By leveraging these positioning references, the system can ensure high accuracy, consistency, and resistance to signal drift or loss, thereby providing a precise and persistent positioning framework.
  • An absolute and unambiguous position, ensured by relating to a globally unique geospatial positioning reference system, can also enable relating specific operations, process segments or events to unique object IDs, such as serial numbers. This may be important to document that e.g. a specific construction was completed or serviced. Unambiguous positional data may also be an important data parameter to document the construction and service history of a given object such as a wind turbine, both onshore and offshore, combined with e.g. historical meteorological data.
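  • As an illustrative sketch of relating a locally surveyed position to a globally unique reference frame, the following converts local UTM coordinates to WGS84 latitude/longitude; the pyproj library and the UTM zone (32N) are assumptions chosen purely for the example:

```python
from pyproj import Transformer

# Example only: a site surveyed in UTM zone 32N (EPSG:32632), converted to WGS84 (EPSG:4326).
to_wgs84 = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)

easting, northing = 512_345.0, 6_254_321.0          # local survey coordinates (metres)
longitude, latitude = to_wgs84.transform(easting, northing)

event_position = {"lat": latitude, "lon": longitude, "crs": "WGS84"}
```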
  • the method comprises further acts of
  • one object of the present invention is achieved by a method of managing an industrial site, comprising the steps:
  • the data processing apparatus comprising a computer unit
  • one or more computer vision algorithms configured to determine a number of operational states in the image data
  • the feature extraction algorithms being configured to determine a number of unique features representative of a number of operational states in the other data
  • other data or ‘additional data’ may be understood as sensor data and/or data received from a third party.
  • Data generally covers image data, other data, and sensor data.
  • the present method provides an improved method of managing an industrial process performed at an industrial site compared to conventional methods.
  • the present method can also be used for analysing and optimizing the progress of the industrial process, or events thereof.
  • the present method is versatile and can be adapted to different applications and to changes in the dynamic mapping at the site and is able to detect events around the entire site.
  • Conventional methods for high-volume, in-line production sites, i.e., working stations along a production line, are not suited for use at low-volume production or assembly sites without changes to the monitoring setup as well as significant modification to the implemented computer vision algorithms. Further, conventional monitoring systems are limited to evaluating the performance at fixed workstations at factories of the automobile or electronics industry.
  • the mapping of the industrial site preferably has an absolute reference to a globally unique global positioning reference system.
  • the term “industrial process” is defined as an overall process being performed at the industrial site, wherein the industrial process is composed of a plurality of operations (or actions) performed in parallel or in a sequential order. Further, a plurality of operational states (or tasks) is performed during each operation.
  • the term “industrial site” should in this case be understood as a site for the production, assembly, installation, or handling of large structures, or components thereof, with a size and weight unsuited for high-volume, mass production.
  • the present system and method are particularly suited for, but not limited to, use in the energy sector, such as wind energy, the shipping (freight transport) industry, or the construction industry.
  • the present method can use a set of cameras to monitor the entire industrial site, or subareas of particular interest. This allows a user of the present system to visually monitor the industrial site, or sub-areas thereof.
  • the captured image data, e.g., videos or still pictures, is inputted to a data processing apparatus for data analysis and evaluation. This allows the user to access image data of the industrial site.
  • additional data is collected and inputted to the data processing apparatus for data storage and optional data analysis and evaluation.
  • the additional data is representative of conditions at or related to the industrial site and may be collected from local sources and/or external sources. This allows the user to access different types of data from multiple sources.
  • the inputted data, or a selected subset thereof, is analysed in the computer unit using computer vision algorithms, feature extraction algorithms, or a combination thereof.
  • the computer vision algorithms are configured to determine a number of operational states based on the captured image data.
  • the feature extraction algorithms are configured to determine a number of unique features representative of said operational states based on the inputted additional data. This allows the computer unit to automatically detect the operational states of an operation. This also allows the computer unit to automatically detect operational states of interest.
  • the outputs of these algorithms are then evaluated in the computer unit using classification algorithms, custom logics, or a combination thereof to determine one or more events.
  • the events may be operations of the industrial process or other events of interest, wherein the parameters thereof may be set up by the user or administrator of the present system.
  • a scaling or weight value may be applied to the outputs of these algorithms, thus making the outputs better suited for clustering and/or filtering during the evaluation. This allows the computer unit to automatically detect events based on the image data and/or the inputted additional data.
  • the additional data and image data are synchronized in the computer unit based on the determined events to form a structured dataset. At least one timestamp of the event may be used to synchronize the different types of data.
  • the structured dataset is stored in a dedicated database, or links may be stored together with the respective data in the database. This allows the user to access the present system and view the structured dataset of a selected event.
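  • A simplified sketch of how a determined event may be synchronised with the inputted data streams by timestamp to form a structured dataset; the record layout and helper names are illustrative, not a definitive implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Event:
    event_type: str
    start: datetime
    end: datetime
    position: tuple  # e.g. (lat, lon) in WGS84

@dataclass
class StructuredRecord:
    event: Event
    image_refs: list = field(default_factory=list)   # links to image segments
    other_data: list = field(default_factory=list)   # telemetry, meteorological data, etc.

def synchronise(event: Event, image_index: list, other_index: list) -> StructuredRecord:
    """Collect all data samples whose timestamps fall within the event window."""
    record = StructuredRecord(event=event)
    record.image_refs = [s for s in image_index if event.start <= s["timestamp"] <= event.end]
    record.other_data = [s for s in other_index if event.start <= s["timestamp"] <= event.end]
    return record  # the record (or a link to it) would then be stored in the database
```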
  • the computer vision algorithms comprise at least one of an image classification algorithm, an optical flow algorithm, an object detection algorithm, a semantic segmentation algorithm, an action classification algorithm, a temporal action localization algorithm, or a combination thereof.
  • the computer vision algorithm(s) may advantageously be implemented in the computer unit.
  • the computer vision algorithms may comprise an image classification algorithm, which may determine at least one process state based on the content of the image frames in the image data as a whole.
  • the computer vision algorithms may comprise an optical flow algorithm, which may quantify the motion between two consecutive image frames in the image data.
  • the computer vision algorithms may comprise an action classification algorithm, which may determine at least one type of operational state based on the performed action in the stream of image frames in the image data.
  • the computer vision algorithms may comprise a temporal action localization algorithm, which may determine at least one start, end, and type of operational state based on the stream of image frames in the image data. This allows the computer unit to identify and determine the operational states using feature-based event detection.
  • More than one computer vision algorithm may be implemented in the computer unit, wherein the computer vision algorithms may be combined to determine the operational states. This allows for a more accurate detection of multiple operations of the industrial process.
  • the computer vision algorithms may also comprise an object tracking algorithm, which may track the motion of one or more objects between consecutive image frames in the image data.
  • a one-stage or multi-stage object detection algorithm, or a semantic segmentation model may further be implemented in the computer unit.
  • the computer unit may use boundary boxes or segmentation masks (regions of interest) to localise and track objects of interest within the image frames in the image data.
  • the computer vision algorithms may also comprise an action detection algorithm, which may determine at least one type of operational state based on the performed object actions in the image frames in the image data.
  • the action detection algorithm may compute or combine spatial, temporal, and semantic features from the image data. This allows the operational states to be identified and determined in accordance with one or more object classes and action classes stored in the computer unit. This also allows various object actions to be identified and recognised within the image frames in the image data.
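  • As one hedged example of such a computer vision algorithm, the following sketch uses dense optical flow to quantify the motion between two consecutive image frames; OpenCV is assumed and the motion feature is illustrative:

```python
import cv2
import numpy as np

def mean_motion(prev_frame, next_frame) -> float:
    """Return the average optical-flow magnitude between two consecutive frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Farneback dense optical flow: one 2D motion vector per pixel.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return float(np.mean(magnitude))

# A simple feature for event detection: flag frames where motion exceeds a site-specific threshold.
```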
  • the industrial site changes during the industrial process.
  • the method further comprises the step of determining a position such as a geographical position of at least one object, and optionally tracking said at least one object throughout the industrial site.
  • the data processing apparatus may be configured to determine the position of one or more objects within the industrial site.
  • the data processing apparatus may further be configured to track the motion of these objects throughout the industrial site.
  • a coordinate system of the industrial site may be stored in the data processing apparatus for determining the (global or local) position of objects within the coordinate system.
  • the position of an object may be determined based on the image data using a geolocalization algorithm implemented in the computer unit. For example, the geographical position of one or more cameras in combination with computer vision-based triangulation may be used to determine the geographical positions of the objects.
  • the computer unit may then use a tracking algorithm to predict and track the positions of the objects within the image data.
  • a positioning system may communicate with the data processing apparatus for determining the position of one or more objects.
  • the positioning system may transmit position data directly to the data processing apparatus, or via local positioning units on the objects. This allows objects to be tracked even if no or limited image data is available.
  • a local positioning unit on the object may receive beacon signals from multiple nodes of the positioning system, wherein the positioning unit determines its position based on the received beacon signals.
  • multiple local positioning units of the positioning system may be arranged relative to the industrial site and the individual objects may act as mobile nodes each transmitting a beacon signal to the respective local positioning units.
  • a signal may be transmitted from the local positioning unit which receives the reflected signal from the respective nodes.
  • the position of the object may be determined using lateration or multi-lateration, angulation or other known techniques. This allows objects to be tracked even if no or limited reception of global positioning signals is available.
  • the present method allows users of the present system to determine and track the positions of objects during various operations of the overall industrial process. This also allows for better management of the flow of materials on the industrial site, further allowing for better distribution and allocation of equipment and machinery on the industrial site.
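  • A minimal sketch of estimating an object position by multi-lateration from beacon ranges with a least-squares fit; the node coordinates, measured ranges and use of SciPy are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

# Known positions of fixed nodes (metres, local site coordinates) and measured ranges to the object.
nodes = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 40.0], [50.0, 40.0]])
ranges = np.array([28.3, 31.6, 25.0, 29.2])

def residuals(p):
    # Difference between measured ranges and distances from the candidate position p.
    return np.linalg.norm(nodes - p, axis=1) - ranges

estimate = least_squares(residuals, x0=np.array([25.0, 20.0]))
x, y = estimate.x   # estimated object position within the site coordinate system
```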
  • At least radio communications between at least two radio communications devices and/or audio signals picked up by at least one microphone are transmitted to the data processing apparatus, wherein the computer unit extracts voice commands and/or unique audio signals from said radio communications or audio signals using the feature extraction algorithms.
  • the operational states may also be determined based on the radio communications between two or more radio communications devices and/or audio signals picked up at the industrial site.
  • the raw radio communications may be picked up by a radio scanner unit or dedicated radio unit and transmitted to the data processing apparatus and stored in the database for data analysis.
  • One or more microphones may pick up audio signals from one or more positions at the industrial site, wherein the audio signals may include voice communications between workers, various machine sounds, audio alarms, announcements over speakers, weather related sounds and the like.
  • the raw audio signals may be stored in the database for data analysis.
  • the feature extraction algorithms may be one or more speech recognition algorithms, which may detect unique voice commands in the radio communications and/or audio signals. Examples of such algorithms are disclosed in the article “Some Commonly Used Speech Feature Extraction Algorithms” by Alim, Sabur A. et al.
  • the unique voice commands may be representative of the start, end and/or type of one or more operational states.
  • a spectral analysis and discrete transformation of the voice signals may be used to extract the unique voice commands. Classifications for voice commands or parameters thereof may be set up by the user or administrator of the present system and stored in the computer unit.
  • the speech recognition algorithms may be trained to detect the voice commands in one or more languages. This allows unique voice commands spoken by workers during various operations to be stored in the computer unit.
  • the feature extraction algorithms may also be one or more audio feature extraction algorithms, which may detect unique audio signals in the radio communications and/or audio signals. Examples of such algorithms are disclosed in the article “An Overview of Audio Event Detection Methods from Feature Extraction to Classification” by Babaee, Elham et al.
  • the unique audio signals may be warnings, alarms, machine, or tool related sounds and/or environmental sounds. Noise reduction, filtering, and spectral analysis of the audio signals may be used to extract the unique audio signals. Classifications for unique audio signals or parameters thereof may be set up by the user or administrator of the present system and stored in the computer unit.
  • the audio feature extraction algorithms may be trained to detect unique audio signals such as warning alarms, start or stop of engines or winches, operation of tools, or other characteristic operational sounds. This allows the computer unit to automatically detect unique audio signals associated with or affecting the performance of operations.
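  • A simplified sketch of detecting one such unique audio signal, here a warning alarm assumed to occupy a known frequency band, from a spectrogram of the audio signal; the band and threshold are illustrative:

```python
import numpy as np
from scipy.signal import spectrogram

def alarm_detected(audio: np.ndarray, sample_rate: int,
                   band=(2500.0, 3500.0), threshold=1e-3) -> bool:
    """Flag whether spectral energy in an assumed alarm band exceeds a threshold."""
    freqs, _, sxx = spectrogram(audio, fs=sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_energy = sxx[in_band, :].mean()
    return bool(band_energy > threshold)
```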
  • a transcription (speech-to-text) algorithm may be implemented in the computer unit, wherein the transcription algorithm may perform an automatic transcription of the radio communications.
  • the transcription may then be stored in the database for later analysis.
  • the radio communications may be conducted via encrypted communications links to prevent unauthorized access to the radio communications.
  • An encrypted communications link may be established between the individual radio communications devices and/or between the radio communications devices and the data processing apparatus.
  • An encrypted or un-encrypted link may also be established between the microphones and the data processing apparatus.
  • telemetry data from one or more telemetry devices located within the industrial site is transmitted to the data processing apparatus, wherein the computer unit extracts at least one unique feature and/or process segment from the telemetry data using the feature extraction algorithms.
  • the telemetry data of one or more telemetry devices may be transmitted to the data processing apparatus and stored in the database for data analysis.
  • the telemetry device may receive data from one or more sensors or data logging systems located on the object.
  • the telemetry data may be representative of the operating conditions or performance of machines or tools, health conditions of workers, environmental conditions, or the like.
  • the telemetry data may also comprise accelerations from accelerometers, movements from gyroscopes, or compass headings from magnetometers.
  • the position data may form part of the telemetry data.
  • the telemetry data may be standard telemetry data.
  • the term “standard telemetry data” should be understood as telemetry data that is presented in accordance with an open or globally recognised standard, such as ISO 15143-3. This allows for easy access to third-party data provided by multiple OEMs.
  • the feature extraction algorithm may extract one or more operational states from the telemetry data.
  • the operational states may be representative of conditions relating to an operation of the industrial process.
  • Reference parameters or thresholds may be set up by the user or administrator of the present system and stored in the computer unit.
  • the selected telemetry signal may be compared to the reference parameters or thresholds in the computer unit, which may be used to detect certain signal characteristics or to determine if the telemetry signal is within safe operating conditions.
  • the reference parameters may also be used to detect if machines or tools are turned on, or if certain actions of the machines or tool have commenced. This allows the computer unit to automatically determine unique operational states associated with an operation based on the telemetry data.
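  • A minimal sketch of comparing a selected telemetry signal against reference parameters or thresholds stored in the computer unit; the parameter names and limits are illustrative:

```python
def check_telemetry(signal: list, limits: dict) -> dict:
    """Compare a telemetry signal against reference thresholds and flag the operating state."""
    latest = signal[-1]
    return {
        "running": latest > limits.get("on_threshold", 0.0),      # e.g. machine or tool turned on
        "within_safe_range": limits["min"] <= latest <= limits["max"],
    }

# Example: hoisting-load signal (tonnes) checked against site-specific reference parameters.
state = check_telemetry([12.4, 13.1, 13.8], {"on_threshold": 1.0, "min": 0.0, "max": 15.0})
```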
  • telemetry data from at least one crane unit and/or image data of the at least one crane unit is transmitted to the data processing apparatus, wherein the computer unit determines at least one unique feature and/or process segment of a crane operation using the feature extraction algorithms and/or the computer vision algorithms.
  • the present method allows the present system to detect events that involve the use of a crane unit, since crane operations are sensitive to environmental and load conditions and need to be monitored carefully to prevent any accidents.
  • Telemetry data of the crane unit may thus be transmitted to the data processing apparatus, wherein the telemetry data is analysed in the computer unit.
  • the feature extraction algorithms implemented in the computer unit may be used to determine the operational states of a crane operation.
  • the computer unit may also analyse the image data using the computer vision algorithms.
  • the computer vision algorithms may thus be used to also determine the operational states of the crane operation.
  • meteorological data from a meteorological unit or system is transmitted to the data processing apparatus, optionally the computer unit extracts at least one environmental feature from the meteorological data using the feature extraction algorithms.
  • the meteorological data from one or more meteorological units or systems may be transmitted to the data processing apparatus and stored in the database.
  • the meteorological data may be representative of the environmental conditions relating to the industrial site.
  • the meteorological data may comprise wind speed, temperature, lightning, humidity, precipitation, visibility, cloud cover, air pressure, wave height, wavelength, wave swell, wave direction, or other relevant weather, wind data, or wave data.
  • the computer unit may extract one or more environmental features of the meteorological data using the feature extraction algorithms.
  • the environmental feature may be related to an operational state of an operation.
  • Reference parameters or thresholds may be set up by the user or administrator of the present system and stored in the computer unit.
  • the selected meteorological signal may be compared to the reference parameters or thresholds in the computer unit, which may determine if the meteorological signals are within safe operating conditions.
  • This allows the computer unit to automatically determine meteorological features influencing an operational state based on the meteorological data. This further allows users to evaluate the root-cause of abnormal situations, such as delays, failures, or accidents, in an easier and less data-heavy investigation.
  • the meteorological data may be actual historical weather data or weather forecasts hereof, as planning decisions are often made based on the forecast. Actual historical weather data can be used to determine e.g., abort scenarios, in which actual weather conditions exceeded safe working conditions despite the weather forecast predicting safe weather conditions.
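  • A small sketch of comparing actual historical weather data with the corresponding forecast to detect such abort scenarios; the safe working limit and data layout are illustrative:

```python
def abort_scenarios(forecast: list, actual: list, safe_limit: float) -> list:
    """Return indices of intervals where the forecast was within limits but the actual value was not."""
    return [i for i, (f, a) in enumerate(zip(forecast, actual))
            if f <= safe_limit and a > safe_limit]

# Example: hourly wind speed (m/s) against a safe working limit for a crane lift.
indices = abort_scenarios(forecast=[8.1, 9.0, 9.5], actual=[8.4, 11.2, 12.6], safe_limit=10.0)
```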
  • the method further comprises logging additional data into the data processing apparatus using a user terminal, wherein the logged data is representative of conditions related to a selected process segment and/or event.
  • the present method allows workers to access the present system and manually input additional data relating to one or more operational states of the industrial process.
  • the additional data may be logged into the present system via a dedicated user interface.
  • the additional data comprises image data (videos or still pictures), position data, text messages, measurements, or the like.
  • the user interface may be set up so that the worker can select text messages from a list or menu, upload image data or position data, record measurements or voice messages, or manually enter text.
  • the computer unit may automatically link the additional data to a dedicated operational state or operation based on the logged data.
  • the worker may select a dedicated operational state or operation when logging data.
  • the dedicated user interface may be implemented on a mobile or handheld device, a user terminal, or a computer device in communication with the data processing apparatus.
  • the worker may access a web-based user interface on the data processing apparatus.
  • the method further comprises at least one of:
  • the path of objects moving between various locations at the industrial site may be tracked by the computer unit using the image data and/or the position data. Further, multiple objects of the same type or different types may be tracked relative to each other or relative to one or more fixpoints. Computer vision algorithms may be used to identify types of objects. Optionally, the distance between objects and/or between a selected object and a fixpoint may be calculated based on the image data and/or the position data. This allows the user of the present system to identify the current position of objects within the industrial site, and to determine if objects are spaced too far apart or too close relative to each other. This also allows the user of the present system to evaluate the path of movement of objects and optimize the flow within the industrial site.
  • event classifications may be used to identify types of events involving a particular object, such as dedicated operations and operational states thereof.
  • object state classifications may be used to identify types of object states.
  • the event classifications may also be used to identify types of events that involve larger, assembled object units. This allows the user of the present system to monitor various actions performed on the objects during one or more operations of the industrial process. The various actions may be loading or unloading of objects, preparation of objects before the next operation, lifting or lowering of objects into alignment, or assembly of objects into a larger object unit.
  • the computer unit may also determine the position of a machine or tool relative to a dedicated point of operation and/or to a selected object based on the image data and/or the position data. This allows the user of the present system to evaluate the placement of various machines or tools and objects relative to the dedicated point of operation. This reduces waiting time and allows for a smoother operation.
  • the image data is anonymised using an anonymisation algorithm implemented in the data processing apparatus, wherein the anonymised image data is stored in a database.
  • the raw image data may be stored locally at the industrial site, e.g., in the local node or in the camera.
  • the raw image data may be encrypted before being transmitted to the data processing apparatus.
  • the encrypted image data may then be decrypted, and the raw image data may be anonymised using an anonymisation algorithm implemented in the computer unit.
  • the anonymised image data may then be stored in another database, e.g., located at the industrial site or at a remote location. This allows the present method to comply with data protection law.
  • the data processing and data storage may thus be performed off-site, thereby reducing the amount of local data processing and reducing the installation costs of the present system.
  • the data processing and data storage may be performed on-site, thus reducing the amount of data transmission. Users may then access the local or remote data processing apparatus via the user terminal using dedicated user profiles.
  • An initial data processing, e.g., the anonymisation of the image data, may be performed locally at the industrial site.
  • the computer vision algorithms and/or the feature extraction algorithms may further be implemented in a local data processing apparatus or in the camera units. This reduces the need for remote data processing as any less demanding data processing algorithms can be performed locally.
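  • One possible sketch of the anonymisation step, blurring detected face regions in an image frame before storage; the detector (OpenCV's bundled Haar face cascade) and blur settings are assumptions chosen for illustration:

```python
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymise_frame(frame):
    """Blur detected face regions so the frame can be stored in compliance with data protection rules."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame
```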
  • the computer unit may automatically log one or more timestamps for each event using the meta data of the image data. For example, the computer unit may determine a start timestamp, an end timestamp, and any other timestamps of the event. The time period for the event may be calculated as the difference between the start timestamp and the end timestamp. The respective timestamps may be stored in the database together with the structured dataset.
  • the timestamps may also be used to identify the image segment of that event in the stored image data.
  • a copy of the image segment may be stored together with the structured dataset.
  • the computer unit may use the internal clock of the data processing apparatus to log the timestamps of the events.
  • the internal clock may also be used to verify or synchronize the timestamps of one or more of signals inputted to the data processing apparatus.
  • the timestamps of one inputted signal may be used to verify the timestamps of another inputted signal. This allows for the correction of incorrect or missing timestamps in the inputted signals.
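  • A small sketch of deriving an event's time period and the corresponding image segment (as a frame-index range) from its timestamps; the frame-rate handling is illustrative:

```python
from datetime import datetime

def image_segment(start: datetime, end: datetime, video_start: datetime, fps: float):
    """Map event start/end timestamps to a frame-index range in the stored image data."""
    duration = (end - start).total_seconds()                 # time period of the event
    first = int((start - video_start).total_seconds() * fps)
    last = first + int(duration * fps)
    return first, last, duration
```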
  • the method further comprises steps of:
  • a training dataset may be used to train the algorithms implemented in the computer unit using machine learning to automatically detect and identify various events and objects.
  • the training dataset may be historic data from a previous industrial site or previous data from the industrial site.
  • the training dataset may comprise the events, the image data, the inputted data, or any combination thereof.
  • An administrator of the present system may evaluate the output of the computer unit after each training run and correct any errors and/or missed objects or events in the output. Once the evaluation is completed, a subsequent training run may be performed. The training session may be repeated until an acceptable level of performance is achieved.
  • the training dataset may be updated upon request, at regular intervals or after each training run. New objects and/or events may be added to the classifications, and/or the parameters for existing objects or events may be adjusted.
  • the algorithms implemented in the computer unit may thus be trained to detect objects and/or events unique for a particular industrial site, as the size and complexity of each industrial site may vary during the overall industrial process.
  • the classifications may comprise standardized classes, customized classes, or a combination thereof, which may be updated during evaluation and training of the computer unit.
  • the events of the industrial process are arranged in a chronological or predetermined order in the structured dataset.
  • the computer unit may arrange the events in accordance with a predetermined order descriptive of the industrial process.
  • the events may be arranged in a chronological order. This allows users to identify if events are performed in an incorrect order, or if events are not started or waiting for the completion of other events. This also allows users to easily identify any missing events of a particular sequence of operations.
  • the data received by the data processing apparatus may be processed in batches or in real-time, and the structured dataset may be updated continuously based on the detected events. This allows users of the present system to view the ongoing events and the completed events, thus allowing for an easy status of the progress of the industrial process.
  • a statistical analysis is performed by the computer unit on the structured dataset using the first and second timestamps and/or the time period, wherein the results of the statistical analysis are stored in the database.
  • a statistical analysis may be performed by the computer unit on the stored structured dataset using the timestamps and/or the time period of the events, where the results of the statistical analysis may also be stored in the database for later analysis.
  • the results may include average times, time periods between selected events, and other relevant statistical data for each operation and the overall industrial process. This allows users of the system to monitor the time spent on each event as well as when a particular event was started, stopped, or ended.
  • One or more parameters of the statistical data may be used as references for detecting abnormal events, and such abnormal events may be used for later analysis and optionally retraining of the data processing apparatus.
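  • A minimal sketch of such a statistical analysis, computing average durations per event type and flagging abnormal events with a simple deviation rule; the rule itself is an illustrative choice:

```python
import statistics

def analyse_durations(durations: dict) -> dict:
    """Per event type: mean/stdev of durations (seconds) and events flagged as abnormal."""
    results = {}
    for event_type, values in durations.items():
        mean = statistics.mean(values)
        stdev = statistics.pstdev(values)
        results[event_type] = {
            "mean_s": mean,
            "stdev_s": stdev,
            # Flag durations more than two standard deviations above the mean as abnormal.
            "abnormal": [v for v in values if v > mean + 2 * stdev],
        }
    return results
```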
  • the structured dataset and/or the results of the statistical analysis may be presented to users of the present system as a report or in a dedicated graphic user interface.
  • One object of the present invention is achieved by a system for determining at least one event of an industrial process at an industrial site, said system comprising:
  • At least one sensor configured to collect sensor data representing conditions at or related to the industrial site
  • a data processing apparatus comprising a computer unit, wherein the sensor(s) is connected to the data processing apparatus via a communications link for inputting the collected data to the data processing apparatus, wherein the computer unit of the data processing apparatus is adapted to execute the steps of any one of the embodiments of the method described herein.
  • the system comprises at least one camera configured to capture image data of at least a part of the industrial site, the image data comprising a plurality of image frames, and wherein the camera(s) is connected to the data processing apparatus via a communications link for inputting the image data to the data processing apparatus.
  • one object of the present invention is achieved by a system, comprising:
  • At least one camera configured to capture image data of at least a part of the industrial site, the image data comprising a plurality of image frames,
  • a data processing apparatus comprising a computer unit, wherein the cameras are connected to the data processing apparatus via a communications link for inputting the image data to the data processing apparatus,
  • the data processing apparatus being configured to further receive other data from other sources via another communications link, the other data being representative of conditions at or related to the industrial site,
  • the present system can also be used for analysing and optimizing the progress of the industrial process, or events thereof.
  • the configuration of the present system is versatile and can be adapted to different applications and to dynamic changes in the mapping at the site.
  • Conventional monitoring systems for stationary production sites are not suited for dynamic sites without changing the system setup and modifying the algorithms in the computer unit.
  • the cameras are arranged relative to the industrial site to capture image data of the industrial site, or sub-areas thereof, wherein the resolution and frame rate are selected based on the application and type of industrial site.
  • Each camera is connected to the data processing apparatus via a first communications link, e.g., an encrypted communications link.
  • the data processing apparatus comprises at least a computer unit and a database in communication with the computer unit, wherein a computer program is implemented on the computer unit and configured to execute the method steps described above.
  • One or more computer vision algorithms are implemented in the computer unit for determining a plurality of operational states of an operation based on the image data.
  • the data processing apparatus uses classification algorithms and custom logics to determine an operation of the industrial process based on the operational states.
  • the data processing apparatus is configured to communicate with a positioning system via a communications link, wherein the positioning system is configured to input position data to the data processing apparatus.
  • the data processing apparatus may be configured to determine the position of one or more objects within a coordinate system defining the industrial site.
  • the data processing apparatus may further be configured to track the motion of these objects throughout the industrial site using the position data and/or the image data.
  • the data processing apparatus may communicate with a server of a third-party provider, or an integrated positioning system of the present system.
  • the positioning system may be a Global Navigation Satellite System (such as GPS or GLONASS), a Local Positioning System (such as a radio, BLE, WIFI, UWB, or LIFI- based positioning system) or an Indoor Positioning System (such as HPPS).
  • the computer unit is configured to combine the image data from multiple cameras into a multi-dimensional mapping of the industrial site.
  • a plurality of cameras may be positioned relative to the industrial site and angled so that they cover the monitored area from different angles.
  • the inputted image data may be combined in the computer unit to generate a multi-dimensional mapping of the monitored area, e.g., a 2D or 3D map. This allows for a better tracking of the objects as the installation often requires the objects to be moved in multiple directions within the coordinate system before being mounted or assembled to another object or structure.
  • the system further comprises:
  • radio communications devices configured to communicate with each other via a radio communications link, the radio communication between the radio communications devices being transmitted to the data processing apparatus, and/or
  • At least one microphone connected to the data processing apparatus, wherein the at least one microphone is configured to pick up audio signals from at least one position at the industrial site.
  • the data processing apparatus may comprise a radio scanner unit configured to scan the radio frequencies used by a plurality of radio communications devices to conduct radio communications between workers.
  • the radio scanner unit may be configured to tune to a particular radio channel being used and pick up the raw radio communication between the radio communications devices.
  • a dedicated radio unit may be set up to access the radio channel being used.
  • the radio communication between the radio communications devices may then be stored in the database of the data processing apparatus.
  • An encryption key may be entered into or stored in the radio scanner unit to access the radio frequencies if encrypted radio channels are used.
  • one or more microphones may pick up audio signals from one or more positions at the industrial site, wherein the audio signals may include voice communications between workers, various machine sounds, audio alarms, announcements over speakers, weather sounds, and the like.
  • the audio signals picked up by the microphones may be stored in the database of the data processing apparatus.
  • a speech recognition algorithm may be implemented in the computer unit to extract unique voice commands from the radio communications and/or the audio signals.
  • the computer unit may use these unique voice commands to further determine the operational states of an operation.
  • an audio feature extraction algorithm may be implemented in the computer unit to extract unique audio signals associated with the performance of an operation and/or other characteristic audio signals that indirectly or directly affect the performance of an operation.
  • classes comprising machine operating sounds may be used for determining machine operating conditions.
  • classes comprising weather related sounds may be used for determining environmental operating conditions.
  • a transcription (speech-to-text) algorithm may be implemented in the computer unit to generate a transcription of the radio communication. The transcription may be stored in the database for later analysis.
  • the system further comprises a plurality of telemetry devices located within the industrial site and/or at least one meteorological unit or system, wherein the telemetry devices and/or the meteorological unit or system are configured to communicate with the data processing apparatus via one or more communications links.
  • the data processing apparatus may be configured to communicate with a plurality of telemetry devices located within the industrial site for receiving telemetry data.
  • the telemetry data may be provided from a third-party provider.
  • a feature extraction algorithm may be implemented in the computer unit to extract operational states from the telemetry data, such as crane based operational states.
  • the data processing apparatus may establish a wired or wireless communications link with the telemetry devices or a server of the third-party provider. These operational states may also be used to determine one or more operations of the industrial process.
  • the data processing apparatus may be configured to communicate with a meteorological unit or system located relative to the industrial site for receiving meteorological data.
  • a third-party provider may provide the meteorological data.
  • the data processing apparatus may establish a wired or wireless communications link with the meteorological unit or system or a server of the third-party provider.
  • the meteorological data may be stored in the database and combined with the image data and/or position data to form a structured dataset.
  • the industrial site is an installation or shipping vessel, a shipping terminal, or an installation or construction site, a heavy loading or unloading area, or a production or pre-assembly factory or area.
  • the present method is suited for monitoring industrial sites where multiple objects are moved around and different events involving these objects are performed, particularly sites where repetitive events are performed but the location for these events may change during the overall process.
  • the industrial site may be a dynamic site, such as an installation or shipping vessel, where objects must be loaded and stored in a predetermined pattern for optimal usage of the available storage space, and where the objects must be installed or unloaded in a particular order.
  • the dynamic site may also be a shipping terminal (such as airports, seaports, or standalone inland terminals), where objects must be loaded or unloaded and optionally stored in a predetermined pattern, and where the movement of objects must be coordinated with the loading/unloading process to reduce idle time.
  • the dynamic site may also be a construction or installation site where objects are temporarily stored between usage or installed directly upon arrival, and where the mapping of the site changes during the construction process.
  • the dynamic site may also be a site (such as a temporary site) where various tasks or operations are performed within dedicated areas under controlled environments, and objects are moved between said areas during the industrial process.
  • the dynamic site may be a base port for the handling and loading of offshore components before installation.
  • the base port may comprise local assembly and/or production facilities so that the offshore components can be pre-assembled and/or produced onsite at the base port.
  • the industrial site is:
  • the present system can thus be used at sites where objects are moved around and where there is a need for monitoring repeatable events, allowing the user of the present system to easily track and monitor events and objects.
  • One object of the present invention is achieved by a computer program, comprising instructions which, when loaded and run on the system described above, causes the data processing apparatus to execute the method steps described earlier.
  • the present method is implemented into the data processing apparatus, wherein the computer unit is configured to execute the steps of the present method.
  • This allows the use and training of artificial intelligence, such as neural networks (e.g., FNN, CNN, RNN, attention networks, DNN or SNN), using machine learning to automatically determine the events of the industrial process.
  • One object of the present invention is achieved by a computer-readable medium, having stored thereon the computer program described above.
  • the present computer program is implemented on the computer-readable medium of a remote server (or server network) or a local computer unit, thus allowing the data processing and storage to be adapted to desired site configuration and application.
  • a hybrid version of the data processing apparatus may be used where parts of the computer program may be implemented in the remote server and parts of the computer program may be implemented on the local computer unit.
  • the data analysis, the data storage, the data evaluation, and the data access and interaction may be implemented on and executed by one or more remote servers or server networks.
  • This allows the present data processing apparatus to be configured as a local-based data service, a remote- or cloud-based data service, or a hybrid data service thereof.
  • the computer-readable medium may be a non-volatile memory or non-volatile storage.
  • One object of the present invention is achieved by use of the method of determining at least one event of an industrial process at an industrial site according to any one of the embodiments of the method described herein, wherein the industrial site changes during the industrial process.
  • One object of the present invention is achieved by use of the method of determining at least one event of an industrial process at an industrial site according to any one of the embodiments of the method described herein for asset management.
  • One object of the present invention may be achieved by use of the method according to any one of the embodiments disclosed herein to track component installation, maintenance activities, or operational status in the construction and/or service of offshore and/or onshore wind turbines and/or solar farms.
  • One object of the present invention may be achieved by use of the method according to any one of the embodiments disclosed herein to track the construction of large-scale highway bridges and/or tunnels, including but not limited to tunnels excavated through mountainous terrain or immersed tunnels constructed across ocean floors.
  • Fig. 1 shows a block diagram of a first embodiment of the method according to the present invention
  • Fig. 2 shows a first embodiment of the present system according to the present invention
  • Fig. 3 shows a second embodiment of the present system
  • Fig. 4 shows a third embodiment of the present system
  • Fig. 5 shows a fourth embodiment of the present system
  • Fig. 6 shows an example of synchronizing the image data and the other data
  • Fig. 7 shows an example of the structured dataset stored in the database
  • Fig. 8 shows an exemplary application of the present system for an offshore site
  • Fig. 9 shows an exemplary application of the present system for an onshore site
  • Fig. 10 shows another example of synchronizing the image data and the other data.
  • a number of cameras are positioned relative to the industrial site, each of which captures image data 1 of the industrial site.
  • the frame resolution, shutter speed and other settings of the cameras are optionally adapted to the industrial process performed at the industrial site.
  • the image data is inputted to a data processing apparatus via a communications link, e.g., an encrypted communications link, wherein the image data is processed and analysed 2 as mentioned later.
  • other types of data 3 are further inputted to the data processing apparatus together with the image data 1.
  • the other data 3 and the image data 1 are then processed and analysed 2 in the computer unit of the data processing apparatus.
  • the other data 3 are representative of conditions at or related to the industrial site, and some of the other data 3 may be provided by third-party providers.
  • the location data 3a is inputted via a positioning system, such as a global or local positioning system, in communication with the data processing apparatus.
  • the location data 3a is indicative of the locations of one or more objects within the industrial site.
  • the telemetry data 3b and/or the acceleration and gyroscopic data 3c may be inputted directly to the data processing apparatus from sensors or other devices arranged on objects located within the industrial site.
  • the telemetry data 3b and/or the acceleration and gyroscopic data 3c may also be inputted from a third-party provider, which manages the operation of one or more objects, such as cranes, vessels, machines, or the like, within the industrial site.
  • the location data 3a, the telemetry data 3b, and/or the acceleration and gyroscopic data 3c may be supplied by the same third-party provider.
  • the meteorological data 3d is inputted via a local meteorological unit or a third-party provider to the data processing apparatus.
  • the meteorological data 3d is representative of the environmental conditions relating to the industrial site.
  • the meteorological data 3d includes, but is not limited to, historical weather data, and/or weather forecasts.
  • the radio communications 3e between radio communications devices located within the industrial site is further inputted, e.g., directly, to the data processing apparatus.
  • the radio communications 3e is optionally encrypted to prevent unauthorised access to the radio communication.
  • Workers at the industrial site can manually log further data relating to the conditions at the industrial site or to operations performed at the industrial site.
  • the inputted image data 1 and the inputted other data 3 are analysed 2 by a computer unit in the data processing apparatus to determine a number of operational states within the image data and the other types of data 3.
  • the operational states are preferably determined by the computer unit using computer vision algorithms, feature extraction algorithms, or a combination thereof.
  • the inputted data 1, 3 are filtered and pre-processed into signals or signal ranges suitable for the subsequent data analysis.
  • the operational states are then evaluated 4 by the computer unit to determine a number of events 4a using classification algorithms, custom logics, or a combination thereof.
  • the events 4a determined by the computer unit are then synchronised with the image data and the other data to form a structured dataset 4b.
  • the structured dataset 4b along with the events 4a determined by the computer unit are then stored in a database 5.
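The walkthrough of Fig. 1 above can be illustrated with a minimal Python sketch: operational states are extracted from timestamped samples and a simple rule combines them into an event. All names (OperationalState, detect_states, evaluate_events), the example features and the lift event are hypothetical placeholders, not the claimed implementation.

```python
# Illustrative pipeline sketch (not the claimed implementation): operational
# states are extracted from timestamped samples and combined into an event.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class OperationalState:
    label: str           # e.g. "hook_attached" - hypothetical state label
    timestamp: datetime  # UTC timestamp taken from the sample metadata
    source: str          # "image", "telemetry", ...

def detect_states(samples):
    """Stand-in for the data analysis step 2 (computer vision / feature extraction)."""
    states = []
    for ts, source, features in samples:
        if features.get("crane_hook_closed"):       # assumed unique feature
            states.append(OperationalState("hook_attached", ts, source))
        if features.get("load_above_deck"):         # assumed unique feature
            states.append(OperationalState("load_lifted", ts, source))
    return states

def evaluate_events(states):
    """Stand-in for the evaluation step 4 (classification / custom logic)."""
    by_label = {s.label: s for s in states}
    if "hook_attached" in by_label and "load_lifted" in by_label:
        return [{
            "event": "lift_started",                                   # event 4a
            "start": by_label["hook_attached"].timestamp,
            "end": by_label["load_lifted"].timestamp,
        }]
    return []

samples = [
    (datetime(2024, 5, 1, 8, 0, tzinfo=timezone.utc), "image",
     {"crane_hook_closed": True}),
    (datetime(2024, 5, 1, 8, 7, tzinfo=timezone.utc), "telemetry",
     {"load_above_deck": True}),
]
events = evaluate_events(detect_states(samples))
print(events)  # these events would be synchronised into dataset 4b and stored (5)
```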
  • Fig. 2 shows a first embodiment of the system according to the present invention.
  • the data processing apparatus is configured as a hybrid data service, where parts of the data processing 2’ (ref. data analysis) are performed locally while other parts of the data processing 2” are performed remotely.
  • the image data 1 and other data 3 are captured or measured locally 7 using cameras, sensors, or other data recording devices 8a, 8b.
  • the image data 1 and other data 3 are inputted to a local data processing unit where the inputted data 1, 3 are at least analysed 2’ by a local computer unit in the local data processing unit.
  • the data outputted from the local data processing unit is transmitted to a remote data processing unit, wherein the outputted data is at least evaluated remotely 9 for determining the events 4a.
  • the data outputted from the local data processing unit together with the data 8c from third-party providers are then evaluated by the remote computer unit to determine the events 4a and generate the structured dataset 4b.
  • the events 4a and structured dataset 4b are then stored in a remote database 5.
  • Users are then able to access the remote data processing apparatus by a dedicated API interface using their respective user profiles.
  • the user can then view or download the events 4a and structured dataset 4b by transmitting data requests 10 to the remote data processing apparatus.
  • the user may request corrections or changes to the structured dataset 4b via the API interface.
  • Fig. 3 shows a second embodiment of the present system.
  • the data processing apparatus is configured as a remote data service, where the data processing 2 is performed remotely.
  • the image data 1 and other data 3 are captured or measured locally 7’ using cameras, sensors, or other data recording devices 8a, 8b.
  • the image data 1 and other data 3 are transmitted to the remote data processing unit, wherein the received data together with the data 8c inputted from third-party providers are analysed and evaluated remotely 9”.
  • the computer unit in the remote data processing unit determines the events 4a and generates the structured dataset 4b.
  • the events 4a and structured dataset 4b are then stored in the remote database 5.
  • Fig. 4 shows a third embodiment of the system.
  • the data processing apparatus is configured as an alternative remote data service, where the data storage 5 is performed in a remote database 5’ separate from the data processing 2.
  • the computer unit in the remote data processing unit determines the events 4a and generates the structured dataset 4b.
  • the events 4a and structured dataset 4b are then transmitted to the remote database 5’ for data storage.
  • the remote database 5’ is in communication with the remote data processing apparatus for accessing the stored data.
  • Fig. 5 shows a fourth embodiment of the system.
  • the data processing apparatus is configured as a local data service, where the data processing 2 is performed locally.
  • the image data 1 and other data 3 are captured or measured locally 7” using cameras, sensors, or other data recording devices 8a, 8b.
  • the image data 1 and other data 3 are inputted to the local data processing unit, wherein the inputted data together with the data 8c inputted from third-party providers are analysed and evaluated locally 7”.
  • Figs. 6 and 10 show two examples of synchronizing the image data 1 and the other data 3.
  • the image data 1 and the other data 3 are synchronised using a timestamp signal 11.
  • the data associated with each event 4a in the structured dataset 4b are synchronised using this timestamp signal 11.
  • One or more timestamps 11a are logged and linked to each event 4a, thus allowing the user to view at least the start, the end, and the time period of each event 4a.
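As a small, hedged illustration of the timestamp linkage 11a described above, the sketch below derives the time period of an event from its start and end timestamps; the event label and times are made-up example values, not data from the invention.

```python
# Hedged example of timestamps 11a linked to an event 4a; values are made up.
from datetime import datetime, timezone

event = {
    "label": "blade_lift",
    "start": datetime(2024, 5, 1, 8, 0, 0, tzinfo=timezone.utc),
    "end":   datetime(2024, 5, 1, 8, 42, 30, tzinfo=timezone.utc),
}
event["duration_s"] = (event["end"] - event["start"]).total_seconds()
print(event["label"], event["duration_s"])  # blade_lift 2550.0
```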
  • Fig. 7 shows an example of the structured dataset 4b stored in the database 5.
  • the dataset 4b is presented as a matrix where the relevant inputted data 1, 3 for each event 4a are listed together with the number of events 4a determined by the computer unit.
  • the matrix may be changed or altered upon request from the users, wherein the administrator of the present system may control the configuration of the matrix.
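To make the matrix layout of the structured dataset 4b concrete, here is a non-authoritative sketch using pandas: one row per determined event 4a, one column per linked piece of inputted data. The column names, file names and values are illustrative assumptions only; the actual layout is configurable by the administrator as stated above.

```python
# Illustrative sketch of the matrix-like structured dataset 4b (assumed schema):
# one row per determined event 4a, one column per linked piece of inputted data.
import pandas as pd

rows = [
    {"event": "blade_lift", "start": "2024-05-01T08:00:00Z",
     "end": "2024-05-01T08:42:30Z", "camera_clip": "cam14_0800_0843.mp4",
     "crane_telemetry": "crane_0800_0843.csv", "wind_speed_avg_ms": 7.2},
    {"event": "tower_bolting", "start": "2024-05-01T09:10:00Z",
     "end": "2024-05-01T11:05:00Z", "camera_clip": "cam14_0910_1105.mp4",
     "crane_telemetry": None, "wind_speed_avg_ms": 8.1},
]
dataset = pd.DataFrame(rows)   # the "matrix" that would be stored in database 5
print(dataset.to_string(index=False))
```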
  • Fig. 8 shows an exemplary application of the present system for an offshore site.
  • the industrial site is an offshore site, such as an installation vessel 12 for erection of a wind turbine 13.
  • the installation vessel 12 comprises a crane unit for lifting the various wind turbine components into position relative to each other.
  • Cameras 14 are positioned at predetermined locations on the offshore site, wherein the cameras 14 are used to monitor areas of interest. Other sensors (not shown) are further arranged at the offshore site for inputting other data 3 to the data processing apparatus 15.
  • the operation of at least the crane unit can be monitored using the cameras 14 and the sensors (not shown), wherein image data and measured data of the crane operations are transmitted to the data processing apparatus 15 for data analysis and evaluation, thereby allowing at least events involving the use of the crane unit to be determined by the computer unit in the data processing apparatus 15.
  • Fig. 9 shows an exemplary application of the present system for an onshore site.
  • the industrial site is an onshore site, such as an erection site of a wind turbine 13.
  • the installation involves the use of a crane unit 16 for lifting the various wind turbine components into position relative to each other.
  • Cameras 14 are positioned at predetermined locations at the onshore site, wherein the cameras 14 are used to monitor areas of interest. Other sensors (not shown) are further arranged at the onshore site for inputting other data 3 to the data processing apparatus 15.
  • the operation of at least the crane unit 16 can be monitored using the cameras 14 and the sensors (not shown), wherein image data and measured data of the crane operations are transmitted to the data processing apparatus 15 for data analysis and evaluation, thereby allowing at least events involving the use of the crane unit 16 to be determined by the computer unit in the data processing apparatus 15.

Abstract

The present invention relates to a method of determining at least one event of an industrial process at an industrial site, comprising acts of collecting sensor data representing conditions at or related to the industrial site, inputting the collected data to a data processing apparatus, performing data analysis on one or more of the inputted data using feature extraction to determine at least one event, logging a timestamp and a position of the at least one determined event, synchronising the at least one event with the data inputted to the data processing apparatus to form a structured dataset, and storing said structured dataset in a database. The timestamp is synchronized to an external clock operating a globally unique time system. The present invention also relates to a system configured to perform the method. The present invention further relates to a computer program configured to execute the method and a computer-readable medium configured to store the computer program.

Description

A method of determining an event of an industrial process at an industrial site and a system thereof
Field of the Invention
The present invention relates to a method of determining at least one event of an industrial process at an industrial site, comprising acts of collecting sensor data representing conditions at or related to the industrial site, inputting the collected data to a data processing apparatus, performing data analysis on one or more of the inputted data using feature extraction to determine at least one event, logging a timestamp and a position of the at least one determined event, synchronising the at least one event with the data inputted to the data processing apparatus to form a structured dataset, and storing said structured dataset in a database. The timestamp is synchronized to an external clock operating a globally unique time system.
The present invention also relates to a system configured to perform the method.
The present invention further relates to a computer program configured to execute the method and a computer-readable medium configured to store the computer program.
Background of the Invention
Large construction or installation projects are known to suffer from delays, design changes, unforeseen circumstances, and exceeding budgets, which can lead to claim disputes between the client and the contractor or sub-contractors. The construction or installation process is governed by a contract between the parties, which specifies terms and conditions for payments, project milestones, quality of work, breach of contract, deadline for completion of work and other information. The construction process normally consists of a plurality of various subtasks that must be completed before the construction is completed.
The contractor and sub-contractors must submit a payment claim and supporting documentation for the completed work in accordance with the contract. The client then assesses the completed work to verify the payment claim. Any discrepancies between the claimed work and the assessed work are settled in accordance with the terms laid out in the contract.
Therefore, there is a desire to monitor the progress of the construction or installation, as well as the completion of subtasks and other key events. For insurance reasons, it may be desired to monitor certain areas or the entire construction site. Currently, it is known to use cameras to capture images of the monitored areas and link boxes, or data loggers, to gather operating data from the construction machinery. Workers may also use logbooks to manually log timestamps, notes, and other information related to the construction progress.
In the event of an accident, the stored image data, operating data, and logged data can then be analysed for insurance reasons. Settling disputes often requires the identification of specific tasks and the start and stop times of these tasks. This may be achieved by manually analysing the stored and logged data, which can be time consuming and labour intensive.
It is known that objects of interest may be tracked automatically by processing the captured image data using object detection algorithms or machine learning algorithms implemented in a remote server or local data processing unit. Such algorithms can be trained and evaluated via various available programming platforms provided by software providers using either open-source datasets or custom datasets.
WO 2021/110226 A1 discloses a method and system for monitoring an industrial site, comprising a plurality of cameras connected to a data processing apparatus. The raw image data is processed in the data processing apparatus by an object detection algorithm, where the detected object is anonymised in the image data and the altered image data is stored in a database for later inspection. This solution focuses on anonymising image data so it may be stored for later analysis.
WO 2018/191555 A1 discloses a method of recognising repeating actions of an overall assembly line process using deep learning, wherein a neural network in communication with a long-short term memory is trained to detect actions and compare them to a benchmark process. Abnormalities in the detected actions are determined by the neural network based on the benchmark process, and video snippets of these abnormalities may be stored for later analysis. This solution is designed specifically for monitoring assembly lines for automotive parts or computer parts, and uses only two-dimensional data in the data processing.
US 11175650 B2 discloses another system for monitoring an assembly line process using sensors arranged at working stations and workers interacting with the working stations. The neural network and long-short term memory are trained to detect objects and actions from the sensor streams using deep learning, which relate to assembly steps of an overall assembly process. The workers or robots may interact with the system at the working stations, e.g., via user terminals. The neural network is configured to generate a set of performance data, which can be compared with a set of reference data. During optimization, operators may access portions of the set of data for optimising the assembly process. This solution is designed specifically for real-time monitoring of the assembly of components.
US 11321944 B2 discloses a system for monitoring repeating actions in an overall process, where a neural network and long-short term memory are trained to detect actions in video data. The neural network is trained to detect the start, the end, and other details of each action based on the image frames, and to generate statistical data based on the details of the actions. This solution is designed specifically for monitoring assembly lines in a factory.
US2021/0117684 discloses a system for detecting cycle data using object and motion properties of a single object with the aim of optimizing a repetitive production at a stationary production line.
The abovementioned systems are all limited to monitoring repeating actions of an overall process, particularly of assembly lines in automotive and computer part manufacturing factories. These systems are limited to use at stationary sites, where the actions are performed at fixed stations or tables.
Other management systems for monitoring a construction site are disclosed in CN 111429107 A, CA 3015879 A1, CN 111242574 A, and US 2021/0004591 A1.
Object of the Invention
One object of the present invention is to solve the problems of the abovementioned prior art.
Another object of the present invention is to provide a method and system that allows for determining an event of an industrial process at an industrial site, including dynamic sites.
Another object of the present invention is to provide a method and system that allows for time stamping and synchronizing an event to an external clock operating a globally unique time system. Preferably, the method and system also allow for determining a specific global position of the event.
Description of the Invention
One object of the present invention is achieved by a method of determining at least one event of an industrial process at an industrial site, comprising acts of:
- collecting sensor data representing conditions at or related to the industrial site,
- inputting the collected data to a data processing apparatus comprising a computer unit,
- performing data analysis on one or more of the inputted data in the computer unit using one or more feature extraction algorithms configured to determine a number of unique features, wherein a combination of said unique features represents one or more process segments and/or events,
- evaluating said combination of unique features in the computer unit to determine at least one event,
- logging a timestamp and a position of the at least one determined event, and
- synchronising the at least one event with the data inputted to the data processing apparatus to form a structured dataset, and storing said structured dataset in a database, wherein said timestamp is synchronized to an external clock operating a globally unique time system.
The present method can be used not only for analysing and optimizing the progress of the industrial process, or events thereof, but, just as importantly, for establishing when an event of the industrial process occurred. The present method is versatile, can be adapted to different applications and to changes in the dynamic mapping at the site, and is able to detect events around the entire site. Conventional methods for high-volume, in-line production sites, i.e., working stations along a production line, are not suited for use at low-volume production or assembly sites without changes to the monitoring setup as well as significant modification to the implemented computer vision algorithms. Further, conventional monitoring systems are limited to evaluating the performance at fixed workstations at factories of the automobile or electronics industry.
Here, the term “industrial process” is defined as an overall process being performed at the industrial site, wherein the industrial process is composed of a plurality of interrelated operations (or actions) which may be performed in parallel or in a sequential order. Further, a plurality of process segments also referred to as operational states (or tasks) is performed during each operation. Hence the process and the process segments have a duration in time.
An event is defined by a specific time and may be defined by a combination of conditions or features in the data or a relation between data at the specific time.
A process segment consists of a minimum of two events, e.g. a start event and an end event.
The more precisely the specific times of the features are synchronized, the more precisely the event can be determined to have occurred at a specific time and place. Thus, to ensure absolute and correct synchronisation of the multiple data sources, the timestamp must have an absolute reference to an external clock running a globally unique time system such as coordinated universal time (UTC), international atomic time (TAI), GPS time or similar. This guarantees consistency, accuracy, and resistance to timing discrepancies caused by drift, system latency, or local clock variations.
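As a hedged illustration of anchoring the timestamp to an external clock, the sketch below corrects the local clock with an NTP offset to obtain a UTC-referenced event timestamp. It assumes the third-party ntplib package and network access to a public NTP pool; a deployed system might equally use a GPS-disciplined or PTP clock source, which the description leaves open.

```python
# Sketch of a UTC-referenced event timestamp using an NTP offset correction.
# Assumes the third-party `ntplib` package and network access; a deployed
# system might instead use a GPS-disciplined or PTP clock source.
from datetime import datetime, timezone
import ntplib

response = ntplib.NTPClient().request("pool.ntp.org", version=3)
corrected = datetime.now(timezone.utc).timestamp() + response.offset
event_timestamp_utc = datetime.fromtimestamp(corrected, tz=timezone.utc)
print(event_timestamp_utc.isoformat())
```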
In one embodiment, the method comprises further acts of:
- determining at least one process segment based on the determined unique features,
- evaluating said determined process segment in the computer unit to determine at least one event.
In one embodiment, the position is synchronised to a globally unique geospatial positioning reference system.
I.e. the position has an absolute relation to a globally unique geospatial positioning reference system. Thus, the method allows for using a local position coordinate system arranged relative to a global position reference system.
The position may have a unique and well-defined relation to a globally unique geospatial positioning reference system ensuring absolute and unambiguous location data. This may be achieved by integrating satellite-based navigation systems such as GPS or GLONASS, or geodetic reference frames such as WGS84 or ITRF, and additional augmentation sources such as an IMU (inertial measurement unit). By leveraging these positioning references, the system can ensure high accuracy, consistency, and resistance to signal drift or loss, thereby providing a precise and persistent positioning framework.
An absolute and unambiguous position, ensured by relating to a globally unique geospatial positioning reference system, can also enable relating specific operations, process segments or events to unique object IDs, such as serial numbers. This may be important to document that e.g. a specific construction was completed or serviced. Unambiguous positional data may also be an important data parameter to document the construction and service history of a given object such as a wind turbine, both onshore and offshore, combined with e.g. historical meteorological data.
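The sketch below is one possible, simplified way to give a locally measured position an absolute WGS84 reference: a flat-earth conversion of east/north offsets around a surveyed reference point. The reference coordinates and the approximation itself are assumptions for illustration; a production system would typically use a proper geodetic library together with the GNSS/IMU sources mentioned above.

```python
# Flat-earth approximation (illustrative only) converting a site-local offset
# in metres into WGS84 latitude/longitude around a surveyed reference point.
import math

REF_LAT, REF_LON = 55.4719, 8.4467   # assumed surveyed site reference (WGS84)
WGS84_A = 6_378_137.0                # WGS84 semi-major axis in metres

def local_to_wgs84(east_m, north_m):
    dlat = math.degrees(north_m / WGS84_A)
    dlon = math.degrees(east_m / (WGS84_A * math.cos(math.radians(REF_LAT))))
    return REF_LAT + dlat, REF_LON + dlon

print(local_to_wgs84(east_m=120.0, north_m=-45.0))
```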
In one embodiment, the method comprises further acts of
- capturing image data of at least a part of the industrial site using at least one camera, the image data comprising a plurality of image frames,
- inputting the image data to the data processing apparatus,
- performing data analysis on the inputted image data using one or more computer vision algorithms configured to determine a number of unique features in the image data.
Alternatively, one object of the present invention is achieved by a method of managing an industrial site, comprising the steps:
- capturing image data of at least a part of the industrial site using at least one camera, the image data comprising a plurality of image frames,
- inputting the image data to a data processing apparatus, the data processing apparatus comprising a computer unit,
- further inputting other data to the data processing apparatus, the other data being representative of conditions at or related to the industrial site,
- analysing one or more of the inputted data in the computer unit, wherein the data analysis is performed using at least one of:
- one or more computer vision algorithms configured to determine a number of operational states in the image data,
- one or more feature extraction algorithms, the feature extraction algorithms being configured to determine a number of unique features representative of a number of operational states in the other data,
- evaluating said operational states in the computer unit to determine at least one event,
- logging at least one timestamp of the at least one event,
- synchronising the at least one event with the other data inputted to the data processing apparatus to form a structured dataset, storing said structured dataset in a database.
The terms ‘other data’ or ‘additional data’ may be understood as sensor data and/or data received from a third party.
Data generally covers image data, other data, and sensor data.
This provides an improved method of managing an industrial process performed at an industrial site compared to conventional methods. The present method can also be used for analysing and optimizing the progress of the industrial process, or events thereof. The present method is versatile, can be adapted to different applications and to changes in the dynamic mapping at the site, and is able to detect events around the entire site. Conventional methods for high-volume, in-line production sites, i.e., working stations along a production line, are not suited for use at low-volume production or assembly sites without changes to the monitoring setup as well as significant modification to the implemented computer vision algorithms. Further, conventional monitoring systems are limited to evaluating the performance at fixed workstations at factories of the automobile or electronics industry. The mapping of the industrial site preferably has an absolute reference to a globally unique global positioning reference system.
Here, the term “industrial process” is defined as an overall process being performed at the industrial site, wherein the industrial process is composed of a plurality of operations (or actions) performed in parallel or in a sequential order. Further, a plurality of operational states (or tasks) is performed during each operation.
Further, the term “industrial site” should in this case be understood as a site for the production, assembly, installation, or handling of large structures, or components thereof, with a size and weight unsuited for high-volume, mass production. The present system and method are particularly suited for, but not limited to, use in the energy sector, such as wind energy, the shipping (freight transport) industry, or the construction industry.
The present method can use a set of cameras to monitor the entire industrial site, or subareas of particular interest. This allows a user of the present system to visually monitor the industrial site, or sub-areas thereof. The captured image data, e.g., videos or still pictures, is inputted to a data processing apparatus for data analysis and evaluation. This allows the user to access image data of the industrial site.
Further, additional data is collected and inputted to the data processing apparatus for data storage and optional data analysis and evaluation. The additional data is representative of conditions at or related to the industrial site and may be collected from local sources and/or external sources. This allows the user to access different types of data from multiple sources.
The inputted data, or a selected subset thereof, is analysed in the computer unit using computer vision algorithms, feature extraction algorithms, or a combination thereof. The computer vision algorithms are configured to determine a number of operational states based on the captured image data. The feature extraction algorithms are configured to determine a number of unique features representative of said operational states based on the inputted additional data. This allows the computer unit to automatically detect the operational states of an operation. This also allows the computer unit to automatically detect operational states of interest.
The outputs of these algorithms are then evaluated in the computer unit using classification algorithms, custom logics, or a combination thereof to determine one or more events. The events may be operations of the industrial process or other events of interest, wherein the parameters thereof may be set up by the user or administrator of the present system. A scaling or weight value may be applied to the outputs of these algorithms, thus making the outputs better suited for clustering and/or filtering during the evaluation. This allows the computer unit to automatically detect events based on the image data and/or the inputted additional data.
The additional data and image data are synchronized in the computer unit based on the determined events to form a structured dataset. At least one timestamp of the event may be used to synchronize the different types of data. The structured dataset is stored in a dedicated database, or links may be stored together with the respective data in the database. This allows the user to access the present system and view the structured dataset of a selected event.
In one embodiment, the computer vision algorithms comprise at least one of an image classification algorithm, an optical flow algorithm, an object detection algorithm, a semantic segmentation algorithm, an action classification algorithm, a temporal action localization algorithm, or a combination thereof.
The computer vision algorithm(s) may advantageously be implemented in the computer unit. For example, the computer vision algorithms may comprise an image classification algorithm, which may determine at least one state of the process based on the content of the image frames in the image data as a whole. For example, the computer vision algorithms may comprise an optical flow algorithm, which may quantify the motion between two consecutive image frames in the image data. For example, the computer vision algorithms may comprise an action classification algorithm, which may determine at least one type of operational state based on the performed action in the stream of image frames in the image data. For example, the computer vision algorithms may comprise a temporal action localization algorithm, which may determine at least one start, end, and type of operational state based on the stream of image frames in the image data. This allows the computer unit to identify and determine the operational states using feature-based event detection.
More than one computer vision algorithm may be implemented in the computer unit, wherein the computer vision algorithms may be combined to determine the operational states. This allows for a more accurate detection of multiple operations of the industrial process.
The computer vision algorithms may also comprise an object tracking algorithm, which may track the motion of one or more objects between consecutive image frames in the image data. A one-stage or multi-stage object detection algorithm, or a semantic segmentation model, may further be implemented in the computer unit. Optionally, the computer unit may use boundary boxes or segmentation masks (regions of interest) to localise and track objects of interest within the image frames in the image data.
The computer vision algorithms may also comprise an action detection algorithm, which may determine at least one type of operational state based on the performed object actions in the image frames in the image data. The action detection algorithm may compute or combine spatial, temporal, and semantic features from the image data. This allows the operational states to be identified and determined in accordance with one or more object classes and action classes stored in the computer unit. This also allows various object actions to be identified and recognised within the image frames in the image data.
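As an illustrative example of one of the listed computer vision building blocks, the sketch below computes dense optical flow between consecutive frames with OpenCV and uses its mean magnitude as a crude motion feature. The video path and motion threshold are assumptions; the description does not prescribe this particular algorithm or these parameters.

```python
# Illustrative use of dense optical flow (OpenCV, Farneback) as a motion feature;
# the video path and threshold are assumptions, not prescribed by the invention.
import cv2
import numpy as np

cap = cv2.VideoCapture("site_camera.mp4")   # hypothetical camera recording
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read the example video")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    motion = float(np.linalg.norm(flow, axis=2).mean())
    if motion > 1.0:                        # assumed "motion present" threshold
        print("motion detected, mean flow magnitude:", round(motion, 2))
    prev_gray = gray
cap.release()
```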
In one embodiment the industrial site changes during the industrial process.
In one embodiment, the method further comprises the step of determining a position such as a geographical position of at least one object, and optionally tracking said at least one object throughout the industrial site.
The data processing apparatus may be configured to determine the position of one or more objects within the industrial site. The data processing apparatus may further be configured to track the motion of these objects throughout the industrial site. A coordinate system of the industrial site may be stored in the data processing apparatus for determining the (global or local) position of objects within the coordinate system. The position of an object may be determined based on the image data using a geolocalization algorithm implemented in the computer unit. For example, the geographical position of one or more cameras in combination with computer vision-based triangulation may be used to determine the geographical positions of the objects. The computer unit may then use a tracking algorithm to predict and track the positions of the objects within the image data. An example of a geo-localization algorithm is disclosed in the article “Object Tracking and Geo-Localization from Street Images” by Wilson, Daniel et al. and the article “CAPE: Camera View Position Embedding for Multi-View 3D Object Detection” by Xiong, Kaixin et al. However, other geo-localization algorithms may also be used. This allows the user to visually track objects within the industrial site using the cameras.
Optionally, a positioning system may communicate with the data processing apparatus for determining the position of one or more objects. The positioning system may transmit position data directly to the data processing apparatus, or via local positioning units on the objects. This allows objects to be tracked even if no or limited image data is available.
A local positioning unit on the object may receive beacon signals from multiple nodes of the positioning system, wherein the positioning unit determines its position based on the received beacon signals. Alternatively, multiple local positioning units of the positioning system may be arranged relative to the industrial site and the individual objects may act as mobile nodes each transmitting a beacon signal to the respective local positioning units. Optionally, a signal may be transmitted from the local positioning unit which receives the reflected signal from the respective nodes. For example, the position of the object may be determined using lateration or multilateration, angulation, or other known techniques. This allows objects to be tracked even if no or limited reception of global positioning signals is available.
The present method allows users of the present system to determine and track the positions of objects during various operations of the overall industrial process. This also allows for better management of the flow of materials on the industrial site, further allowing for better distribution and allocation of equipment and machinery on the industrial site.
In one embodiment, at least radio communications between at least two radio communications devices and/or audio signals picked up by at least one microphone are transmitted to the data processing apparatus, wherein the computer unit extracts voice commands and/or unique audio signals from said radio communications or audio signals using the feature extraction algorithms.
The operational states may also be determined based on the radio communications between two or more radio communications devices and/or audio signals picked up at the industrial site. The raw radio communications may be picked up by a radio scanner unit or dedicated radio unit and transmitted to the data processing apparatus and stored in the database for data analysis. One or more microphones may pick up audio signals from one or more positions at the industrial site, wherein the audio signals may include voice communications between workers, various machine sounds, audio alarms, announcements over speakers, weather related sounds and the like. The raw audio signals may be stored in the database for data analysis.
The feature extraction algorithms may be one or more speech recognition algorithms, which may detect unique voice commands in the radio communications and/or audio signals. Examples of such algorithms are disclosed in the article “Some Commonly Used Speech Feature Extraction Algorithms” by Alim, Sabur A. et al. The unique voice commands may be representative of the start, end and/or type of one or more operational states. A spectral analysis and discrete transformation of the voice signals may be used to extract the unique voice commands. Classifications for voice commands or parameters thereof may be set up by the user or administrator of the present system and stored in the computer unit. The speech recognition algorithms may be trained to detect the voice commands in one or more languages. This allows unique voice commands spoken by workers during various operations to be stored in the computer unit. This allows the computer unit to automatically determine the operational states based on the radio communications or audio signals.
The feature extraction algorithms may also be one or more audio feature extraction algorithms, which may detect unique audio signals in the radio communications and/or audio signals. Examples of such algorithms are disclosed in the article “An Overview of Audio Event Detection Methods from Feature Extraction to Classification” by Babaee, Elham et al. The unique audio signals may be warnings, alarms, machine- or tool-related sounds, and/or environmental sounds. Noise reduction, filtering, and spectral analysis of the audio signals may be used to extract the unique audio signals. Classifications for unique audio signals or parameters thereof may be set up by the user or administrator of the present system and stored in the computer unit. The audio feature extraction algorithms may be trained to detect unique audio signals such as warning alarms, start or stop of engines or winches, operation of tools, or other characteristic operational sounds. This allows the computer unit to automatically detect unique audio signals associated with or affecting the performance of operations.
Optionally, a transcription (speech-to-text) algorithm may be implemented in the computer unit, wherein the transcription algorithm may perform an automatic transcription of the radio communications. The transcription may then be stored in the database for later analysis.
The radio communications may be conducted via encrypted communications links to prevent unauthorized access to the radio communications. An encrypted communications link may be established between the individual radio communications devices and/or between the radio communications devices and the data processing apparatus. An encrypted or un-encrypted link may also be established between the microphones and the data processing apparatus.
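For illustration only, the sketch below shows a very simple audio feature of the kind discussed above: checking whether a known alarm tone dominates a short microphone window via an FFT. The 1 kHz alarm frequency and 16 kHz sample rate are assumptions; real audio feature extraction would be considerably more elaborate.

```python
# Toy audio feature: does an assumed 1 kHz alarm tone dominate a short window?
import numpy as np

SAMPLE_RATE = 16_000   # Hz, assumed microphone sample rate
ALARM_FREQ = 1_000     # Hz, assumed characteristic alarm tone

def alarm_present(window, tolerance_hz=25.0):
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    return abs(freqs[np.argmax(spectrum)] - ALARM_FREQ) <= tolerance_hz

t = np.arange(SAMPLE_RATE) / SAMPLE_RATE          # one second of synthetic audio
window = 0.5 * np.sin(2 * np.pi * ALARM_FREQ * t) + 0.05 * np.random.randn(SAMPLE_RATE)
print(alarm_present(window))   # True for this synthetic example
```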
In one embodiment, telemetry data from one or more telemetry devices located within the industrial site is transmitted to the data processing apparatus, wherein the computer unit extracts at least one unique feature and/or process segment from the telemetry data using the feature extraction algorithms.
The telemetry data of one or more telemetry devices may be transmitted to the data processing apparatus and stored in the database for data analysis. The telemetry device may receive data from one or more sensors or data logging systems located on the object. For example, the telemetry data may be representative of the operating conditions or performance of machines or tools, health conditions of workers, environmental conditions, or the like. The telemetry data may also comprise accelerations from accelerometers, movements from gyroscopes, or compass headings from magnetometers. Optionally, the position data may form part of the telemetry data. For example, the telemetry data may be standard telemetry data. Here, the term “standard telemetry data” should be understood as telemetry data that is presented in accordance with an open or globally recognised standard, such as ISO 15143-3. This allows for easy access to third-party data provided by multiple OEMs.
The feature extraction algorithm may extract one or more operational states from the telemetry data. The operational states may be representative of conditions relating to an operation of the industrial process. Reference parameters or thresholds may be set up by the user or administrator of the present system and stored in the computer unit. The selected telemetry signal may be compared to the reference parameters or thresholds in the computer unit, which may be used to detect certain signal characteristics or to determine if the telemetry signal is within safe operating conditions. The reference parameters may also be used to detect if machines or tools are turned on, or if certain actions of the machines or tool have commenced. This allows the computer unit to automatically determine unique operational states associated with an operation based on the telemetry data.
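A minimal sketch of the threshold-based feature extraction described above might look as follows; the telemetry field name and the safe load limit are illustrative assumptions, not values from the invention.

```python
# Threshold-based feature extraction from telemetry; field name and limit are
# illustrative assumptions, not values from the invention.
SAFE_LOAD_LIMIT_T = 80.0   # tonnes, assumed reference threshold

telemetry_samples = [
    {"t": "2024-05-01T08:05:00Z", "winch_load_t": 12.4},
    {"t": "2024-05-01T08:06:00Z", "winch_load_t": 83.1},
]
features = [{"t": s["t"], "feature": "load_over_limit"}
            for s in telemetry_samples if s["winch_load_t"] > SAFE_LOAD_LIMIT_T]
print(features)
```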
In one embodiment, telemetry data from at least one crane unit and/or image data of the at least one crane unit is transmitted to the data processing apparatus, wherein the computer unit determines at least one unique feature and/or process segment of a crane operation using the feature extraction algorithms and/or the computer vision algorithms.
The present method allows the present system to detect events that involve the use of a crane unit, since crane operations are sensitive to environmental and load conditions and need to be monitored carefully to prevent any accidents.
Telemetry data of the crane unit may thus be transmitted to the data processing apparatus, wherein the telemetry data is analysed in the computer unit. The feature extraction algorithms implemented in the computer unit may be used to determine the operational states of a crane operation.
If the crane unit is within the field of view of one or more cameras, then the computer unit may also analyse the image data using the computer vision algorithms. The computer vision algorithms may thus be used to also determine the operational states of the crane operation.
In one embodiment, meteorological data from a meteorological unit or system is transmitted to the data processing apparatus, optionally the computer unit extracts at least one environmental feature from the meteorological data using the feature extraction algorithms.
The meteorological data from one or more meteorological units or systems may be transmitted to the data processing apparatus and stored in the database. The meteorological data may be representative of the environmental conditions relating to the industrial site. For example, the meteorological data may comprise wind speed, temperature, lightning, humidity, precipitation, visibility, cloud cover, air pressure, wave height, wavelength, wave dwell, wave direction, or other relevant weather, wind data, or wave data.
Optionally, the computer unit may extract one or more environmental features of the meteorological data using the feature extraction algorithms. The environmental feature may be related to an operational state of an operation. Reference parameters or thresholds may be set up by the user or administrator of the present system and stored in the computer unit. The selected meteorological signal may be compared to the reference parameters or thresholds in the computer unit, which may determine if the meteorological signals are within safe operating conditions. This allows the computer unit to automatically determine meteorological features influencing an operational state based on the meteorological data. This further allows users to evaluate the root-cause of abnormal situations, such as delays, failures, or accidents, in an easier and less data-heavy investigation. The meteorological data may be actual historical weather data or weather forecasts hereof, as planning decisions are often made based on the forecast. Actual historical weather data can be used to determine e.g., abort scenarios, in which actual weather conditions exceeded safe working conditions despite the weather forecast predicting safe weather conditions.
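The abort-scenario evaluation mentioned above can be illustrated with the hedged sketch below, which flags a case where the forecast stayed within an assumed safe wind limit but the measured wind did not. All limits and sample values are made up for illustration.

```python
# Abort-scenario check: forecast within an assumed safe wind limit, measured
# wind above it; limits and sample values are made up for illustration.
SAFE_WIND_LIMIT_MS = 10.0

forecast_ms = [8.0, 8.5, 9.0]     # forecast wind speeds over the lift window
measured_ms = [8.2, 10.8, 11.3]   # actual measured wind speeds

forecast_ok = all(v <= SAFE_WIND_LIMIT_MS for v in forecast_ms)
exceeded = [v for v in measured_ms if v > SAFE_WIND_LIMIT_MS]
if forecast_ok and exceeded:
    print("abort scenario: forecast was safe, measured wind reached",
          max(exceeded), "m/s")
```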
In one embodiment, the method further comprises logging additional data into the data processing apparatus using a user terminal, the logged data is representative of conditions related to a selected process segment and/or event.
The present method allows workers to access the present system and manually input additional data relating to one or more operational states of the industrial process. The additional data may be logged into the present system via a dedicated user interface. For example, the additional data may comprise image data (videos or still pictures), position data, text messages, measurements, or the like. The user interface may be set up so that the worker can select text messages from a list or menu, upload image data or position data, record measurements or voice messages, or manually enter text.
The computer unit may automatically link the additional data to a dedicated operational state or operation based on the logged data. Alternatively, the worker may select a dedicated operational state or operation when logging data.
The dedicated user interface may be implemented on a mobile or handheld device, a user terminal, or a computer device in communication with the data processing apparatus. Alternatively, the worker may access a web-based user interface on the data processing apparatus.
In one embodiment, the method further comprises at least one of:
- tracking the movement of a first object relative to at least a second object or a fixpoint,
- determining a first event and at least a second event, wherein both the first and second events involve the same object(s),
- determining the positions of a machine or tool and a selected object relative to a dedicated point of operation.
The path of objects moving between various locations at the industrial site may be tracked by the computer unit using the image data and/or the position data. Further, multiple objects of the same type or different types may be tracked relative to each other or relative to one or more fixpoints. Computer vision algorithms may be used to identify types of objects. Optionally, the distance between objects and/or between a selected object and a fixpoint may be calculated based on the image data and/or the position data. This allows the user of the present system to identify the current position of objects within the industrial site, and to determine if objects are spaced too far apart or too close relative to each other. This also allows the user of the present system to evaluate the path of movement of objects and optimize the flow within the industrial site.
Further, event classifications may be used to identify types of events involving a particular object, such as dedicated operations and operational states thereof. Optionally, object state classifications may be used to identify types of object states. The event classifications may also be used to identify types of events that involve larger, assembled object units. This allows the user of the present system to monitor various actions performed on the objects during one or more operations of the industrial process. The various actions may be loading or unloading of objects, preparation of objects before the next operation, lifting or lowering of objects into alignment, or assembly of objects into a larger object unit.
The computer unit may also determine the position of a machine or tool relative to a dedicated point of operation and/or to a selected object based on the image data and/or the position data. This allows the user of the present system to evaluate the placement of various machines or tools and objects relative to the dedicated point of operation. This reduces waiting time and allows for a smoother operation.
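As an illustration of the relative-position checks described above, the sketch below computes the planar distance between a tracked object and a fixpoint in site-local coordinates and compares it to an assumed minimum clearance; the coordinates and clearance value are hypothetical.

```python
# Planar distance between a tracked object and a fixpoint in site-local metres,
# compared to an assumed minimum clearance; coordinates are hypothetical.
import math

crane_hook = (120.0, -45.0)   # tracked object position (east, north)
tower_base = (118.5, -47.0)   # fixpoint within the site
MIN_CLEARANCE_M = 5.0         # assumed minimum safe spacing

d = math.hypot(crane_hook[0] - tower_base[0], crane_hook[1] - tower_base[1])
if d < MIN_CLEARANCE_M:
    print(f"objects only {d:.1f} m apart, below the assumed clearance")
```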
In one embodiment, the image data is anonymised using an anonymisation algorithm implemented in the data processing apparatus, wherein the anonymised image data is stored in a database.
The raw image data may be stored locally at the industrial site, e.g., in the local node or in the camera. The raw image data may be encrypted before being transmitted to the data processing apparatus. The encrypted image data may then be decrypted, and the raw image data may be anonymised using an anonymisation algorithm implemented in the computer unit. The anonymised image data may then be stored in another database, e.g., located at the industrial site or at a remote location. This allows the present method to comply with data protection law.
The data processing and data storage may thus be performed off-site, thereby reducing the amount of local data processing and reducing the installation costs of the present system. Alternatively, the data processing and data storage may be performed on-site, thus reducing the amount of data transmission. Users may then access the local or remote data processing apparatus via the user terminal using dedicated user profiles.
An initial data processing, e.g., the anonymisation of the image data, may be performed in the respective camera units before transmitting the image data to the data processing apparatus. Optionally, the computer vision algorithms and/or the feature extraction algorithms may further be implemented in a local data processing apparatus or in the camera units. This reduces the need for remote data processing as any less demanding data processing algorithms can be performed locally.
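The anonymisation step is left open by the description above; one common, illustrative choice is sketched below, detecting faces with an OpenCV Haar cascade and blurring them before the frame leaves the site. File names are placeholders, and this is not presented as the claimed anonymisation algorithm.

```python
# One common (not prescribed) anonymisation step: blur detected faces with an
# OpenCV Haar cascade before the frame leaves the site. File names are placeholders.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymise(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

frame = cv2.imread("raw_frame.jpg")               # hypothetical raw image frame
if frame is not None:
    cv2.imwrite("anonymised_frame.jpg", anonymise(frame))
```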
The computer unit may automatically log one or more timestamps for each event using the meta data of the image data. For example, the computer unit may determine a start timestamp, an end timestamp, and any other timestamps of the event. The time period for the event may be calculated as the difference between the start timestamp and the end timestamp. The respective timestamps may be stored in the database together with the structured dataset.
The timestamps may also be used to identify the image segment of that event in the stored image data. Alternatively, a copy of the image segment may be stored together with the structured dataset.
Alternatively, the computer unit may use the internal clock of the data processing apparatus to log the timestamps of the events. The internal clock may also be used to verify or synchronize the timestamps of one or more of signals inputted to the data processing apparatus. Optionally, the timestamps of one inputted signal may be used to verify the timestamps of another inputted signal. This allows for the correction of incorrect or missing timestamps in the inputted signals.
In one embodiment, the method further comprises steps of:
- updating a training dataset based on at least the events, the image data, or the inputted data, and
- retraining the one or more algorithms implemented in the computer unit using the updated training dataset.
A training dataset may be used to train the algorithms implemented in the computer unit using machine learning to automatically detect and identify various events and objects. The training dataset may be historic data from a previous industrial site or previous data from the industrial site. The training dataset may comprise the events, the image data, the inputted data, or any combination thereof. An administrator of the present system may evaluate the output of the computer unit after each training run and correct any errors and/or missed objects or events in the output. Once the evaluation is completed, a subsequent training run may be performed. The training may be repeated until an acceptable level of performance is achieved.
The training dataset may be updated upon request, at regular intervals or after each training run. New objects and/or events may be added to the classifications, and/or the parameters for existing objects or events may be adjusted. The algorithms implemented in the computer unit may thus be trained to detect objects and/or events unique for a particular industrial site, as the size and complexity of each industrial site may vary during the overall industrial process. The classifications may comprise standardized classes, customized classes, or a combination thereof, which may be updated during evaluation and training of the computer unit.
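The retraining loop described above could, purely as an illustration, look like the sketch below, using a generic scikit-learn classifier as a stand-in for whichever model the computer unit actually implements. The feature vectors, labels and model choice are assumptions.

```python
# Generic retraining sketch with a scikit-learn classifier standing in for the
# (unspecified) model in the computer unit; features and labels are assumptions.
from sklearn.ensemble import RandomForestClassifier

def retrain(existing_rows, new_rows):
    rows = existing_rows + new_rows                 # updated training dataset
    X = [r["features"] for r in rows]
    y = [r["event_label"] for r in rows]
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, y)
    return model

existing = [{"features": [0.1, 7.2], "event_label": "blade_lift"},
            {"features": [0.0, 3.1], "event_label": "idle"}]
new = [{"features": [0.2, 8.0], "event_label": "blade_lift"}]
model = retrain(existing, new)
print(model.predict([[0.15, 7.5]]))
```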
In one embodiment, the events of the industrial process are arranged in a chronological or predetermined order in the structured dataset.
The computer unit may arrange the events in accordance with a predetermined order descriptive of the industrial process. Alternatively, the events may be arranged in a chronological order. This allows users to identify if events are performed in an incorrect order, or if events are not started or waiting for the completion of other events. This also allows users to easily identify any missing events of a particular sequence of operations.
The data received by the data processing apparatus may be processed in batches or in real-time, and the structured dataset may be updated continuously based on the detected events. This allows users of the present system to view the ongoing events and the completed events, thus allowing for an easy overview of the progress of the industrial process.
In one embodiment, a statistical analysis is performed by the computer unit on the structured dataset using the first and second timestamps and/or the time period, wherein the results of the statistical analysis are stored in the database.
Optionally, a statistical analysis may be performed by the computer unit on the stored structured dataset using the timestamps and/or the time period of the events, where the results of the statistical analysis may also be stored in the database for later analysis. The results may include averaging times, time periods between selected events, and other relevant statistical data for each operation and the overall industrial process. This allows users of the system to monitor the time spent on each event as well as when a particular event was started, stopped, or ended. One or more parameters of the statistical data may be used as references for detecting abnormal events, and such abnormal events may be used for later analysis and optionally retraining of the data processing apparatus.
The structured dataset and/or the results of the statistical analysis may be presented to users of the present system as a report or in a dedicated graphic user interface.
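As a minimal sketch of the statistical analysis described above, the example below computes the average duration per event type from logged time periods and flags durations exceeding an assumed abnormality rule (here, 50% above the average); the event names, durations and rule are illustrative only.

```python
# Average duration per event type plus a simple abnormality rule (assumed: more
# than 50% above the average); event names and durations are illustrative.
from statistics import mean

durations_s = {
    "blade_lift": [2550, 2410, 2620, 5100],   # seconds, per detected event
    "tower_bolting": [6900, 7050, 6800],
}
for event_type, values in durations_s.items():
    avg = mean(values)
    abnormal = [v for v in values if v > 1.5 * avg]
    print(event_type, "average:", round(avg), "s", "abnormal:", abnormal)
```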
One object of the present invention is achieved by a system for determining at least one event of an industrial process at an industrial site, said system comprising:
- at least one sensor configured to collect sensor data representing conditions at or related to the industrial site,
- a data processing apparatus comprising a computer unit, wherein the sensor(s) is connected to the data processing apparatus via a communications link for inputting the collected data to the data processing apparatus,
- wherein the computer unit of the data processing apparatus is adapted to execute the steps of any one of the embodiments of the method described herein.
In one embodiment, the system comprises at least one camera configured to capture image data of at least a part of the industrial site, the image data comprising a plurality of image frames, and wherein the camera(s) is connected to the data processing apparatus via a communications link for inputting the image data to the data processing apparatus.
Alternatively, one object of the present invention is achieved by a system, comprising:
- at least one camera configured to capture image data of at least a part of the industrial site, the image data comprising a plurality of image frames,
- a data processing apparatus comprising a computer unit, wherein the cameras are connected to the data processing apparatus via a communications link for inputting the image data to the data processing apparatus,
- the data processing apparatus being configured to further receive other data from other sources via another communications link, the other data being representative of conditions at or related to the industrial site,
- wherein the computer unit of the data processing apparatus is adapted to execute the steps of the method described above.
This provides an improved system for managing an industrial process performed at an industrial site compared to conventional methods. The present system can also be used for analysing and optimizing the progress of the industrial process, or events thereof. The configuration of the present system is versatile and can be adapted to different applications and to dynamic changes in the mapping at the site. Conventional monitoring systems for stationary production sites are not suited for dynamic sites without changing the system setup and modifying the algorithms in the computer unit.
In one aspect, the cameras are arranged relative to the industrial site to capture image data of the industrial site, or sub-areas thereof, wherein the resolution and frame rate are selected based on the application and type of industrial site. Each camera is connected to the data processing apparatus via a first communications link, e.g., an encrypted communications link. The data processing apparatus comprises at least a computer unit and a database in communication with the computer unit, wherein a computer program is implemented on the computer unit and configured to execute the method steps described above.
One or more computer vision algorithms are implemented in the computer unit for determining a plurality of operational states of an operation based on the image data. The data processing apparatus uses classification algorithms and custom logics to determine an operation of the industrial process based on the operational states.
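The following Python sketch illustrates, by way of example only, how such custom logic could map a sequence of operational states to a single operation; the state labels and the lift operation are hypothetical and not prescribed by the description.

```python
# Illustrative state sequence that, taken together, constitutes one "component lift" operation.
LIFT_STATES = ["hook_attached", "load_lifted", "load_positioned", "load_released"]

def detect_operation(state_stream, required_states=LIFT_STATES):
    """state_stream: iterable of (timestamp, state_label) pairs produced by the
    computer vision / feature extraction stage.
    Returns (start_ts, end_ts) if all required states occur in order, else None."""
    remaining = list(required_states)
    start = None
    for ts, state in state_stream:
        if remaining and state == remaining[0]:
            if start is None:
                start = ts
            remaining.pop(0)
            if not remaining:
                return (start, ts)  # all states observed in order -> operation detected
    return None
```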
In one embodiment, the data processing apparatus is configured to communicate with a positioning system via a communications link, wherein the positioning system is configured to input position data to the data processing apparatus.
The data processing apparatus may be configured to determine the position of one or more objects within a coordinate system defining the industrial site. The data processing apparatus may further be configured to track the motion of these objects throughout the industrial site using the position data and/or the image data. The data processing apparatus may communicate with a server of a third-party provider, or an integrated positioning system of the present system.
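For illustration, a minimal tracking sketch is shown below, assuming position fixes already expressed in a metric coordinate system defining the industrial site; the function and field names are invented for the example.

```python
import math

def track_object(position_fixes, site_origin=(0.0, 0.0)):
    """position_fixes: list of (timestamp, x_m, y_m) for one object, assumed to be
    already transformed into a metric coordinate system defining the industrial site.
    Returns the trajectory relative to the site origin and the total distance travelled."""
    trajectory, travelled, prev = [], 0.0, None
    for ts, x, y in sorted(position_fixes):
        point = (x - site_origin[0], y - site_origin[1])
        if prev is not None:
            travelled += math.dist(prev, point)
        trajectory.append((ts, point))
        prev = point
    return trajectory, travelled
```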
The positioning system may be a Global Navigation Satellite System (such as GPS or GLONASS), a Local Positioning System (such as a radio, BLE, WIFI, UWB, or LIFI-based positioning system) or an Indoor Positioning System (such as HPPS).
In one embodiment, the computer unit is configured to combine the image data from multiple cameras into a multi-dimensional mapping of the industrial site.
A plurality of cameras may be positioned relative to the industrial site and angled so that they cover the monitored area from different angles. The inputted image data may be combined in the computer unit to generate a multi-dimensional mapping of the monitored area, e.g., a 2D or 3D map. This allows for better tracking of the objects, as the installation often requires the objects to be moved in multiple directions within the coordinate system before being mounted or assembled to another object or structure.
In one embodiment, the system further comprises:
- at least two radio communications devices configured to communicate with each other via a radio communications link, the radio communication between the radio communications devices being transmitted to the data processing apparatus, and/or
- at least one microphone connected to the data processing apparatus, wherein the at least one microphone is configured to pick up audio signals from at least one position at the industrial site.
The data processing apparatus may comprise a radio scanner unit configured to scan the radio frequencies used by a plurality of radio communications devices to conduct radio communications between workers. The radio scanner unit may be configured to tune to a particular radio channel being used and pick up the raw radio communication between the radio communications devices. Alternatively, a dedicated radio unit may be set up to access the radio channel being used. The radio communication between the radio communications devices may then be stored in the database of the data processing apparatus. An encryption key may be entered into or stored in the radio scanner unit to access the radio frequencies if encrypted radio channels are used.
Alternatively, or additionally, one or more microphones may pick up audio signals from one or more positions at the industrial site, wherein the audio signals may include voice communications between workers, various machine sounds, audio alarms, announcements over speakers, weather sounds, and the like. The audio signals picked up by the microphones may be stored in the database of the data processing apparatus.
A speech recognition algorithm may be implemented in the computer unit to extract unique voice commands from the radio communications and/or the audio signals. The computer unit may use these unique voice commands to further determine the operational states of an operation. Further, an audio feature extraction algorithm may be implemented in the computer unit to extract unique audio signals associated with the performance of an operation and/or other characteristic audio signals that indirectly or directly affect the performance of an operation. For example, classes comprising machine operating sounds may be used for determining machine operating conditions. For example, classes comprising weather related sounds may be used for determining environmental operating conditions. Optionally, a transcription (speech-to-text) algorithm may be implemented in the computer unit to generate a transcription of the radio communication. The transcription may be stored in the database for later analysis.
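A minimal sketch of extracting voice commands from transcribed radio communication or audio signals is shown below for illustration only; the command vocabulary and the upstream speech-to-text stage are assumptions of the example.

```python
import re

# Illustrative command vocabulary; the actual phrases would be site- and process-specific.
VOICE_COMMANDS = {
    "start_lift":   re.compile(r"\b(start|commence)\s+lift\b", re.IGNORECASE),
    "all_stop":     re.compile(r"\ball\s+stop\b", re.IGNORECASE),
    "load_secured": re.compile(r"\bload\s+(is\s+)?secured\b", re.IGNORECASE),
}

def extract_voice_commands(transcript_segments):
    """transcript_segments: list of (timestamp, text) pairs from a speech-to-text stage
    (not shown here). Returns timestamped command labels usable as operational states."""
    hits = []
    for ts, text in transcript_segments:
        for label, pattern in VOICE_COMMANDS.items():
            if pattern.search(text):
                hits.append((ts, label))
    return hits
```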
In one embodiment, the system further comprises a plurality of telemetry devices located within the industrial site and/or at least one meteorological unit or system, wherein the telemetry devices and/or the meteorological unit or system are configured to communicate with the data processing apparatus via one or more communications links.
The data processing apparatus may be configured to communicate with a plurality of telemetry devices located within the industrial site for receiving telemetry data. Alternatively, the telemetry data may be provided by a third-party provider. A feature extraction algorithm may be implemented in the computer unit to extract operational states from the telemetry data, such as crane-based operational states. The data processing apparatus may establish a wired or wireless communications link with the telemetry devices or a server of the third-party provider. These operational states may also be used to determine one or more operations of the industrial process.
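By way of example only, crane-based operational states could be derived from telemetry with a simple threshold rule, as sketched below; the signal names and the 1 t load threshold are illustrative assumptions, not measured values.

```python
def crane_states_from_telemetry(samples, load_threshold_t=1.0):
    """samples: list of (timestamp, hook_load_t, hoist_speed_mps).
    Returns coarse, timestamped crane operational states derived from the telemetry."""
    states = []
    for ts, load, speed in samples:
        if load < load_threshold_t:
            state = "unloaded"
        elif speed > 0:
            state = "hoisting"
        elif speed < 0:
            state = "lowering"
        else:
            state = "load_suspended"
        states.append((ts, state))
    return states
```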
Further, the data processing apparatus may be configured to communicate with a meteorological unit or system located relative to the industrial site for receiving meteorological data. A third-party provider may provide the meteorological data. The data processing apparatus may establish a wired or wireless communications link with the meteorological unit or system or a server of the third-party provider. The meteorological data may be stored in the database and combined with the image data and/or position data to form a structured dataset.
In one embodiment, the industrial site is an installation or shipping vessel, a shipping terminal, an installation or construction site, a heavy loading or unloading area, or a production or pre-assembly factory or area.
The present method is suited for monitoring industrial sites where multiple objects are moved around and different events involving these objects are performed, particularly sites where repetitive events are performed but the location for these events may change during the overall process.
The industrial site may be a dynamic site, such as an installation or shipping vessel, where objects must be loaded and stored in a predetermined pattern for optimal usage of the available storage space, and where the objects must be installed or unloaded in a particular order. The dynamic site may also be a shipping terminal (such as an airport, a seaport, or a standalone inland terminal), where objects must be loaded or unloaded and optionally stored in a predetermined pattern, and where the movement of objects must be coordinated with the loading/unloading process to reduce idle time. The dynamic site may also be a construction or installation site where objects are temporarily stored between usage or installed directly upon arrival, and where the mapping of the site changes during the construction process.
The dynamic site may also be a site (such as a temporary site) where various tasks or operations are performed within dedicated areas under controlled environments, and objects are moved between said areas during the industrial process. Examples include a heavy loading or unloading area for offshore or onshore transport, or a production or pre-assembly factory or area for large, heavy components. For example, the dynamic site may be a base port for the handling and loading of offshore components before installation. Optionally, the base port may comprise local assembly and/or production facilities so that the offshore components can be pre-assembled and/or produced onsite at the base port.
In one embodiment, the industrial site is:
- a dynamic site where the mapping of said dynamic site changes during the progress of the industrial process, or
- a factory where the industrial process is performed at one or more production lines.
The present system can thus be used at sites where objects are moved around and where there is a need for monitoring repeatable events, allowing the user of the present system to easily track and monitor events and objects.
One object of the present invention is achieved by a computer program, comprising instructions which, when loaded and run on the system described above, causes the data processing apparatus to execute the method steps described earlier.
The present method is implemented in the data processing apparatus, wherein the computer unit is configured to execute the steps of the present method. This allows the use and training of artificial intelligence, such as neural networks (e.g., FNN, CNN, RNN, attention networks, DNN or SNN) using machine learning to automatically determine the events of the industrial process.
One object of the present invention is achieved by a computer-readable medium, having stored thereon the computer program described above.
The present computer program is implemented on the computer-readable medium of a remote server (or server network) or a local computer unit, thus allowing the data processing and storage to be adapted to the desired site configuration and application. Alternatively, a hybrid version of the data processing apparatus may be used, where parts of the computer program may be implemented in the remote server and parts of the computer program may be implemented on the local computer unit. In the hybrid version, the data analysis, the data storage, the data evaluation, and the data access and interaction may be implemented on and executed by one or more remote servers or server networks. This allows the present data processing apparatus to be configured as a local-based data service, a remote- or cloud-based data service, or a hybrid data service thereof.
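For illustration only, a placeholder configuration distinguishing local, remote, and hybrid deployments could look like the following sketch; the key names, step labels, and endpoint are invented for the example and not prescribed by the description.

```python
# Placeholder deployment configuration; key names and the endpoint are illustrative only.
DEPLOYMENT = {
    "mode": "hybrid",  # "local", "remote" or "hybrid"
    "local":  {"steps": ["data_analysis"]},
    "remote": {"steps": ["data_evaluation", "data_storage", "data_access"],
               "endpoint": "https://example.invalid/api"},
}

def steps_at(location, config=DEPLOYMENT):
    """Return which parts of the computer program are executed at 'location'."""
    if config["mode"] == "hybrid":
        return config[location]["steps"]
    # In a purely local or purely remote deployment, everything runs in one place.
    all_steps = config["local"]["steps"] + config["remote"]["steps"]
    return all_steps if config["mode"] == location else []
```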
The computer-readable medium may be a non-volatile memory or non-volatile storage.
One object of the present invention is achieved by use of the method of determining at least one event of an industrial process at an industrial site according to any one of the embodiments of the method described herein, wherein the industrial site changes during the industrial process.
One object of the present invention is achieved by use of the method of determining at least one event of an industrial process at an industrial site according to any one of the embodiments of the method described herein for asset management.
One object of the present invention may be achieved by use of the method according to any one of the embodiments disclosed herein to track component installation, maintenance activities, or operational status in the construction and/or service of offshore and/or onshore wind turbines and/or solar farms.
One object of the present invention may be achieved by use of the method according to any one of the embodiments disclosed herein to track the construction of large-scale highway bridges and/or tunnels, including but not limited to tunnels excavated through mountainous terrain or immersed tunnels constructed across ocean floors.
Description of the Drawing
The present invention is described by example only and with reference to the drawings, wherein:
Fig. 1 shows a block diagram of a first embodiment of the method according to the present invention,
Fig. 2 shows a first embodiment of the present system according to the present invention,
Fig. 3 shows a second embodiment of the present system,
Fig. 4 shows a third embodiment of the present system,
Fig. 5 shows a fourth embodiment of the present system,
Fig. 6 shows an example of synchronizing the image data and the other data,
Fig. 7 shows an example of the structured dataset stored in the database,
Fig. 8 shows an exemplary application of the present system for an offshore site,
Fig. 9 shows an exemplary application of the present system for an onshore site, and
Fig. 10 shows another example of synchronizing the image data and the other data.
In the following text, the figures will be described one by one, and the distinct parts and positions seen in the figures will be numbered with the same numbers throughout the different figures. Not all parts and positions indicated in a specific figure will necessarily be discussed together with that figure.
Detailed Description of the Invention
Fig. 1 shows a block diagram of a first embodiment of the method according to the present invention. The method is adapted to manage an industrial site where the mapping of the industrial site may change during the industrial process.
A number of cameras are positioned relative to the industrial site, each of which captures image data 1 of the industrial site. The frame resolution, shutter speed and other settings of the cameras are optionally adapted to the industrial process performed at the industrial site. The image data is inputted to a data processing apparatus via a communications link, e.g., an encrypted communications link, wherein the image data is processed and analysed 2 as mentioned later.
Optionally, other types of data 3 are further inputted to the data processing apparatus together with the image data 1. The other data 3 and the image data 1 are then processed and analysed 2 in the computer unit of the data processing apparatus. The other data 3 are representative of conditions at or related to the industrial site, and some of the other data 3 may be provided by third-party providers.
Here, the other data 3 includes, but is not limited to, location data 3a, telemetry data 3b, acceleration and gyroscopic data 3c, meteorological data 3d, radio communications 3e and logged data 3f. Other types of other data may be inputted to the data processing apparatus depending on the application of the present system.
The location data 3a is inputted via a positioning system, such as a global or local positioning system, in communication with the data processing apparatus. The location data 3a is indicative of the locations of one or more objects within the industrial site.
The telemetry data 3b and/or the acceleration and gyroscopic data 3c may be inputted directly to the data processing apparatus from sensors or other devices arranged on objects located within the industrial site. The telemetry data 3b and/or the acceleration and gyroscopic data 3c may also be inputted from a third-party provider, which manages the operation of one or more objects, such as cranes, vessels, machines, or the like, within the industrial site. Optionally, the location data 3a, the telemetry data 3b, and/or the acceleration and gyroscopic data 3c may be supplied by the same third-party provider.
The meteorological data 3d is inputted via a local meteorological unit or a third-party provider to the data processing apparatus. The meteorological data 3d is representative of the environmental conditions relating to the industrial site. The meteorological data 3d includes, but is not limited to, historical weather data, and/or weather forecasts.
The radio communications 3e between radio communications devices located within the industrial site is further inputted, e.g., directly, to the data processing apparatus. The radio communications 3e is optionally encrypted to prevent unauthorised access to the radio communication.
Workers at the industrial site can manually log further data relating to the conditions at the industrial site or to operations performed at the industrial site.
The inputted image data 1 and the inputted other data 3 are analysed 2 by a computer unit in the data processing apparatus to determine a number of operational states within the image data and the other types of data 3. The operational states are preferably determined by the computer unit using computer vision algorithms, feature extraction algorithms, or a combination thereof. Optionally, the inputted data 1, 3 are filtered and pre-processed into signals or signal ranges suitable for the subsequent data analysis.
The operational states are then evaluated 4 by the computer unit to determine a number of events 4a using classification algorithms, custom logics, or a combination thereof. The events 4a determined by the computer unit are then synchronised with the image data and the other data to form a structured dataset 4b. The structured dataset 4b along with the events 4a determined by the computer unit are then stored in a database 5.
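A minimal sketch of this synchronisation step is given below for illustration only; the event and sample layouts and the one-second time tolerance are assumptions of the example.

```python
def synchronise(events, image_frames, other_data, tolerance_s=1.0):
    """Attach to each event the image frames and other data samples whose timestamps
    fall within the event window (the 1 s tolerance is an arbitrary assumption).
    events: [{'name', 'start', 'end'}]; image_frames / other_data: [(timestamp, payload)]."""
    def within(samples, start, end):
        return [s for s in samples
                if start - tolerance_s <= s[0] <= end + tolerance_s]

    # The result is a structured dataset: one record per event with its synchronized data.
    return [{**e,
             "image_data": within(image_frames, e["start"], e["end"]),
             "other_data": within(other_data, e["start"], e["end"])}
            for e in events]
```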
Users of the present system are then able to access the data processing apparatus and view the results 6 of the evaluation process. The structured dataset 4b together with the events 4a may be presented to the user as a report 6a or in a dedicated graphic user interface 6b. The user or administrator of the present system is further able to input data from third-party providers via a dedicated API interface 6c.
Fig. 2 shows a first embodiment of the system according to the present invention. Here, the data processing apparatus is configured as a hybrid data service, where parts of the data processing 2’ (ref. data analysis) are performed locally while other parts of the data processing 2” are performed remotely.
The image data 1 and other data 3 are captured or measured locally 7 using cameras, sensors, or other data recording devices 8a, 8b. The image data 1 and other data 3 are inputted to a local data processing unit where the inputted data 1, 3 are at least analysed 2’ by a local computer unit in the local data processing unit. The data outputted from the local data processing unit is transmitted to a remote data processing unit, wherein the outputted data is at least evaluated remotely 9 for determining the events 4a.
This remote data processing unit may further receive data 8c inputted from third-party providers. This data 8c is then analysed by the remote computer unit in the remote data processing unit to determine the operational states.
The data outputted from the local data processing unit together with the data 8c from third-party providers are then evaluated by the remote computer unit to determine the events 4a and generate the structured dataset 4b. The events 4a and structured dataset 4b are then stored in a remote database 5.
Users are then able to access the remote data processing apparatus by a dedicated API interface using their respective user profiles. The user can then view or download the events 4a and structured dataset 4b by transmitting data requests 10 to the remote data processing apparatus. The user may request corrections or changes to the structured dataset 4b via the API interface.
Fig. 3 shows a second embodiment of the present system. Here, the data processing apparatus is configured as a remote data service, where the data processing 2 is performed remotely.
The image data 1 and other data 3 are captured or measured locally 7’ using cameras, sensors, or other data recording devices 8a, 8b. The image data 1 and other data 3 are transmitted to the remote data processing unit, wherein the received data together with the data 8c inputted from third-party providers are analysed and evaluated remotely 9”.
The computer unit in the remote data processing unit determines the events 4a and generates the structured dataset 4b. The events 4a and structured dataset 4b are then stored in the remote database 5.
Fig. 4 shows a third embodiment of the system. Here, the data processing apparatus is configured as an alternative remote data service, where the data storage 5 is performed in a remote database 5’ separate from the data processing 2.
The computer unit in the remote data processing unit determines the events 4a and generates the structured dataset 4b. The events 4a and structured dataset 4b are then transmitted to the remote database 5’ for data storage. The remote database 5’ is in communication with the remote data processing apparatus for accessing the stored data.
Fig. 5 shows a fourth embodiment of the system. Here, the data processing apparatus is configured as a local data service, where the data processing 2 is performed locally.
The image data 1 and other data 3 are captured or measured locally 7” using cameras, sensors, or other data recording devices 8a, 8b. The image data 1 and other data 3 are inputted to the local data processing unit, wherein the inputted data together with the data 8c inputted from third-party providers are analysed and evaluated locally 7”.
The computer unit in the local data processing unit determines the events 4a and generates the structured dataset 4b. The events 4a and structured dataset 4b are then stored in a local database 5”.
Users are then able to access the local data processing apparatus remotely 9”’ via a dedicated API interface using their respective user profiles. The user can then view or download the events 4a and structured dataset 4b by transmitting a data request 10 to the data processing apparatus. The user may request corrections or changes to the structured dataset 4b via the API interface.
Although different versions of the system are illustrated in Figs. 2-5, other configurations of the system may be used.
Figs. 6 and 10 show two examples of synchronizing the image data 1 and the other data 3. Here, the image data 1 and the other data 3 are synchronised using a timestamp signal 11. The data associated with each event 4a in the structured dataset 4b are synchronised using this timestamp signal 11.
One or more timestamps 11a are logged and linked to each event 4a. This allows the user to view at least the start, the end, and the time period of each event 4a.
Fig. 7 shows an example of the structured dataset 4b stored in the database 5. Here, the dataset 4b is presented as a matrix where the relevant inputted data 1, 3 for each event 4a are listed together with the number of events 4a determined by the computer unit. The matrix may be changed or altered upon request from the users, wherein the administrator of the present system may control the configuration of the matrix.
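Purely for illustration, such a matrix could be represented as a dataframe as sketched below; the use of pandas, the column names, and the values are assumptions of the example, not part of the disclosure or of Fig. 7.

```python
import pandas as pd  # pandas is an assumed tooling choice, not prescribed by the description

# Illustrative rendering of the structured dataset 4b as a matrix (cf. Fig. 7);
# all column names and values below are made-up examples.
structured_dataset = pd.DataFrame([
    {"event": "blade_lift",      "start": "2025-02-21T08:05Z", "end": "2025-02-21T08:42Z",
     "cameras": [1, 3], "crane_telemetry": True,  "wind_speed_mps": 6.2},
    {"event": "bolt_tensioning", "start": "2025-02-21T09:10Z", "end": "2025-02-21T10:01Z",
     "cameras": [2],    "crane_telemetry": False, "wind_speed_mps": 5.8},
])
# Columns can be added or removed upon request, i.e. the matrix can be reconfigured.
```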
Fig. 8 shows an exemplary application of the present system for an offshore site. Here, the industrial site is an offshore site, such as an installation vessel 12 for erection of a wind turbine 13. The installation vessel 12 comprises a crane unit for lifting the various wind turbine components into position relative to each other.
Cameras 14 are positioned at predetermined locations on the offshore site, wherein the cameras 14 are used to monitor areas of interest. Other sensors (not shown) are further arranged at the offshore site for inputting other data 3 to the data processing apparatus 15.
Here, the operation of at least the crane unit can be monitored using the cameras 14 and the sensors (not shown), wherein image data and measured data of the crane operations are transmitted to the data processing apparatus 15 for data analysis and evaluation. Thereby allowing at least events involving the use of the crane unit to be determined by the computer unit in the data processing apparatus 15.
Fig. 9 shows an exemplary application of the present system for an onshore site. Here, the industrial site is an onshore site, such as an erection site of a wind turbine 13. The installation involves the use of a crane unit 16 for lifting the various wind turbine components into position relative to each other.
Cameras 14 are positioned at predetermined locations at the onshore site, wherein the cameras 14 are used to monitor areas of interest. Other sensors (not shown) are further arranged at the onshore site for inputting other data 3 to the data processing apparatus 15.
Here, the operation of at least the crane unit 16 can be monitored using the cameras 14 and the sensors (not shown), wherein image data and measured data of the crane operations are transmitted to the data processing apparatus 15 for data analysis and evaluation. Thereby allowing at least events involving the use of the crane unit 16 to be determined by the computer unit in the data processing apparatus 15.

Claims

1. A method of determining at least one event of an industrial process at an industrial site, comprising acts of:
- collecting sensor data representing conditions at/or related to the industrial site,
- inputting the collected data to a data processing apparatus comprising a computer unit,
- performing data analysis on one or more of the inputted data in the computer unit using one or more feature extraction algorithms configured to determine a number of unique features, wherein a combination of said unique features represents one or more process segments and/or events,
- evaluating said combination of unique features in the computer unit to determine at least one event,
- logging a timestamp and preferably a position of the at least one determined event, and
- synchronising the at least one event with the data inputted to the data processing apparatus to form a structured dataset, storing said structured dataset in a database, wherein said timestamp is synchronized to an external clock operating a globally unique time system.
2. The method according to claim 1, wherein the position is synchronised to a globally unique geospatial positioning reference system.
3. The method according to claim 1 or 2, further comprising the acts of:
- determining at least one process segment based on the determined unique features,
- evaluating said determined process segment in the computer unit to determine at least one event.
4. The method according to any one of the preceding claims, comprising the further acts of:
- capturing image data of at least a part of the industrial site using at least one camera, the image data comprising a plurality of image frames,
- inputting the image data to the data processing apparatus,
- performing data analysis on the inputted image data using one or more computer vision algorithms configured to determine a number of unique features in the image data.
5. The method according to claim 4, characterised in that the computer vision algorithms comprise at least one of an image classification algorithm, an optical flow algorithm, an object detection algorithm, an action classification algorithm, a temporal action localization algorithm, or a combination thereof.
6. The method according to any one of the preceding claims wherein the industrial site changes during the industrial process.
7. The method according to any one of the preceding claims, characterised in that the method further comprises the step of determining the position of at least one object, and optionally tracking said at least one object throughout the industrial site.
8. The method according to any one of the preceding claims, characterised in that at least radio communications between at least two radio communications devices and/or audio signals picked up by at least one microphone are transmitted to the data processing apparatus, wherein the computer unit extracts voice commands and/or unique audio signals from said radio communications or audio signals using the feature extraction algorithms.
9. The method according to any one of the preceding claims, characterised in that telemetry data from one or more telemetry devices located within the industrial site is transmitted to the data processing apparatus, wherein the computer unit extracts at least one unique feature from the telemetry data using the feature extraction algorithms.
10. The method according to claim 9, characterised in that telemetry data from at least one crane unit and/or image data of the at least one crane unit is transmitted to the data processing apparatus, wherein the computer unit determines at least one unique feature of a crane operation using the feature extraction algorithms and/or the computer vision algorithms.
11. The method according to any one of the preceding claims, characterised in that meteorological data from a meteorological unit or system is transmitted to the data processing apparatus, optionally the computer unit extracts at least one environmental feature from the meteorological data using the feature extraction algorithms.
12. The method according to any one of the preceding claims, characterised in that the method further comprises logging additional data into the data processing apparatus using a user terminal, the logged data is representative of conditions related to a selected process segment and/or event.
13. The method according to any one of the preceding claims, characterised in that the method further comprises at least one of:
- tracking the movement of a first object relative to at least a second object or a fixpoint,
- determining a first event and at least a second event, wherein both the first and second events involve the same object(s),
- determining the positions of a machine or tool and a selected object relative to a dedicated point of operation.
14. The method according to any one of the preceding claims, characterised in that the image data is anonymised using an anonymisation algorithm implemented in the data processing apparatus, wherein the anonymised image data is stored in a database.
15. The method according to any one of the preceding claims, characterised in that the method further comprises steps of:
- updating a training dataset based on at least the events, the image data, or the sensor data, and
- retraining one or more of the algorithms implemented in the computer unit using the updated training dataset.
16. The method according to any one of the preceding claims, characterised in that the events of the industrial process are arranged in a chronological or predetermined order in the structured dataset.
17. A system for determining at least one event of an industrial process at an industrial site, said system comprising:
- at least one sensor configured to collect sensor data representing conditions at/or related to the industrial site,
- a data processing apparatus comprising a computer unit, wherein the sensor(s) is connected to the data processing apparatus via a communications link for inputting the collected data to the data processing apparatus,
- wherein the computer unit of the data processing apparatus is adapted to execute the steps of the method of any of claims 1 to 16.
18. The system according to claim 17 comprising at least one camera configured to capture image data of at least a part of the industrial site, the image data comprising a plurality of image frames, and wherein the camera(s) is connected to the data processing apparatus via a communications link for inputting the image data to the data processing apparatus.
19. The system according to claim 17 or 18, characterised in that the data processing apparatus is configured to communicate with a positioning system via a communications link, wherein the positioning system is configured to input position data to the data processing apparatus.
20. The system according to any one of claims 17-19, characterised in that the computer unit is configured to combine the image data from multiple cameras into a multi-dimensional mapping of the industrial site.
21. The system according to any one of claims 17-20, characterised in that the system further comprises
- at least two radio communications devices configured to communicate with each other via a radio communications link, the radio communication between the radio communications devices being transmitted to the data processing apparatus, and/or
- at least one microphone connected to the data processing apparatus, wherein the at least one microphone is configured to pick up audio signals from at least one position at the industrial site.
22. The system according to any one of claims 17-21, characterised in that the system further comprises a plurality of telemetry devices located within the industrial site and/or at least one meteorological unit or system, wherein the telemetry devices and/or the meteorological unit or system are configured to communicate with the data processing apparatus via one or more communications links.
23. The system according to any one of claims 17-22, characterised in that the industrial site is an installation or shipping vessel, a shipping terminal, an installation or construction site, a heavy loading or unloading area, or a production or pre-assembly factory or area.
24. The system according to any one of claims 17-23, characterised in that the industrial site is:
- a dynamic site where the mapping of said dynamic site changes during the progress of the industrial process, or
- a factory where the industrial process is performed at one or more production lines.
25. A computer program comprising instructions which, when loaded and run on the system of any one of claims 17-24, causes the data processing apparatus to execute the method steps of any one of claims 1 to 16.
26. A computer-readable medium having stored thereon the computer program of claim 25.
27. Use of the method of determining at least one event of an industrial process at an industrial site according to any one of claims 1-16, wherein the industrial site changes during the industrial process.
28. Use of the method of determining at least one event of an industrial process at an industrial site according to any one of claims 1-16 for asset management.
PCT/DK2025/050028 2024-02-21 2025-02-21 A method of determining an event of an industrial process at an industrial site and a system thereof Pending WO2025176272A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DKPA202530631A DK202530631A1 (en) 2024-02-21 2025-10-13 A method of determining an event of an industrial process at an industrial site and a system thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPCT/DK2024/050032 2024-02-21
PCT/DK2024/050032 WO2025176269A1 (en) 2024-02-21 2024-02-21 A method of managing an industrial site and a system thereof

Publications (1)

Publication Number Publication Date
WO2025176272A1 true WO2025176272A1 (en) 2025-08-28

Family

ID=90105180

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/DK2024/050032 Pending WO2025176269A1 (en) 2024-02-21 2024-02-21 A method of managing an industrial site and a system thereof
PCT/DK2025/050028 Pending WO2025176272A1 (en) 2024-02-21 2025-02-21 A method of determining an event of an industrial process at an industrial site and a system thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/DK2024/050032 Pending WO2025176269A1 (en) 2024-02-21 2024-02-21 A method of managing an industrial site and a system thereof

Country Status (2)

Country Link
DK (1) DK202530631A1 (en)
WO (2) WO2025176269A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018191555A1 (en) 2017-04-14 2018-10-18 Drishti Technologies. Inc Deep learning system for real time analysis of manufacturing operations
CA3015879A1 (en) 2018-08-29 2020-02-29 Westernone Inc. Construction site monitoring system and method
CN111242574A (en) 2020-01-08 2020-06-05 中国建筑第二工程局有限公司西南分公司 Intelligent site inspection management system and method based on GPS technology
CN111429107A (en) 2020-04-08 2020-07-17 乌鲁木齐富迪信息技术有限公司 Intelligent construction site management system
US20210004591A1 (en) 2019-09-14 2021-01-07 Ron Zass Sequence of events monitoring in construction sites
US20210117684A1 (en) 2019-10-17 2021-04-22 Drishti Technologies, Inc. Cycle detection techniques
WO2021110226A1 (en) 2019-12-02 2021-06-10 Claviate Aps A method of monitoring a production area and a system thereof
US11175650B2 (en) 2017-11-03 2021-11-16 Drishti Technologies, Inc. Product knitting systems and methods
US11803955B1 (en) * 2020-04-30 2023-10-31 Everguard, Inc. Multimodal safety systems and methods

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018191555A1 (en) 2017-04-14 2018-10-18 Drishti Technologies. Inc Deep learning system for real time analysis of manufacturing operations
US11175650B2 (en) 2017-11-03 2021-11-16 Drishti Technologies, Inc. Product knitting systems and methods
CA3015879A1 (en) 2018-08-29 2020-02-29 Westernone Inc. Construction site monitoring system and method
US20210004591A1 (en) 2019-09-14 2021-01-07 Ron Zass Sequence of events monitoring in construction sites
US20210117684A1 (en) 2019-10-17 2021-04-22 Drishti Technologies, Inc. Cycle detection techniques
US11321944B2 (en) 2019-10-17 2022-05-03 Drishti Technologies, Inc. Cycle detection techniques
WO2021110226A1 (en) 2019-12-02 2021-06-10 Claviate Aps A method of monitoring a production area and a system thereof
CN111242574A (en) 2020-01-08 2020-06-05 中国建筑第二工程局有限公司西南分公司 Intelligent site inspection management system and method based on GPS technology
CN111429107A (en) 2020-04-08 2020-07-17 乌鲁木齐富迪信息技术有限公司 Intelligent construction site management system
US11803955B1 (en) * 2020-04-30 2023-10-31 Everguard, Inc. Multimodal safety systems and methods

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ALIM, SABUR A., SOME COMMONLY USED SPEECH FEATURE EXTRACTION ALGORITHMS
ANONYMOUS: "Coordinated Universal Time - Wikipedia", 20 February 2024 (2024-02-20), XP093257391, Retrieved from the Internet <URL:https://en.wikipedia.org/w/index.php?title=Coordinated_Universal_Time&oldid=1209221482> *
ANONYMOUS: "Feature extraction - Wikipedia", 13 December 2017 (2017-12-13), XP055568302, Retrieved from the Internet <URL:https://en.wikipedia.org/w/index.php?title=Feature_extraction&oldid=815172925> [retrieved on 20190313] *
BA-BAEE, ELHAM ET AL., AN OVERVIEW OF AUDIO EVENT DETECTION METHODS FROM FEATURE EXTRACTION TO CLASSIFICATION
WILSON, DANIEL, OBJECT TRACKING AND GEO-LOCALIZATION FROM STREET IMAGES
XIONG, KAIXIN, CAPE: CAMERA VIEW POSITION EMBEDDING FOR MULTI-VIEW 3D OBJECT DETECTION

Also Published As

Publication number Publication date
WO2025176269A1 (en) 2025-08-28
DK202530631A1 (en) 2025-10-29

Similar Documents

Publication Publication Date Title
US9845164B2 (en) System and method of monitoring an industrial plant
CN106249707B (en) Information collection system and information collection method
CN114463932B (en) Non-contact construction safety distance active dynamic identification early warning system and method
KR102674832B1 (en) Tracking of maintenance trajectory in elevator system
CN119773891A (en) Intelligent robot for power inspection and inspection method thereof
CN119204624A (en) A construction project progress management platform
CN112367397A (en) Monitoring and early warning method and system for field work, computer equipment and storage medium
CN102938712A (en) Digital operation field management and control system
US12147248B2 (en) Methods, systems, and devices for inspecting structures and objects
CN120607187B (en) A method, system, and medium for collaborative control of tower crane clusters in smart construction sites.
CN109440834B (en) Foundation pit monitoring system
CN120086774A (en) Composite positioning driven power grid on-site personnel motion trajectory tracking method and system
WO2025176272A1 (en) A method of determining an event of an industrial process at an industrial site and a system thereof
CN120449246A (en) A foundation pit construction monitoring method, system, product and medium
Yan et al. Real-Time Digital Twin–Driven 3D Near-Miss Detection System at Construction Sites
WO2025176271A1 (en) A method of determining contractual compliance of an industrial process and a system thereof
CN119048929A (en) Unmanned aerial vehicle distribution network supervision digital nuclear quantity and acceptance checking method
US20240189996A1 (en) Building a robot mission based on mission metrics
Sun et al. Synchronous Acquisition Control System and Coordinate Correction Algorithm for Subway Tunnel Comprehensive Detection Equipment
CN113407595A (en) Tunnel construction safety supervision system and method
CN120926984B (en) A 3D Attitude Detection Method and System for Tunnel Boring Machine Attitude Deviation Anomalies
WO2022079629A1 (en) Method and system for detecting defects in civil works
CN120316888B (en) Building health state detection system
CN115604317B (en) Prompting system and method for industrial daily inspection
CN111211947A (en) Internet of things online monitoring service processing system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25708342

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: PA202530631

Country of ref document: DK

WWP Wipo information: published in national office

Ref document number: PA202530631

Country of ref document: DK