WO2025085941A1 - A detection system - Google Patents
A detection system
- Publication number: WO2025085941A1 (application PCT/ZA2024/050056)
- Authority: WIPO (PCT)
- Prior art keywords: detection system, data, processing unit, machinery, information
- Legal status: Pending
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/09675—Systems involving transmission of highway information, e.g. weather, speed limits where a selection from the received information takes place in the vehicle
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the origin of the information is a central station
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the origin of the information is another vehicle
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
Definitions
- This invention relates to a detection system.
- More particularly, this invention relates to a detection system for detecting the proximity of, and the communication channels used by, surrounding vehicles, machinery and equipment.
- the position-related data may include signal transmission time.
- a signal receipt time may be recorded for allowing distance to the surrounding object to be calculated using a time of flight of the signal.
- the signal receipt time may be obtained by a clock source or time-stamping unit.
- the signal receipt time may be stored in a memory unit.
- the clock source and/or memory unit may be incorporated into the first receiving unit and/or the processing unit.
- a synchronising means may be provided for synchronising the clock source used to obtain the signal receipt time and a clock source of the communication systems associated with vehicles, equipment and/or machinery.
- the synchronising means may be GPS- or network-based. It is to be appreciated that the respective clock sources may be synchronised via a network protocol such as NTP (Network Time Protocol).
- the synchronising means may incorporate TWTT (Two-Way Time Transfer) for allowing a back-and-forth signal exchange for clock synchronisation.
- time stamps may be exchanged in order to determine average error which may be used to correct and/or account for any clock drifts or offsets.
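The two-way exchange and time-of-flight arithmetic described in the bullets above can be sketched as follows. This is a minimal illustration, assuming an RF position signal; the function names and timestamp values are hypothetical, not from the specification.

```python
# Sketch: estimate the remote clock's offset from a two-way timestamp
# exchange, then use it to correct a one-way time-of-flight distance.
# All timestamps and names are illustrative assumptions.

SPEED_OF_LIGHT = 299_792_458.0  # m/s, for an electromagnetic position signal

def clock_offset(t1, t2, t3, t4):
    """NTP-style offset of the remote clock relative to the local one.

    t1: request sent (local clock), t2: request received (remote clock),
    t3: reply sent (remote clock), t4: reply received (local clock).
    Averaging the two apparent offsets cancels a symmetric path delay.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

def tof_distance(tx_time_remote, rx_time_local, offset=0.0):
    """Distance from one-way time of flight, correcting the remote
    transmission timestamp into the local timebase first."""
    flight = rx_time_local - (tx_time_remote - offset)
    return flight * SPEED_OF_LIGHT

# Remote clock runs 5 µs ahead; each one-way path takes 1 µs (~300 m).
offset = clock_offset(0.0, 6.0e-6, 6.0e-6, 2.0e-6)        # -> 5e-6 s
distance = tof_distance(5.0e-6, 1.0e-6, offset)           # -> ~299.79 m
```

Without the offset correction the same exchange would report a negative flight time, which is why the synchronising step precedes any distance calculation.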
- the position-related data may include any one or more of the group including, but not limited to, geographic co-ordinates, GPS co-ordinates, distance to nearby equipment, machinery or operators, and historical position data.
- the position-related data may include uncertainty or error metrics.
- the coordinates may be in the form of Cartesian and/or polar coordinates which may be represented in two or three dimensions.
- the first receiving unit may be configured to receive additional data from communication systems associated with vehicles, equipment and/or machinery.
- the additional data relating to the vehicles, equipment and/or machinery may include identification information.
- the identification information may include any one or more of the group including, but not limited to, fleet number, identification number, ownership information, regulatory information, machine type, equipment name, serial number, department information, division information, job or task information, and asset tag number.
- the additional data relating to the vehicles, equipment and/or machinery may include operator information.
- the operator information may include any one or more of the group including, but not limited to, operator name or identifier, license number, certification information, and shift information.
- the additional data relating to the vehicles, equipment and/or machinery may include status or operational information.
- the status or operational information may include any one or more of the group including, but not limited to, operating mode, power status, fuel level, battery level, error codes, diagnostic information, maintenance schedule, operating temperature, operating pressures, operating environment information, emergency information, and velocity information.
- the operating environment information may include any one or more of the group including, but not limited to, temperature, humidity, noise levels, vibration levels, pressure readings, altitude, particulate levels, air quality, gas levels, surface stability, surface slopes, tilt or inclination level, radiation levels, and corrosive or chemical material levels.
- the first receiving unit may be configured to receive the position-related data from a proximity sensor.
- the first receiving unit may be arranged in communication with the proximity sensor.
- the proximity sensor may be incorporated into the first receiving unit or integrated into a single unit.
- the proximity sensor may be in the form of a signal transceiver.
- the signal transceiver may be configured to transmit and receive a signal which may be in the form of a sound wave and/or an electromagnetic signal.
- the sound wave may be associated with SONAR (Sound Navigation and Ranging), which may utilise a frequency in the range of 1 kHz to 5 MHz.
- the electromagnetic signal may have any frequency associated with the electromagnetic spectrum.
- the frequency may be associated with any of the group including radio, infrared, visible light, and ultraviolet.
- the frequency may be associated with LiDAR (Light Detection and Ranging), which may utilise near-infrared wavelengths in the range of 700 nm to 1550 nm, or eye-safe infrared wavelengths of about 1550 nm.
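For either ranging mode, the transceiver's round-trip echo time converts to distance by the same halved-path formula; only the propagation speed differs. The sketch below is illustrative, assuming sound in air at roughly 20 °C; the names are not from the specification.

```python
# Round-trip echo time to distance, for the SONAR and LiDAR cases above.
# Propagation speeds are assumed example values.

SPEED_OF_SOUND_AIR = 343.0      # m/s in air at ~20 degrees C
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def echo_distance(round_trip_seconds, propagation_speed):
    # The signal travels out to the object and back, so halve the path.
    return propagation_speed * round_trip_seconds / 2.0

sonar_m = echo_distance(0.10, SPEED_OF_SOUND_AIR)  # 100 ms echo -> 17.15 m
lidar_m = echo_distance(667e-9, SPEED_OF_LIGHT)    # 667 ns echo -> ~100 m
```

The six orders of magnitude between the two echo times show why SONAR suits short-range proximity sensing while LiDAR needs nanosecond-resolution timing.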
- the position-related data received from the proximity sensor may relate to surrounding objects which may be in the form of any one or more of the group including, but not limited to, equipment, vehicles, machinery, infrastructure, geological features, safety features, supplies, and materials.
- the machinery may include any one or more of the group including, but not limited to, excavators, bulldozers, dump trucks, loaders, drilling rigs, haulage vehicles, crushers, screens, conveyors, shovels, rock drills, dredges, ventilation systems, processing units and/or plants, support machinery, exploration tools, pumps, winches, and safety equipment.
- the infrastructure may include any one or more of the group including, but not limited to, shafts, tunnels, haulage systems, power supplies, water management systems, safety facilities, ore handling and processing facilities, personnel facilities, environmental controls, safety barriers, and communication systems.
- the first receiving unit may be configured to receive the position-related data from communication systems associated with surrounding objects which may be in the form of vehicles, equipment and/or machinery and from a proximity sensor.
- the second receiving unit may be configured to receive the communication data from communication signals transmitted by communication systems of vehicles, equipment, and/or machinery being operated by the remote operator, preferably being configured to receive the communication data from transmissions from a plurality of remote operators.
- the second receiving unit may include an antenna or aerial for receiving or intercepting the communication signals.
- the second receiving unit may be configured to receive the additional data from communication systems associated with vehicles, equipment and/or machinery.
- the first and second receiving units may be separate units. Alternatively, the first and second receiving units may be integrated into a single receiving unit.
- a signal tuner may be provided for allowing the second receiving unit to tune into a desired frequency.
- the signal tuner may form part of the second receiving unit and/or the processing unit.
- a filter may be provided for filtering frequencies received from the remote operator.
- the filter may be in the form of an RF (Radio Frequency) filter. It is to be appreciated that the filter facilitates frequency or channel isolation by blocking unwanted frequencies.
- the filter may form part of the second receiving unit and/or the processing unit.
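A simple digital stand-in for the filter's channel-isolation role can be sketched with an FFT brick-wall band-pass: spectral bins outside the operator's channel are zeroed and the signal is reconstructed. The pass-band values and signal frequencies below are assumptions for the example, not from the specification.

```python
# Illustrative band-pass: keep only the operator's channel, block the rest.
import numpy as np

def bandpass(samples, sample_rate, low_hz, high_hz):
    """Zero every spectral bin outside [low_hz, high_hz], then invert."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=len(samples))

fs = 100_000
t = np.arange(2000) / fs
wanted = np.sin(2 * np.pi * 10_000 * t)      # operator channel at 10 kHz
interferer = np.sin(2 * np.pi * 30_000 * t)  # unwanted channel at 30 kHz
clean = bandpass(wanted + interferer, fs, 8_000, 12_000)  # ~= wanted
```

A physical RF front-end filter does the same isolation in analogue hardware before digitisation; the digital version shown here would sit in the processing unit.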
- An oscilloscope may be provided for analysing waveform characteristics of the communication signals received from the remote operator.
- the oscilloscope may form part of the second receiving unit and/or the processing unit.
- a frequency analyser may be provided for analysing frequencies being received from the remote operator.
- the analyser may be in the form of a spectrum analyser.
- the frequency analyser may form part of the first receiving unit, second receiving unit and/or the processing unit.
- the frequency analyser may form part of a separate unit which may be arranged in communication with the first receiving unit, second receiving unit and/or the processing unit.
- a frequency counter may be provided for measuring frequency of received communication signals for facilitating identification of precise frequencies being transmitted by a particular operator.
- the frequency counter may form part of the first receiving unit, second receiving unit and/or the processing unit.
- the frequency counter may form part of a separate unit which may be arranged in communication with the first receiving unit, second receiving unit and/or the processing unit.
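The frequency counter's job, identifying the precise frequency a particular operator is transmitting on, can be sketched as an FFT peak search over a block of samples. The sample rate, tone frequency, and function name below are illustrative assumptions.

```python
# Sketch of frequency identification: locate the strongest spectral
# component of a sampled communication signal.
import numpy as np

def dominant_frequency(samples, sample_rate):
    """Return the frequency (Hz) of the strongest spectral bin."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[int(np.argmax(spectrum))]

# Synthetic operator transmission: a 462.5 kHz tone sampled at 2 MHz.
fs = 2_000_000
t = np.arange(4096) / fs
tone = np.sin(2 * np.pi * 462_500 * t)
estimate = dominant_frequency(tone, fs)  # within one bin (~488 Hz) of 462.5 kHz
```

The resolution is limited to sample_rate / block_length per bin, which is why a hardware frequency counter, or a longer capture, is used when an exact channel assignment is needed.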
- a software-based tuner may be employed for tuning into various frequencies and/or decoding various signal types.
- the software-based tuner may be based on SDR (Software-Defined Radio).
- the software-based tuner may form part of the first receiving unit, second receiving unit and/or the processing unit.
- the software-based tuner may form part of the processing unit.
- a communication protocol analyser may be provided for facilitating identification of data being transmitted by the remote operators.
- the communication protocol analyser may form part of the first receiving unit, second receiving unit and/or the processing unit.
- the communication protocol analyser may form part of the processing unit.
- An encryption decoder may be provided for decoding signals received by the first receiving unit and/or the second receiving unit.
- the encryption decoder may form part of the first receiving unit, second receiving unit and/or the processing unit.
- the encryption decoder may form part of the processing unit.
- a condition sensor may be provided for sensing environmental conditions in the vicinity of the user which may influence signal and/or data analysis.
- the environmental conditions may include any one or more of the group including, but not limited to, temperature, humidity, and pressure.
- the environmental conditions may further include a measure of a quantity of particles and/or impurities contained in the air surrounding the user, such as smoke, fog, dust, and rain.
- the processing unit may form part of any computing device of the group including, but not limited to, desktop computers, laptops, tablets, smartphones, servers, workstations, embedded systems, mainframes, supercomputers, wearables, smart home devices, gaming consoles, thin clients, and edge devices.
- the processing unit may form part of an embedded computer system which may be utilised in mobile machinery such as vehicles or equipment mounted on mobile platforms.
- the processing unit may form part of a smart device or tablet, which may be utilised by a worker.
- the processing unit may be configured to produce the image data in real time. It is to be appreciated that producing the image data and showing an image to the user in real time allows effective monitoring of positions of objects and communication signals used by remote operators.
- the processing unit may be configured to analyse the received position-related data, the communication data, and/or the additional data to determine which actions are to be performed to generate the image data. It is to be appreciated that the additional data relates to the vehicles, equipment and/or machinery being operated by remote operators.
- the processing unit may be configured to filter the received data to remove any irrelevant or corrupt information.
- the processing unit may be configured to parse the received data to identify relevant portions.
- the processing unit may be configured to transform the received data into a format which may be interpreted by the display, the format preferably being structured to allow the data to be represented by pixels. For example, the processing unit may convert raw co-ordinate data into X-Y positions on a graphical map or turn signal strength into visual bars or indicators.
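The coordinate-to-pixel conversion mentioned in the example above can be sketched as a linear mapping from a fixed geographic window onto the display grid. The map bounds, screen size, and function name are illustrative assumptions.

```python
# Sketch: map (lat, lon) inside a fixed window onto X-Y pixel positions,
# with y increasing downward as displays expect.

def to_pixel(lat, lon, bounds, width, height):
    """bounds = (lat_min, lat_max, lon_min, lon_max)."""
    lat_min, lat_max, lon_min, lon_max = bounds
    x = (lon - lon_min) / (lon_max - lon_min) * (width - 1)
    y = (lat_max - lat) / (lat_max - lat_min) * (height - 1)
    return round(x), round(y)

bounds = (-26.0, -25.0, 27.0, 28.0)             # a 1 deg x 1 deg map window
x, y = to_pixel(-25.5, 27.5, bounds, 800, 600)  # window centre -> mid-screen
```

The latitude axis is inverted in the y calculation because pixel row 0 is at the top of the screen while higher latitudes are further north.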
- the processing unit may be configured to perform various calculations on the received data for calculating desired system outputs to be displayed to the user. For example, calculations may include determining distances to objects, vehicles, machinery and equipment or applying algorithms to overlay map information or to represent communication frequencies to which surrounding communication systems are set.
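One of the distance calculations mentioned above can be sketched as a haversine great-circle distance between two GPS coordinates. The coordinates and Earth radius are illustrative assumptions; other distance measures (e.g. straight-line from time of flight) would slot in the same way.

```python
# Haversine distance between two (lat, lon) points, in metres.
import math

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))

# Two vehicles 0.001 degrees apart in both axes at the equator: ~157 m.
d = haversine_m(0.0, 0.0, 0.001, 0.001)
```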
- the processing unit may be configured to combine processed data into a coherent set of information which may be displayed to the user. For example, map data, operator locations, communication channel data, and additional data, are merged into a single data set, representing spatial relationships and current communication data.
- the processing unit may be configured to plot the processed data on a map or graphical layout which may display the locations of the user, surrounding operators, vehicles, equipment and machinery, distances thereto, and the communication data which may be displayed as icons or text.
- the processing unit may be configured to transform and/or render the processed data into the image data to be displayed by the display.
- the image data may be in the form of pixels, vectors or layers. It is to be appreciated that this step may involve assigning visual properties to the data, such as drawing lines, icons, text labels, colour indicators, and the like.
- the processing unit may be configured to transmit the processed and rendered image data to the display in real time.
- a display controller may be provided for managing the transfer of the image data to the display.
- the display controller may be configured to adjust resolution of the image data to match the display’s resolution.
- the display controller may be integrated into the processing unit. Alternatively, the display controller may form part of a separate unit.
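The display controller's resolution-matching step can be sketched as nearest-neighbour scaling of a pixel grid to the display size. Plain Python lists stand in for real frame buffers here; the function name and values are illustrative.

```python
# Nearest-neighbour scaling of a 2-D pixel grid to a target resolution.

def scale_nearest(pixels, out_w, out_h):
    """Scale a 2-D list of pixel values to out_w x out_h."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

image = [[0, 1],
         [2, 3]]
scaled = scale_nearest(image, 4, 4)  # each source pixel becomes a 2x2 block
```

Real display controllers typically do this in hardware, often with smoother interpolation, but the index arithmetic is the same.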
- the image data may be in the form of pixel information which may include colour, brightness, and position.
- the image data may be in the form of any one or more of the group including, but not limited to, an image, 3D object, and text.
- the image data may correspond to a visual representation of any one or more of the group including the position-related data of surrounding objects, vehicles, equipment and machinery, the communication data transmitted by the remote operators of vehicles, equipment and machinery, and additional data relating to the vehicles, equipment and/or machinery being operated by remote operators.
- the image data may include any other system variables which may be dependent on user requirements or preferences.
- the visual representation may include any one or more of the group including, but not limited to, a map, graphical overlay, colour coding, real-time updates, warnings or alerts, user interface elements, overlay on live video feed, and data layers.
- the colour coding may correspond to a type of object, vehicle, or equipment.
- the type of object may be categorised according to any one or more of the group including, but not limited to, size, movement, velocity, speed, acceleration, relative position, fleet number, communication channels utilised by the object, machinery type, equipment type, and vehicle type.
- the user may be in the form of a worker or an operator of vehicles, machinery or equipment.
- Figure 1 is a schematic diagram showing a general overview of the detection system.
- reference numeral 10 refers generally to a detection system in accordance with the present invention.
- the detection system 10 includes a first receiving unit 12 for receiving position-related data 14 of surrounding objects 13, a second receiving unit 16 for receiving communication data 18 transmitted by a remote operator 20, a processing unit 22 arranged in communication with the first and second receiving units 12 and 16 for analysing and manipulating the position-related data 14 and the communication data 18 to generate image data 24 relating thereto, and a display 26 arranged in communication with the processing unit 22 for displaying the image data 24 to a user 28.
- the first receiving unit 12 is configured to receive the position-related data 14 from communication systems 30 associated with surrounding objects in the form of vehicles, equipment and machinery 15. It is to be appreciated that the communication systems 30 associated with vehicles, equipment and machinery 15 are configured to transmit signals containing the position-related data 14.
- the position-related data 14 includes signal transmission time.
- a signal receipt time is recorded for allowing distance to the surrounding object 13 to be calculated using a time of flight of the signals.
- the signal receipt time is obtained by a clock source or time-stamping unit (not shown).
- the signal receipt time is stored in a memory unit (not shown).
- the clock source and memory unit (not shown) are incorporated into either or both of the first receiving unit 12 and the processing unit 22.
- a synchronising means (not shown) is provided for synchronising the clock source (not shown) used to obtain the signal receipt time and a clock source (not shown) of the communication systems 30 associated with vehicles, equipment and machinery 15.
- the synchronising means (not shown) is GPS- or network-based.
- the first receiving unit 12 is configured to receive the position-related data 14 from a proximity sensor 34.
- the first receiving unit 12 can be arranged in communication with the proximity sensor 34.
- the proximity sensor 34 can be incorporated into the first receiving unit 12 or integrated into a single unit.
- the proximity sensor 34 is in the form of a signal transceiver (not shown).
- the signal transceiver (not shown) is configured to transmit and receive a signal which is in the form of a sound wave and/or an electromagnetic signal.
- the sound wave is associated with SONAR (Sound Navigation and Ranging), which utilises a frequency in the range of 1 kHz to 5MHz.
- the electromagnetic signal has any frequency which is associated with any of the group including radio, infrared, visible light, and ultraviolet.
- the machinery includes any one or more of the group including, but not limited to, excavators, bulldozers, dump trucks, loaders, drilling rigs, haulage vehicles, crushers, screens, conveyors, shovels, rock drills, dredges, ventilation systems, processing units and/or plants, support machinery, exploration tools, pumps, winches, and safety equipment.
- the infrastructure includes any one or more of the group including, but not limited to, shafts, tunnels, haulage systems, power supplies, water management systems, safety facilities, ore handling and processing facilities, personnel facilities, environmental controls, safety barriers, and communication systems.
- the first receiving unit 12 can be configured to receive the position-related data 14 from communication systems 30 associated with surrounding objects in the form of vehicles, equipment and machinery 15 and from a proximity sensor 34.
- the second receiving unit 16 is configured to receive the communication data 18 from communication signals transmitted by communication systems 30 of vehicles, equipment, or machinery 15 being operated by the remote operator 20, typically being configured to receive the communication data 18 from transmissions from a plurality of remote operators 20.
- the second receiving unit 16 includes an antenna or aerial (not shown) for receiving or intercepting the communication signals (not shown).
- the first and second receiving units 12 and 16 can be separate units. Alternatively, the first and second receiving units 12 and 16 can be integrated into a single receiving unit (not shown).
- a signal tuner (not shown) is provided for allowing the second receiving unit 16 to tune into a desired frequency.
- the signal tuner (not shown) can form part of the second receiving unit 16 or the processing unit 22.
- a filter (not shown) is provided for filtering frequencies received from the remote operator 20.
- the filter (not shown) is in the form of an RF (Radio Frequency) filter. It is to be appreciated that the filter (not shown) facilitates frequency or channel isolation by blocking unwanted frequencies.
- the filter (not shown) can form part of the second receiving unit 16 or the processing unit 22.
- An oscilloscope (not shown) is provided for analysing waveform characteristics of the communication signals (not shown) received from the remote operator 20.
- the oscilloscope (not shown) can form part of the second receiving unit 16 or the processing unit 22.
- a frequency analyser (not shown) is provided for analysing frequencies being received from the remote operator 20.
- the analyser (not shown) is in the form of a spectrum analyser.
- the frequency analyser (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22.
- the frequency analyser (not shown) can form part of a separate unit (not shown) which is arranged in communication with the first receiving unit 12, second receiving unit 16 and the processing unit 22.
- a frequency counter (not shown) is provided for measuring frequency of received communication signals (not shown) for facilitating identification of precise frequencies being transmitted by a particular operator (not shown).
- the frequency counter can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22.
- the frequency counter can form part of a separate unit (not shown) which is arranged in communication with the first receiving unit 12, second receiving unit 16 and the processing unit 22.
- a software-based tuner (not shown) is employed for tuning into various frequencies and/or decoding various signal types.
- the software-based tuner (not shown) is based on SDR (Software-Defined Radio).
- the software-based tuner (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. In a preferred embodiment, the software-based tuner (not shown) forms part of the processing unit 22.
- a communication protocol analyser (not shown) is provided for facilitating identification of data being transmitted by the remote operators 20.
- the communication protocol analyser (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22.
- the communication protocol analyser (not shown) forms part of the processing unit 22.
- An encryption decoder (not shown) is provided for decoding signals received by the first receiving unit 12 or the second receiving unit 16.
- the encryption decoder can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22.
- the encryption decoder (not shown) forms part of the processing unit 22.
- a condition sensor (not shown) is provided for sensing environmental conditions in the vicinity of the user 28 which may influence signal and/or data analysis.
- the environmental conditions include any one or more of the group including, but not limited to, temperature, humidity, and pressure.
- the environmental conditions further include a measure of a quantity of particles and/or impurities contained in the air surrounding the user, such as smoke, fog, dust, and rain.
- the processing unit 22 can form part of any computing device of the group including, but not limited to, desktop computers, laptops, tablets, smartphones, servers, workstations, embedded systems, mainframes, supercomputers, wearables, smart home devices, gaming consoles, thin clients, and edge devices.
- the processing unit 22 can form part of an embedded computer system (not shown) which is utilised in mobile machinery such as vehicles or equipment mounted on mobile platforms.
- the processing unit 22 can form part of a smart device or tablet, which is utilised by a worker.
- the processing unit 22 is configured to produce the image data 24 in real time. It is to be appreciated that producing the image data 24 and showing an image to the user 28 in real time allows effective monitoring of positions of objects and communication signals used by remote operators 20.
- the processing unit 22 is configured to analyse the received position-related data 14, the communication data 18, and the additional data 32 to determine which actions are to be performed to generate the image data 24. It is to be appreciated that the additional data 32 relates to the vehicles, equipment and machinery 15 being operated by remote operators 20.
- the processing unit 22 is configured to filter the received data to remove any irrelevant or corrupt information.
- the processing unit 22 is configured to parse the received data to identify relevant portions.
- the processing unit 22 is configured to transform the received data into a format which can be interpreted by the display 26, the format typically being structured to allow the data to be represented by pixels. For example, the processing unit 22 may convert raw co-ordinate data into X-Y positions on a graphical map or turn signal strength into visual bars or indicators.
- the processing unit 22 is configured to perform various calculations on the received data for calculating desired system outputs to be displayed to the user 28. For example, calculations include determining distances to objects, vehicles, machinery and equipment and applying algorithms to overlay map information or to represent communication frequencies to which surrounding communication systems 30 are set.
- the processing unit 22 is configured to combine processed data into a coherent set of information which is displayed to the user 28. For example, map data, operator locations, communication channel data, and additional data, are merged into a single data set, representing the spatial relationships and current communication data.
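The combining step described above can be sketched as a merge of position records and intercepted channel records into one display-ready record per object, keyed on a shared identifier. All field names and values below are illustrative assumptions.

```python
# Merge position data and communication-channel data per object ID.

positions = [
    {"id": "DT-041", "lat": -25.50, "lon": 27.50, "distance_m": 120.0},
    {"id": "EX-007", "lat": -25.51, "lon": 27.49, "distance_m": 45.0},
]
channels = [
    {"id": "DT-041", "frequency_hz": 462_500_000, "channel": "CH-3"},
]

def merge_records(positions, channels):
    """One record per positioned object, with channel data where known."""
    by_id = {c["id"]: c for c in channels}
    merged = []
    for p in positions:
        record = dict(p)
        record.update(by_id.get(p["id"], {"frequency_hz": None, "channel": None}))
        merged.append(record)
    return merged

combined = merge_records(positions, channels)
```

Objects with no intercepted transmission still appear in the merged set, so the display can show their position while marking the channel as unknown.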
- the processing unit 22 is configured to plot the processed data on a map or graphical layout (not shown) which can display the locations of the user, surrounding operators, vehicles, equipment and machinery, distances thereto, and the communication data which can be displayed as icons or text.
- the processing unit 22 is configured to transform or render the processed data into the image data 24 to be displayed by the display 26.
- the image data 24 can be in the form of pixels, vectors or layers. It is to be appreciated that this step may involve assigning visual properties to the data, such as drawing lines, icons, text labels, colour indicators, and the like.
- the processing unit 22 is configured to transmit the processed and rendered image data 24 to the display 26 in real-time.
- a display controller (not shown) is provided for managing the transfer of the image data 24 to the display 26.
- the display controller (not shown) is configured to adjust resolution of the image data 24 to match the display’s resolution.
- the display controller (not shown) is integrated into the processing unit 22. Alternatively, the display controller may form part of a separate unit.
- the image data 24 is in the form of pixel information which includes colour, brightness, and position.
- the image data 24 is in the form of any one or more of the group including, but not limited to, an image, 3D object, and text.
- the image data 24 corresponds to a visual representation of any one or more of the group including the position-related data 14 of surrounding objects, vehicles, equipment and machinery 13 and 15, the communication data 18 transmitted by the remote operators 20 of vehicles, equipment and machinery 15, and additional data 32 relating to the vehicles, equipment and machinery 15 being operated by remote operators 20.
- the image data 24 includes any other system variables which can be dependent on user requirements or preferences.
- the visual representation includes any one or more of the group including, but not limited to, a map, graphical overlay, colour coding, real-time updates, warnings or alerts, user interface elements, overlay on live video feed, and data layers.
- the colour coding can correspond to a type of object, vehicle, or equipment.
- the type of object can be categorised according to any one or more of the group including, but not limited to, size, movement, velocity, speed, acceleration, relative position, fleet number, communication channels utilised by the object, machinery type, equipment type, and vehicle type.
- the display 26 is in the form of any suitable device capable of displaying information.
- the display 26 is in the form of a monitor or screen which includes any one or more of the group including, but not limited to, LCD, OLED, LED and touchscreen.
- the display 26 is arranged in communication, wired or wireless, with the processing unit 22.
- a plurality of displays 26 can be provided for displaying multiple types of information.
- the user 28 is in the form of a worker or an operator of vehicles, machinery or equipment 15. It is, of course, to be appreciated that the detection system in accordance with the invention is not limited to the precise constructional and functional details as hereinbefore described with reference to the accompanying drawings and which may be varied as desired.
- Enhanced Safety and Risk Mitigation: The accurate positioning and communication data provided by the system aid in avoiding potential collisions and operational hazards. By alerting users to the proximity of other objects and communication channels in use, it contributes to preventing damage to equipment, reducing the risk of user injury, and minimising unnecessary repair and liability costs.
- the system offers visual representations of object positions and communication frequencies through its display interfaces. This includes real-time updates, graphical overlays, and colour-coded indicators that enhance user awareness and facilitate quick decision-making.
- the detection system enhances operational efficiency by providing comprehensive situational awareness. It enables better coordination among workers and equipment, optimizes workflow, and supports safe and efficient operations in environments where traditional monitoring methods may be insufficient.
Abstract
A detection system (10) which includes a first receiving unit (12) for receiving position-related data (14) of surrounding objects, a second receiving unit (16) for receiving communication data (18) transmitted by a remote operator (20), a processing unit (22) arranged in communication with the first and second receiving units (12) and (16) for analysing and manipulating the position-related data (14) and the communication data (18) to generate image data (24) relating thereto, and a display (26) arranged in communication with the processing unit (22) for displaying the image data (24) to a user (28).
Description
A DETECTION SYSTEM
TECHNICAL FIELD
This invention relates to a detection system. In particular, this invention relates to a detection system for detecting proximity of and communication channels used by surrounding vehicles, machinery and equipment.
SUMMARY OF THE INVENTION
According to the invention, there is provided a detection system including:
- a first receiving unit for receiving position-related data of surrounding objects;
- a second receiving unit for receiving communication data transmitted by a remote operator;
- a processing unit arranged in communication with the first and second receiving units for analysing and manipulating the position-related data and the communication data to generate image data relating thereto; and
- a display arranged in communication with the processing unit for displaying the image data to a user.
The first receiving unit may be configured to receive the position-related data from communication systems associated with surrounding objects which may be in the form of vehicles, equipment and/or machinery. It is to be appreciated that the communication systems associated with vehicles, equipment and/or machinery may be configured to transmit a signal containing position-related data.
The position-related data may include signal transmission time. In this form, a signal receipt time may be recorded for allowing distance to the surrounding object to be calculated using a time of flight of the signal. The signal receipt time may be obtained by a clock source or time-stamping unit. The signal receipt time may be stored in a memory unit. The clock source and/or memory unit may be incorporated into the first receiving unit and/or the processing unit. A synchronising means may be
provided for synchronising the clock source used to obtain the signal receipt time and a clock source of the communication systems associated with vehicles, equipment and/or machinery. The synchronising means may be GPS- or network-based. It is to be appreciated that the respective clock sources may be synchronised via a network protocol such as NTP (Network Time Protocol). In an alternative form, the synchronising means may incorporate TWT (Two-Way Time Transfer) for allowing a back-and-forth signal exchange for clock synchronisation. In an alternative form of the invention, time stamps may be exchanged in order to determine average error which may be used to correct and/or account for any clock drifts or offsets.
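By way of a non-limiting illustration, the time-of-flight distance calculation and the two-way timestamp exchange described above may be sketched as follows. The function names, the timestamp values, and the assumption of radio propagation at the speed of light are illustrative only and do not form part of the invention.

```python
# Illustrative sketch only: time-of-flight ranging and a two-way
# clock-offset estimate. All names and values are hypothetical.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second, assuming a radio signal


def tof_distance(transmit_time, receipt_time, clock_offset=0.0):
    """Distance from signal time of flight, corrected for a known offset
    between the receiver's clock and the transmitter's clock."""
    flight_time = (receipt_time - clock_offset) - transmit_time
    return flight_time * SPEED_OF_LIGHT


def two_way_offset(t1, t2, t3, t4):
    """Clock offset from a two-way timestamp exchange (as in NTP):
    t1 = request sent (local clock), t2 = request received (remote clock),
    t3 = reply sent (remote clock), t4 = reply received (local clock)."""
    return ((t2 - t1) + (t3 - t4)) / 2.0


# A signal sent at t = 0 s and received 1 microsecond later has
# travelled roughly 300 metres.
print(round(tof_distance(0.0, 1e-6)))  # 300
```

The two-way form cancels the (assumed symmetric) propagation delay, leaving only the clock offset, which may then be applied as the correction term in the distance calculation.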
The position-related data may include any one or more of the group including, but not limited to, geographic co-ordinates, GPS co-ordinates, distance to nearby equipment, machinery or operators, and historical position data. The position-related data may include uncertainty or error metrics. The co-ordinates may be in the form of Cartesian and/or polar co-ordinates which may be represented in two or three dimensions.
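As a non-limiting aside, conversion between the Cartesian and polar co-ordinate forms mentioned above may be sketched as follows (two-dimensional case only; the function names are hypothetical):

```python
# Illustrative 2-D co-ordinate conversions; names are hypothetical.
import math


def polar_to_cartesian(r, theta):
    """Convert polar co-ordinates (range, bearing in radians) to
    Cartesian X-Y co-ordinates."""
    return (r * math.cos(theta), r * math.sin(theta))


def cartesian_to_polar(x, y):
    """Convert Cartesian X-Y co-ordinates to (range, bearing)."""
    return (math.hypot(x, y), math.atan2(y, x))


# An object 5 m away on the zero bearing lies at (5, 0) in X-Y terms.
print(polar_to_cartesian(5.0, 0.0))  # (5.0, 0.0)
```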
The first receiving unit may be configured to receive additional data from communication systems associated with vehicles, equipment and/or machinery. The additional data relating to the vehicles, equipment and/or machinery may include identification information. The identification information may include any one or more of the group including, but not limited to, fleet number, identification number, ownership information, regulatory information, machine type, equipment name, serial number, department information, division information, job or task information, and asset tag number. The additional data relating to the vehicles, equipment and/or machinery may include operator information. The operator information may include any one or more of the group including, but not limited to, operator name or identifier, license number, certification information, and shift information. The additional data relating to the vehicles, equipment and/or machinery may include status or operational information. The status or operational information may include any one or more of the group including, but not limited to, operating mode, power status, fuel level, battery level, error codes, diagnostic information, maintenance schedule, operating
temperature, operating pressures, operating environment information, emergency information, and velocity information. The operating environment information may include any one or more of the group including, but not limited to, temperature, humidity, noise levels, vibration levels, pressure readings, altitude, particulate levels, air quality, gas levels, surface stability, surface slopes, tilt or inclination level, radiation levels, and corrosive or chemical material levels.
The first receiving unit may be configured to receive the position-related data from a proximity sensor. The first receiving unit may be arranged in communication with the proximity sensor. Alternatively, the proximity sensor may be incorporated into the first receiving unit or integrated into a single unit. The proximity sensor may be in the form of a signal transceiver. The signal transceiver may be configured to transmit and receive a signal which may be in the form of a sound wave and/or an electromagnetic signal. The sound wave may be associated with SONAR (Sound Navigation and Ranging), which may utilise a frequency in the range of 1 kHz to 5 MHz. The electromagnetic signal may have any frequency associated with the electromagnetic spectrum. Preferably, the frequency may be associated with any of the group including radio, infrared, visible light, and ultraviolet. Further preferably, the signal may be associated with LiDAR (Light Detection and Ranging) which may utilise near-infrared wavelengths in the range of 700 nm to 1 550 nm, or eye-safe infrared wavelengths of about 1 550 nm. The position-related data received from the proximity sensor may relate to surrounding objects which may be in the form of any one or more of the group including, but not limited to, equipment, vehicles, machinery, infrastructure, geological features, safety features, supplies, and materials. The machinery may include any one or more of the group including, but not limited to, excavators, bulldozers, dump trucks, loaders, drilling rigs, haulage vehicles, crushers, screens, conveyors, shovels, rock drills, dredges, ventilation systems, processing units and/or plants, support machinery, exploration tools, pumps, winches, and safety equipment.
The infrastructure may include any one or more of the group including, but not limited to, shafts, tunnels, haulage systems, power supplies, water management systems, safety facilities, ore handling and processing facilities, personnel facilities, environmental controls, safety barriers, and communication systems.
The first receiving unit may be configured to receive the position-related data from communication systems associated with surrounding objects which may be in the form of vehicles, equipment and/or machinery and from a proximity sensor.
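By way of a non-limiting illustration of the echo-ranging principle underlying such a proximity sensor, the distance to a reflecting object may be derived from the round-trip time of a transmitted pulse. The sketch below is hypothetical, with indicative propagation speeds for SONAR and LiDAR signals.

```python
# Illustrative sketch only; names and speeds are indicative.
SPEED_OF_SOUND = 343.0          # m/s in air at about 20 degrees C (SONAR)
SPEED_OF_LIGHT = 299_792_458.0  # m/s (LiDAR)


def echo_distance(round_trip_time, propagation_speed):
    """Distance to a reflecting object from the round-trip time of a
    transmitted pulse; the pulse travels out and back, hence the halving."""
    return propagation_speed * round_trip_time / 2.0


# A sound pulse whose echo returns after 2 s reflects off an object
# 343 m away.
print(echo_distance(2.0, SPEED_OF_SOUND))  # 343.0
```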
The second receiving unit may be configured to receive the communication data from communication signals transmitted by communication systems of vehicles, equipment, and/or machinery being operated by the remote operator, preferably being configured to receive the communication data from transmissions from a plurality of remote operators. The second receiving unit may include an antenna or aerial for receiving or intercepting the communication signals. The second receiving unit may be configured to receive the additional data from communication systems associated with vehicles, equipment and/or machinery.
The first and second receiving units may be separate units. Alternatively, the first and second receiving units may be integrated into a single receiving unit.
A signal tuner may be provided for allowing the second receiving unit to tune into a desired frequency. The signal tuner may form part of the second receiving unit and/or the processing unit.
A filter may be provided for filtering frequencies received from the remote operator. The filter may be in the form of an RF (Radio Frequency) filter. It is to be appreciated that the filter facilitates frequency or channel isolation by blocking unwanted frequencies. The filter may form part of the second receiving unit and/or the processing unit.
An oscilloscope may be provided for analysing waveform characteristics of the communication signals received from the remote operator. The oscilloscope may form part of the second receiving unit and/or the processing unit.
A frequency analyser may be provided for analysing frequencies being received from the remote operator. The analyser may be in the form of a spectrum analyser. The frequency analyser may form part of the first receiving unit, second receiving unit and/or the processing unit. Alternatively, the frequency analyser may form part of a separate unit which may be arranged in communication with the first receiving unit, second receiving unit and/or the processing unit.
A frequency counter may be provided for measuring frequency of received communication signals for facilitating identification of precise frequencies being transmitted by a particular operator. The frequency counter may form part of the first receiving unit, second receiving unit and/or the processing unit. Alternatively, the frequency counter may form part of a separate unit which may be arranged in communication with the first receiving unit, second receiving unit and/or the processing unit.
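A minimal software sketch of the frequency-counting principle (counting zero crossings of a sampled signal) is given below; it is illustrative only and assumes a clean sinusoidal input, whereas a practical frequency counter would be a dedicated hardware or SDR component.

```python
# Illustrative zero-crossing frequency estimate; names are hypothetical.
import math


def estimate_frequency(samples, sample_rate):
    """Rough frequency estimate by counting zero crossings, as a simple
    frequency counter might; each full cycle yields two crossings."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0.0) != (b < 0.0)
    )
    duration = (len(samples) - 1) / sample_rate
    return crossings / (2.0 * duration)


# A 1 kHz test tone sampled at 48 kHz for one second estimates close
# to 1 000 Hz.
tone = [math.sin(2 * math.pi * 1000 * (n + 0.5) / 48000) for n in range(48000)]
print(round(estimate_frequency(tone, 48000.0)))
```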
A software-based tuner may be employed for tuning into various frequencies and/or decoding various signal types. The software-based tuner may be based on SDR (Software-Defined Radio). The software-based tuner may form part of the first receiving unit, second receiving unit and/or the processing unit. Preferably, the software-based tuner may form part of the processing unit.
A communication protocol analyser may be provided for facilitating identification of data being transmitted by the remote operators. The communication protocol analyser may form part of the first receiving unit, second receiving unit and/or the processing unit. Preferably, the communication protocol analyser may form part of the processing unit.
An encryption decoder may be provided for decoding signals received by the first receiving unit and/or the second receiving unit. The encryption decoder may form part of the first receiving unit, second receiving unit and/or the processing unit. Preferably, the encryption decoder may form part of the processing unit.
A condition sensor may be provided for sensing environmental conditions in the vicinity of the user which may influence signal and/or data analysis. The environmental conditions may include any one or more of the group including, but not limited to, temperature, humidity, and pressure. The environmental conditions may further include a measure of a quantity of particles and/or impurities contained in the air surrounding the user, such as smoke, fog, dust, and rain.
The processing unit may form part of any computing device of the group including, but not limited to, desktop computers, laptops, tablets, smartphones, servers, workstations, embedded systems, mainframes, supercomputers, wearables, smart home devices, gaming consoles, thin clients, and edge devices. Preferably, the processing unit may form part of an embedded computer system which may be utilised in mobile machinery such as vehicles or equipment mounted on mobile platforms. Alternatively, the processing unit may form part of a smart device or tablet, which may be utilised by a worker. The processing unit may be configured to produce the image data in real time. It is to be appreciated that producing the image data and showing an image to the user in real time allows effective monitoring of positions of objects and communication signals used by remote operators.
The processing unit may be configured to analyse the received position-related data, the communication data, and/or the additional data to determine which actions are to be performed to generate the image data. It is to be appreciated that the additional data relates to the vehicles, equipment and/or machinery being operated by remote operators.
The processing unit may be configured to filter the received data to remove any irrelevant or corrupt information. The processing unit may be configured to parse the received data to identify relevant portions. The processing unit may be configured to transform the received data into a format which may be interpreted by the display, the format preferably being structured to allow the data to be represented by pixels. For
example, the processing unit may convert raw co-ordinate data into X-Y positions on a graphical map or turn signal strength into visual bars or indicators. The processing unit may be configured to perform various calculations on the received data for calculating desired system outputs to be displayed to the user. For example, calculations may include determining distances to objects, vehicles, machinery and equipment or applying algorithms to overlay map information or to represent communication frequencies to which surrounding communication systems are set. The processing unit may be configured to combine processed data into a coherent set of information which may be displayed to the user. For example, map data, operator locations, communication channel data, and additional data, are merged into a single data set, representing spatial relationships and current communication data. The processing unit may be configured to plot the processed data on a map or graphical layout which may display the locations of the user, surrounding operators, vehicles, equipment and machinery, distances thereto, and the communication data which may be displayed as icons or text. The processing unit may be configured to transform and/or render the processed data into the image data to be displayed by the display. The image data may be in the form of pixels, vectors or layers. It is to be appreciated that this step may involve assigning visual properties to the data, such as drawing lines, icons, text labels, colour indicators, and the like. The processing unit may be configured to transmit the processed and rendered image data to the display in real-time.
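The transformation step described above may be sketched in a purely illustrative form as follows; the world size, display dimensions, and signal-strength bounds are hypothetical values chosen for the example.

```python
# Illustrative sketch only: mapping world co-ordinates to pixel
# positions, and a received signal strength to a bar indicator.

def to_screen(x, y, world_size, screen_w, screen_h):
    """Map world co-ordinates in [0, world_size) to X-Y pixel positions,
    clamped to the display bounds."""
    px = int(x / world_size * screen_w)
    py = int(y / world_size * screen_h)
    return (min(max(px, 0), screen_w - 1), min(max(py, 0), screen_h - 1))


def strength_bars(strength_db, floor_db=-100.0, ceiling_db=-30.0, bars=5):
    """Render a signal strength (in dBm) as a simple text bar indicator."""
    fraction = (strength_db - floor_db) / (ceiling_db - floor_db)
    filled = round(min(max(fraction, 0.0), 1.0) * bars)
    return "#" * filled + "-" * (bars - filled)


# An object at (500, 250) in a 1 km square maps to pixel (400, 150) on
# an 800 x 600 display; a -30 dBm signal shows full bars.
print(to_screen(500.0, 250.0, 1000.0, 800, 600))  # (400, 150)
print(strength_bars(-30.0))  # #####
```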
A display controller may be provided for managing the transfer of the image data to the display. The display controller may be configured to adjust resolution of the image data to match the display’s resolution. The display controller may be integrated into the processing unit. Alternatively, the display controller may form part of a separate unit.
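The resolution-matching function of such a display controller may be illustrated by a nearest-neighbour rescaling sketch; this is hypothetical, and a hardware controller would typically implement the operation differently.

```python
# Illustrative nearest-neighbour rescale of image data to the display's
# resolution; names are hypothetical.

def scale_nearest(pixels, out_w, out_h):
    """Rescale a 2-D grid of pixel values to out_w x out_h by sampling
    the nearest source pixel for each destination pixel."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]


# Upscaling a 2 x 2 image to 4 x 4 repeats each pixel in a 2 x 2 block.
print(scale_nearest([[1, 2], [3, 4]], 4, 4))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```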
The image data may be in the form of pixel information which may include colour, brightness, and position. The image data may be in the form of any one or more of the group including, but not limited to, an image, 3D object, and text. The image data may correspond to a visual representation of any one or more of the group
including the position-related data of surrounding objects, vehicles, equipment and machinery, the communication data transmitted by the remote operators of vehicles, equipment and machinery, and additional data relating to the vehicles, equipment and/or machinery being operated by remote operators. The image data may include any other system variables which may be dependent on user requirements or preferences.
The visual representation may include any one or more of the group including, but not limited to, a map, graphical overlay, colour coding, real-time updates, warnings or alerts, user interface elements, overlay on live video feed, and data layers. The colour coding may correspond to a type of object, vehicle, equipment or machinery. The type of object may be categorised according to any one or more of the group including, but not limited to, size, movement, velocity, speed, acceleration, relative position, fleet number, communication channels utilised by the object, machinery type, equipment type, and vehicle type.
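A colour-coding scheme of this kind may be sketched as a simple lookup; the categories and colours below are hypothetical examples only and do not limit the invention.

```python
# Illustrative mapping from object category to display colour.
OBJECT_COLOURS = {
    "haul_truck": "orange",
    "excavator": "yellow",
    "light_vehicle": "green",
    "personnel": "red",
    "infrastructure": "grey",
}


def colour_for(object_type):
    """Colour indicator for a detected object; unknown categories fall
    back to a neutral colour."""
    return OBJECT_COLOURS.get(object_type, "white")


print(colour_for("excavator"))  # yellow
```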
The display may be in the form of any suitable device capable of displaying information. The display may be in the form of a monitor or screen which may include any one or more of the group including, but not limited to, LCD, OLED, LED and touchscreen. The display may be arranged in communication, wired or wireless, with the processing unit. A plurality of displays may be provided for displaying multiple types of information.
The user may be in the form of a worker or an operator of vehicles, machinery or equipment.
BRIEF DESCRIPTION OF THE DRAWINGS
A detection system in accordance with the invention will now be described by way of the following, non-limiting examples with reference to the accompanying drawing.
In the drawing: -
Figure 1 is a schematic diagram showing a general overview of the detection system.
DETAILED DESCRIPTION OF THE INVENTION
Referring now to Figure 1, reference numeral 10 refers generally to a detection system in accordance with the present invention. The detection system 10 includes a first receiving unit 12 for receiving position-related data 14 of surrounding objects 13, a second receiving unit 16 for receiving communication data 18 transmitted by a remote operator 20, a processing unit 22 arranged in communication with the first and second receiving units 12 and 16 for analysing and manipulating the position-related data 14 and the communication data 18 to generate image data 24 relating thereto, and a display 26 arranged in communication with the processing unit 22 for displaying the image data 24 to a user 28.
In one form of the invention, the first receiving unit 12 is configured to receive the position-related data 14 from communication systems 30 associated with surrounding objects in the form of vehicles, equipment and machinery 15. It is to be appreciated that the communication systems 30 associated with vehicles, equipment and machinery 15 are configured to transmit signals containing the position-related data 14.
In this form, the position-related data 14 includes signal transmission time. A signal receipt time is recorded for allowing distance to the surrounding object 13 to be calculated using a time of flight of the signals. The signal receipt time is obtained by a clock source or time-stamping unit (not shown). The signal receipt time is stored in a memory unit (not shown). The clock source and memory unit (not shown) are incorporated into either or both of the first receiving unit 12 and the processing unit 22. A synchronising means (not shown) is provided for synchronising the clock source (not shown) used to obtain the signal receipt time and a clock source (not shown) of the communication systems 30 associated with vehicles, equipment and machinery 15.
The synchronising means (not shown) is GPS- or network-based. It is to be appreciated that the respective clock sources (not shown) can be synchronised via a network protocol such as NTP (Network Time Protocol). In an alternative form, the synchronising means (not shown) can incorporate TWT (Two-Way Time Transfer) for allowing a back-and-forth signal exchange for clock synchronisation. As a further alternative, time stamps can be exchanged in order to determine average error which is used to correct and/or account for any clock drifts or offsets.
The position-related data 14 includes any one or more of the group including, but not limited to, geographic co-ordinates, GPS co-ordinates, distance to nearby vehicles, equipment, or machinery 15, and historical position data. The position-related data 14 includes uncertainty or error metrics. The co-ordinates can be in the form of Cartesian and/or polar co-ordinates which can be represented in two or three dimensions.
The first receiving unit 12 is configured to receive additional data 32 from communication systems 30 associated with vehicles, equipment and machinery. The additional data 32 relating to the vehicles, equipment and machinery 15 includes identification information (not shown). The identification information (not shown) includes any one or more of the group including, but not limited to, fleet number, identification number, ownership information, regulatory information, machine type, equipment name, serial number, department information, division information, job or task information, and asset tag number. The additional data 32 relating to the vehicles, equipment and machinery 15 includes operator information (not shown). The operator information (not shown) includes any one or more of the group including, but not limited to, operator name or identifier, license number, certification information, and shift information. The additional data 32 relating to the vehicles, equipment and machinery 15 includes status or operational information (not shown). The status or operational information (not shown) includes any one or more of the group including, but not limited to, operating mode, power status, fuel level, battery level, error codes, diagnostic information, maintenance schedule, operating temperature, operating pressures, operating environment information, emergency information, and velocity information.
The operating environment information (not shown) includes any one or more of the group including, but not limited to, temperature, humidity, noise levels, vibration levels, pressure readings, altitude, particulate levels, air quality, gas levels, surface stability, surface slopes, tilt or inclination level, radiation levels, and corrosive or chemical material levels.
In another embodiment of the invention, the first receiving unit 12 is configured to receive the position-related data 14 from a proximity sensor 34. The first receiving unit 12 can be arranged in communication with the proximity sensor 34. Alternatively, the proximity sensor 34 can be incorporated into the first receiving unit 12 or integrated into a single unit. The proximity sensor 34 is in the form of a signal transceiver (not shown). The signal transceiver (not shown) is configured to transmit and receive a signal which is in the form of a sound wave and/or an electromagnetic signal. The sound wave is associated with SONAR (Sound Navigation and Ranging), which utilises a frequency in the range of 1 kHz to 5 MHz. The electromagnetic signal has a frequency associated with any of the group including radio, infrared, visible light, and ultraviolet. In a preferred embodiment, the signal is associated with LiDAR (Light Detection and Ranging) which utilises near-infrared wavelengths in the range of 700 nm to 1 550 nm, or eye-safe infrared wavelengths of about 1 550 nm. The position-related data 14 received from the proximity sensor 34 relates to surrounding objects 13 in the form of any one or more of the group including, but not limited to, equipment, vehicles, machinery, infrastructure, geological features, safety features, supplies, and materials. The machinery includes any one or more of the group including, but not limited to, excavators, bulldozers, dump trucks, loaders, drilling rigs, haulage vehicles, crushers, screens, conveyors, shovels, rock drills, dredges, ventilation systems, processing units and/or plants, support machinery, exploration tools, pumps, winches, and safety equipment.
The infrastructure includes any one or more of the group including, but not limited to, shafts, tunnels, haulage systems, power supplies, water management systems, safety facilities, ore handling and processing facilities, personnel facilities, environmental controls, safety barriers, and communication systems.
The first receiving unit 12 can be configured to receive the position-related data 14 from communication systems 30 associated with surrounding objects in the form of vehicles, equipment and machinery 15 and from a proximity sensor 34.
The second receiving unit 16 is configured to receive the communication data 18 from communication signals transmitted by communication systems 30 of vehicles, equipment, or machinery 15 being operated by the remote operator 20, typically being configured to receive the communication data 18 from transmissions from a plurality of remote operators 20. The second receiving unit 16 includes an antenna or aerial (not shown) for receiving or intercepting the communication signals (not shown).
The first and second receiving units 12 and 16 can be separate units. Alternatively, the first and second receiving units 12 and 16 can be integrated into a single receiving unit (not shown).
A signal tuner (not shown) is provided for allowing the second receiving unit 16 to tune into a desired frequency. The signal tuner (not shown) can form part of the second receiving unit 16 or the processing unit 22.
A filter (not shown) is provided for filtering frequencies received from the remote operator 20. The filter (not shown) is in the form of an RF (Radio Frequency) filter. It is to be appreciated that the filter (not shown) facilitates frequency or channel isolation by blocking unwanted frequencies. The filter (not shown) can form part of the second receiving unit 16 or the processing unit 22.
An oscilloscope (not shown) is provided for analysing waveform characteristics of the communication signals (not shown) received from the remote operator 20. The oscilloscope (not shown) can form part of the second receiving unit 16 or the processing unit 22.
A frequency analyser (not shown) is provided for analysing frequencies being received from the remote operator 20. The analyser (not shown) is in the form of a spectrum analyser. The frequency analyser (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. Alternatively, the frequency analyser (not shown) can form part of a separate unit (not shown) which is arranged in communication with the first receiving unit 12, second receiving unit 16 and the processing unit 22.
A frequency counter (not shown) is provided for measuring frequency of received communication signals (not shown) for facilitating identification of precise frequencies being transmitted by a particular operator (not shown). The frequency counter (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. Alternatively, the frequency counter (not shown) can form part of a separate unit (not shown) which is arranged in communication with the first receiving unit 12, second receiving unit 16 and the processing unit 22.
A software-based tuner (not shown) is employed for tuning into various frequencies and/or decoding various signal types. The software-based tuner (not shown) is based on SDR (Software-Defined Radio). The software-based tuner (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. In a preferred embodiment, the software-based tuner (not shown) forms part of the processing unit 22.
A communication protocol analyser (not shown) is provided for facilitating identification of data being transmitted by the remote operators 20. The communication protocol analyser (not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. In a preferred embodiment, the communication protocol analyser (not shown) forms part of the processing unit 22.
An encryption decoder (not shown) is provided for decoding signals received by the first receiving unit 12 or the second receiving unit 16. The encryption decoder
(not shown) can form part of the first receiving unit 12, second receiving unit 16 or the processing unit 22. In a preferred embodiment, the encryption decoder (not shown) forms part of the processing unit 22.
A condition sensor (not shown) is provided for sensing environmental conditions in the vicinity of the user 28 which may influence signal and/or data analysis. The environmental conditions include any one or more of the group including, but not limited to, temperature, humidity, and pressure. The environmental conditions further include a measure of a quantity of particles and/or impurities contained in the air surrounding the user, such as smoke, fog, dust, and rain.
The processing unit 22 can form part of any computing device of the group including, but not limited to, desktop computers, laptops, tablets, smartphones, servers, workstations, embedded systems, mainframes, supercomputers, wearables, smart home devices, gaming consoles, thin clients, and edge devices. In a preferred embodiment, the processing unit 22 forms part of an embedded computer system (not shown) which is utilised in mobile machinery such as vehicles or equipment mounted on mobile platforms. Alternatively, the processing unit 22 can form part of a smart device or tablet, which is utilised by a worker. The processing unit 22 is configured to produce the image data 24 in real time. It is to be appreciated that producing the image data 24 and showing an image to the user 28 in real time allows effective monitoring of positions of objects and communication signals used by remote operators 20.
The processing unit 22 is configured to analyse the received position-related data 14, the communication data 18, and the additional data 32 to determine which actions are to be performed to generate the image data 24. It is to be appreciated that the additional data 32 relates to the vehicles, equipment and machinery 15 being operated by remote operators 20.
The processing unit 22 is configured to filter the received data to remove any irrelevant or corrupt information. The processing unit 22 is configured to parse the received data to identify relevant portions. The processing unit 22 is configured to transform the received data into a format which can be interpreted by the display 26, the format typically being structured to allow the data to be represented by pixels. For example, the processing unit 22 may convert raw co-ordinate data into X-Y positions on a graphical map or turn signal strength into visual bars or indicators.

The processing unit 22 is configured to perform various calculations on the received data for calculating desired system outputs to be displayed to the user 28. For example, calculations include determining distances to objects, vehicles, machinery and equipment and applying algorithms to overlay map information or to represent communication frequencies to which surrounding communication systems 30 are set.

The processing unit 22 is configured to combine processed data into a coherent set of information which is displayed to the user 28. For example, map data, operator locations, communication channel data, and additional data are merged into a single data set representing the spatial relationships and current communication data. The processing unit 22 is configured to plot the processed data on a map or graphical layout (not shown) which can display the locations of the user, surrounding operators, vehicles, equipment and machinery, distances thereto, and the communication data, which can be displayed as icons or text.

The processing unit 22 is configured to transform or render the processed data into the image data 24 to be displayed by the display 26. The image data 24 can be in the form of pixels, vectors or layers. It is to be appreciated that this step may involve assigning visual properties to the data, such as drawing lines, icons, text labels, colour indicators, and the like. The processing unit 22 is configured to transmit the processed and rendered image data 24 to the display 26 in real time.
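The filter, parse, transform, calculate, and combine steps described above can be sketched as a single pass over received records. All field names and units here are hypothetical, since the patent does not fix a data format:

```python
import math

def filter_records(records):
    """Filter step: drop corrupt or irrelevant entries (missing fields)."""
    return [r for r in records if r.get("id") and "x" in r and "y" in r]

def distance(user_xy, record):
    """Calculation step: planar distance from the user to a reported position."""
    return math.hypot(record["x"] - user_xy[0], record["y"] - user_xy[1])

def build_frame(records, comms, user_xy):
    """Combine step: merge positions and channel data into one data set
    structured for display (positions mapped to a pixel grid)."""
    frame = []
    for r in filter_records(records):
        frame.append({
            "id": r["id"],
            "screen_xy": (round(r["x"]), round(r["y"])),  # transform to pixels
            "distance_m": round(distance(user_xy, r), 1),
            "channel": comms.get(r["id"], "unknown"),
        })
    return frame

# One well-formed record and one corrupt record (no id); the corrupt
# record is filtered out before rendering.
frame = build_frame(
    [{"id": "truck-7", "x": 40.0, "y": 30.0}, {"x": 1.0, "y": 2.0}],
    {"truck-7": "ch-3 / 462.5 MHz"},
    (0.0, 0.0),
)
```

The resulting frame is the "coherent set of information" the description refers to: one structure per tracked object carrying position, distance, and channel together.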
A display controller (not shown) is provided for managing the transfer of the image data 24 to the display 26. The display controller (not shown) is configured to adjust resolution of the image data 24 to match the display’s resolution. The display controller (not shown) is integrated into the processing unit 22. Alternatively, the display controller may form part of a separate unit.
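Resolution matching of this kind is commonly done by resampling. The patent does not disclose the controller's actual logic, so the following is a minimal nearest-neighbour sketch:

```python
def scale_to_display(image, out_w, out_h):
    """Nearest-neighbour resample of a row-major pixel grid to the
    display's native resolution (toy display-controller behaviour)."""
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Upscale a 2x2 test pattern to a hypothetical 4x4 display.
scaled = scale_to_display([[0, 1], [2, 3]], 4, 4)
```

Nearest-neighbour is chosen here only for brevity; a real controller might filter (e.g. bilinear) to avoid blocky icons and text.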
The image data 24 is in the form of pixel information which includes colour, brightness, and position. The image data 24 is in the form of any one or more of the group including, but not limited to, an image, 3D object, and text. The image data 24 corresponds to a visual representation of any one or more of the group including the position-related data 14 of surrounding objects, vehicles, equipment and machinery, 13 and 15, the communication data 18 transmitted by the remote operators 20 of vehicles, equipment and machinery 15, and additional data 32 relating to the vehicles, equipment and machinery 15 being operated by remote operators 20. The image data 24 includes any other system variables which can be dependent on user requirements or preferences.
The visual representation (not shown) includes any one or more of the group including, but not limited to, a map, graphical overlay, colour coding, real-time updates, warnings or alerts, user interface elements, overlay on live video feed, and data layers. The colour coding (not shown) can correspond to a type of object, vehicle, equipment or machinery. The type of object can be categorised according to any one or more of the group including, but not limited to, size, movement, velocity, speed, acceleration, relative position, fleet number, communication channels utilised by the object, machinery type, equipment type, and vehicle type.
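A colour-coding rule of this kind reduces to a lookup keyed on object category, optionally escalated by a movement attribute. The palette, category names, and speed threshold below are hypothetical illustrations, not values from the patent:

```python
# Hypothetical palette: the patent leaves the actual colour scheme open.
PALETTE = {
    "haulage vehicle": "orange",
    "drilling rig": "red",
    "worker": "green",
}

def colour_for(object_type: str, speed_mps: float = 0.0) -> str:
    """Colour-code by object type, escalating fast movers to a warning hue."""
    if speed_mps > 8.0:          # hypothetical threshold: fast movers flagged
        return "flashing-red"
    return PALETTE.get(object_type, "grey")  # unknown types rendered grey
```

Categorising by type first and by movement second mirrors the description's list: type-based coding gives steady-state awareness, while velocity-based escalation supports the warning and alert elements.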
The display 26 is in the form of any suitable device capable of displaying information. The display 26 is in the form of a monitor or screen which includes any one or more of the group including, but not limited to, LCD, OLED, LED and Touchscreen. The display 26 is arranged in communication, wired or wireless, with the processing unit 22. A plurality of displays 26 can be provided for displaying multiple types of information.
The user 28 is in the form of a worker or an operator of vehicles, machinery or equipment 15.
It is, of course, to be appreciated that the detection system in accordance with the invention is not limited to the precise constructional and functional details as hereinbefore described with reference to the accompanying drawings and which may be varied as desired.
Although only certain embodiments of the invention have been described herein, it will be understood by any person skilled in the art that other modifications and variations of the invention are possible. Such modifications and variations are therefore to be considered as falling within the spirit and scope of the invention and hence form part of the invention as herein described and/or exemplified. It is further to be understood that the examples are provided to illustrate the invention further and to assist a person skilled in the art with understanding the invention, and are not meant to be construed as unduly limiting the reasonable scope of the invention.
The inventor believes that the detection system in accordance with the present invention offers several significant advantages:
1. Accurate, Real-Time Monitoring: The system provides precise and immediate monitoring capabilities for relative positioning data of objects such as equipment, vehicles, and personnel. By utilizing time-of-flight calculations enhanced with environmental condition sensing and GPS, it is believed to achieve high accuracy even in challenging conditions.
2. Enhanced Operational Efficiency in Low Visibility Environments: Even in underground mining environments where visibility is relatively poor, the system enables users to accurately track and locate surrounding objects, vehicles, equipment and machinery. This improves operational efficiency by facilitating better planning and coordination of activities.
3. Fleet and Channel Number Identification: The system facilitates the identification of fleet numbers and communication channel frequencies used by various users and equipment. By analysing communication signals, it allows effective management and coordination of multiple units operating simultaneously.
4. Seamless Integration with Existing Infrastructure: The detection system is designed to integrate seamlessly with existing infrastructure. This compatibility reduces the need for additional hardware installations and minimizes deployment time and costs, making it a practical solution for upgrading current operations.
5. Reduction of Communication Failures: By monitoring and analysing the communication frequencies utilized by equipment and personnel, the system reduces the likelihood of communication failures and interference. This proactive detection helps maintain reliable communication channels, which is critical for safety and coordination.
6. Enhanced Safety and Risk Mitigation: The accurate positioning and communication data provided by the system aid in avoiding potential collisions and operational hazards. By alerting users to the proximity of other objects and communication channels in use, it contributes to preventing damage to equipment, reducing the risk of user injury, and minimizing unnecessary repair and liability costs.
7. User-Friendly Visual Representation: The system offers visual representations of object positions and communication frequencies through its display interfaces. This includes real-time updates, graphical overlays, and color-coded indicators that enhance user awareness and facilitate quick decision-making.
8. Scalability and Flexibility: With the ability to detect various types of signals and integrate additional data such as operational and environmental information, the system is scalable and adaptable to different operational needs. It can be customized to suit specific requirements, making it versatile for various industrial applications.
9. Improved Environmental Awareness: By incorporating condition sensors that monitor atmospheric conditions like temperature, humidity, and particulate matter, the system accounts for environmental factors that may affect signal propagation. This leads to more reliable measurements and enhances overall system performance.
10. Contribution to Operational Efficiency: Overall, the detection system enhances operational efficiency by providing comprehensive situational awareness. It enables better coordination among workers and equipment, optimizes workflow, and supports safe and efficient operations in environments where traditional monitoring methods may be insufficient.
In summary, with the detection system's ability to provide accurate, real-time data on object positioning and communication frequencies, combined with its seamless integration and user-friendly interfaces, it is believed to be a valuable tool for improving safety, efficiency, and reliability in underground mining and other challenging environments.
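The time-of-flight principle behind the accuracy point above (and the transmission and receipt times used elsewhere in the description) reduces to distance = propagation speed × elapsed time. A sketch for an electromagnetic signal, assuming synchronised transmitter and receiver clocks, which the patent does not detail:

```python
C = 299_792_458.0  # speed of light in m/s for an electromagnetic signal

def one_way_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """Distance from one-way time of flight:
    d = c * (t_receive - t_transmit).
    Assumes the clocks are synchronised; real systems must correct for
    clock offset, which this sketch ignores."""
    return C * (t_receive_s - t_transmit_s)

# A 1 microsecond flight time corresponds to just under 300 m.
d = one_way_distance(0.0, 1e-6)
```

At these speeds, a clock error of even 100 ns shifts the estimate by about 30 m, which is why the environmental sensing and GPS cross-checks mentioned above matter for the claimed accuracy.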
Claims
1. A detection system including: a first receiving unit for receiving position-related data of surrounding objects; a second receiving unit for receiving communication data transmitted by a remote operator; a processing unit arranged in communication with the first and second receiving units for analysing and manipulating the position-related data and the communication data to generate image data relating thereto; and a display arranged in communication with the processing unit for displaying the image data to a user.
2. A detection system as claimed in claim 1 wherein the first receiving unit is configured to receive the position-related data from communication systems associated with surrounding objects in the form of vehicles, equipment or machinery.
3. A detection system as claimed in claim 1 or 2 wherein the position-related data includes signal transmission time.
4. A detection system as claimed in claim 3 wherein a signal receipt time is recorded for allowing distance to the surrounding object to be calculated using a time of flight of the signal.
5. A detection system as claimed in any one or more of the preceding claims wherein the position-related data includes any one or more of the group including, but not limited to, geographic co-ordinates, GPS co-ordinates, distance to nearby equipment, machinery or operators, and historical position data.
6. A detection system as claimed in any one or more of the preceding claims wherein the first receiving unit is configured to receive additional data from communication systems associated with vehicles, equipment or machinery.
7. A detection system as claimed in claim 6 wherein the additional data relating to the vehicles, equipment or machinery includes identification information.
8. A detection system as claimed in claim 7 wherein the identification information includes any one or more of the group including, but not limited to, fleet number, identification number, ownership information, regulatory information, machine type, equipment name, serial number, department information, division information, job or task information, and asset tag number.
9. A detection system as claimed in any one or more of claims 6 to 8 wherein the additional data relating to the vehicles, equipment or machinery includes operator information.
10. A detection system as claimed in claim 9 wherein the operator information includes any one or more of the group including, but not limited to, operator name or identifier, license number, certification information, and shift information.
11. A detection system as claimed in any one or more of the claims 6 to 10 wherein the additional data relating to the vehicles, equipment and/or machinery includes status or operational information.
12. A detection system as claimed in claim 11 wherein the status or operational information includes any one or more of the group including, but not limited to, operating mode, power status, fuel level, battery level, error codes, diagnostic information, maintenance schedule, operating temperature, operating pressures, operating environment information, emergency information, and velocity information.
13. A detection system as claimed in claim 12 wherein the operating environment information includes any one or more of the group including, but not limited to, temperature, humidity, noise levels, vibration levels, pressure readings, altitude, particulate levels, air quality, gas levels, surface stability, surface slopes, tilt or inclination level, radiation levels, and corrosive or chemical material levels.
14. A detection system as claimed in any one or more of the preceding claims wherein the first receiving unit is configured to receive the position-related data from a proximity sensor.
15. A detection system as claimed in claim 14 wherein the proximity sensor is in the form of a signal transceiver.
16. A detection system as claimed in claim 15 wherein the signal transceiver is configured to transmit and receive a signal which is in the form of a sound wave or an electromagnetic signal.
17. A detection system as claimed in any one or more of the claims 14 to 16 wherein the position-related data received from the proximity sensor relates to surrounding objects in the form of any one or more of the group including, but not limited to, equipment, vehicles, machinery, infrastructure, geological features, safety features, supplies, and materials.
18. A detection system as claimed in claim 17 wherein the machinery includes any one or more of the group including, but not limited to, excavators, bulldozers, dump trucks, loaders, drilling rigs, haulage vehicles, crushers, screens, conveyors, shovels, rock drills, dredges, ventilation systems, processing units and or plants, support machinery, exploration tools, pumps, winches, and safety equipment.
19. A detection system as claimed in claim 17 wherein the infrastructure includes any one or more of the group including, but not limited to, shafts, tunnels, haulage systems, power supplies, water management systems, safety facilities, ore handling and processing facilities, personnel facilities, environmental controls, safety barriers, and communication systems.
20. A detection system as claimed in any one or more of the preceding claims wherein the first receiving unit is configured to receive the position-related data from communication systems associated with surrounding objects which are in the form of vehicles, equipment or machinery and from a proximity sensor.
21. A detection system as claimed in any one or more of the preceding claims wherein the second receiving unit is configured to receive the communication data from communication signals transmitted by communication systems of vehicles, equipment, and machinery being operated by the remote operator.
22. A detection system as claimed in any one or more of the preceding claims wherein the first and second receiving units are integrated into a single receiving unit.
23. A detection system as claimed in any one or more of the preceding claims wherein a communication protocol analyser is provided for facilitating identification of data being transmitted by the remote operators.
24. A detection system as claimed in claim 23 wherein the communication protocol analyser forms part of the first receiving unit, second receiving unit or the processing unit.
25. A detection system as claimed in any one or more of the preceding claims wherein a condition sensor is provided for sensing environmental conditions in the vicinity of the user which may influence signal or data analysis.
26. A detection system as claimed in claim 25 wherein the environmental conditions include any one or more of the group including, but not limited to, temperature, humidity, and pressure.
27. A detection system as claimed in claim 25 or 26 wherein the environmental conditions include a measure of a quantity of particles or impurities contained in the air surrounding the user.
28. A detection system as claimed in any one or more of the preceding claims wherein the processing unit forms part of any computing device of the group including, but not limited to, desktop computers, laptops, tablets, smartphones, servers, workstations, embedded systems, mainframes, supercomputers, wearables, smart home devices, gaming consoles, thin clients, and edge devices.
29. A detection system as claimed in any one or more of the preceding claims wherein the processing unit is configured to produce the image data in real-time.
30. A detection system as claimed in claim 28 or 29 wherein the processing unit is configured to analyse the received position-related data, the communication data, and the additional data to determine which actions are to be performed to generate the image data.
31. A detection system as claimed in claim 30 wherein the processing unit is configured to filter the received data to remove any irrelevant or corrupt information.
32. A detection system as claimed in claim 30 or 31 wherein the processing unit is configured to parse the received data to identify relevant portions.
33. A detection system as claimed in any one or more of claims 30 to 32 wherein the processing unit is configured to transform the received data into a format which can be interpreted by the display.
34. A detection system as claimed in any one or more of claims 30 to 33 wherein the processing unit is configured to perform various calculations on the received data for calculating desired system outputs to be displayed to the user.
35. A detection system as claimed in any one or more of the claims 30 to 34 wherein the processing unit is configured to combine processed data into a coherent set of information which is displayed to the user.
36. A detection system as claimed in claim 35 wherein the processing unit is configured to plot the processed data on a map or graphical layout which displays the locations of the user, surrounding operators, vehicles, equipment and machinery, distances thereto, and the communication data.
37. A detection system as claimed in claim 36 wherein the processing unit is configured to transform and render the processed data into the image data to be displayed by the display.
38. A detection system as claimed in claim 37 wherein the processing unit is configured to transmit the processed and rendered image data to the display in real-time.
39. A detection system as claimed in any one or more of the preceding claims wherein the image data corresponds to a visual representation of any one or more of the group including the position-related data of surrounding objects, vehicles, equipment and machinery, the communication data transmitted by the remote operators of vehicles, equipment and machinery, and additional data relating to the vehicles, equipment and machinery being operated by remote operators.
40. A detection system as claimed in claim 39 wherein the visual representation includes any one or more of the group including, but not limited to, a map, graphical overlay, colour coding, real-time updates, warnings or alerts, user interface elements, overlay on live video feed, and data layers.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| ZA202309068 | 2023-10-18 | ||
| ZA2023/09068 | 2023-10-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025085941A1 true WO2025085941A1 (en) | 2025-04-24 |
Family
ID=95449351
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/ZA2024/050056 Pending WO2025085941A1 (en) | 2023-10-18 | 2024-10-16 | A detection system |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025085941A1 (en) |
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110184617A1 (en) * | 2008-05-21 | 2011-07-28 | Adc Automotive Distance Control Systems Gmbh | Driver assistance system for avoiding collisions of a vehicle with pedestrians |
| US10156848B1 (en) * | 2016-01-22 | 2018-12-18 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle routing during emergencies |
| US10395332B1 (en) * | 2016-01-22 | 2019-08-27 | State Farm Mutual Automobile Insurance Company | Coordinated autonomous vehicle automatic area scanning |
| US11625802B1 (en) * | 2016-01-22 | 2023-04-11 | State Farm Mutual Automobile Insurance Company | Coordinated autonomous vehicle automatic area scanning |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24880792 Country of ref document: EP Kind code of ref document: A1 |