AU2024225595A1 - Ground hazard detection
- Publication number
- AU2024225595A1
- Authority
- AU
- Australia
- Prior art keywords
- ground
- vehicle
- hazard
- objects
- falling object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0145—Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/096758—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where no selection takes place on the transmitted or the received information
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/20—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
- G08G1/207—Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles with respect to certain areas, e.g. forbidden or allowed areas with possible alerting when inside or outside boundaries
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/30—Noise filtering
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Atmospheric Sciences (AREA)
- Electromagnetism (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
Disclosed is a method of detecting a ground hazard for a vehicle. The method includes receiving a three dimensional point cloud from a scan for a region around the vehicle, the point cloud being formed using distance information from at least one sensor mounted to the vehicle. The method also includes estimating at least one surface in the point cloud and determining a plurality of objects in the scan data located on the at least one surface. The method also includes classifying at least one of the plurality of objects as a ground hazard according to a physical property of the at least one object, the at least one object being classified as a ground hazard because of a risk of damage to a tyre.
Description
GROUND HAZARD DETECTION
Technical Field
[001] The present invention generally relates to ground hazard detection and more particularly to ground hazard detection for surface vehicles.
Background
[002] Many vehicles operate in rugged and difficult environments, such as mining, construction, earthmoving and industrial sites. Such environments can have road surfaces with objects that are capable of causing significant damage to tyres of vehicles using the roads. One example of such a vehicle is a mining truck used to haul ore that operates in an above ground mine, where the road may have large rocks and stones capable of damaging the tyres on the mining truck.
[003] When the tyres of mining trucks contact large/sharp rocks on haul road surfaces, significant tyre damage can result. Some mine operators can replace costly tyres at double the rate that would otherwise be expected if the equipment was operated on roads without such rocks. Changing tyres on mining vehicles is an expensive process, with costs involved in purchasing new tyres, labour to replace the tyres as well as reduced availability of the vehicle as the tyres are changed. Changing tyres on large vehicles, such as a mining truck, may take an entire 12-hour shift during which the mining truck is unavailable, resulting in lost utility of the truck and underutilisation of other resources such as a truck driver. Further, damage to tyres may be a safety concern with issues such as loss of vehicle control and the danger posed by a tyre when it explosively deflates.
[004] The preferred embodiments of the present invention seek to address at least one of these disadvantages, to provide the public with a useful innovation.
[005] The reference in this specification to any prior publication (or information derived from the prior publication), or to any matter which is known, is not, and should not be taken as an acknowledgement or admission or any form of suggestion that the prior publication (or information derived from the prior publication) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
Summary
[006] This Summary is provided to introduce a selection of concepts in a simplified form which will be elaborated upon below in the Detailed Description. This Summary is not intended to identify essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[007] One embodiment includes a method of detecting a ground hazard for a vehicle, the method comprising: receiving a three dimensional point cloud from a scan for a region around the vehicle, the point cloud being formed using distance information from at least one sensor mounted to the vehicle; estimating at least one surface in the point cloud; determining a plurality of objects in the scan data located on the at least one surface; and classifying at least one of the plurality of objects as a ground hazard according to a physical property of the at least one object, the at least one object being classified as a ground hazard because of a risk of damage to a tyre.
[008] In one embodiment, the at least one sensor is a LiDAR unit.
[009] In one embodiment, the at least one surface is a ground surface that is substantially horizontal.
[010] In one embodiment, the at least one surface is a falling object surface that is substantially vertical.
[011] In one embodiment, the falling object surface is located within a region of interest of the vehicle.
[012] In one embodiment, the plurality of objects are located above and below the falling object surface.
[013] In one embodiment, the at least one surface includes a ground surface and a falling object surface.
[014] In one embodiment, a deviation from vertical of the falling object surface is determined according to an acceleration of the vehicle.
[015] In one embodiment, a future location of the ground hazard is estimated from the location of the ground hazard tracked between the scan and a previous scan.
[016] In one embodiment, the physical property is selected from the set consisting of a shape property and a location property.
[017] In one embodiment, the physical property is selected from the set consisting of width, height, flatness, sharpness, distance to a bench, and location relative to the vehicle.
[018] In one embodiment, the method further comprises: determining a ground hazard rating for a region of a ground surface that vehicles can operate on, the ground hazard rating being determined using a size and a location for the hazard.
[019] In one embodiment, operation of the vehicle is modified based on the ground hazard rating.
[020] In one embodiment, vehicles operating in the region of the ground surface are selected based on the ground hazard rating.
[021] In one embodiment, the ground plane and the falling object plane are used concurrently to allow objects to be identified as an object selected from the set consisting of a falling object and an object located on the road.
[022] One embodiment includes a system for detecting ground hazards, the system comprising: at least one depth sensor for generating a three dimensional point cloud from a scan for a region around a vehicle, the point cloud being formed using distance information from at least one sensor mounted to the vehicle; and a processor coupled to the at least one depth sensor, wherein a method performed by the processor comprises: determining at least one surface in the point cloud; determining a plurality of objects in the scan data located on the at least one surface; and classifying at least one of the plurality of objects as a hazard according to a size of the at least one object.
[023] In one embodiment, the at least one surface is a falling object surface.
[024] In one embodiment, the falling object surface is located within a region of interest of the vehicle.
[025] In one embodiment, the plurality of objects are located above and below the falling object surface.
Brief Description of Figures
[026] At least one embodiment of the present invention is described, by way of example only, with reference to the accompanying figures.
[027] Figure 1 illustrates a functional block diagram of an example processing system that can be utilised to embody or give effect to a particular embodiment;
[028] Figure 2 illustrates an example network infrastructure that can be utilised to embody or give effect to a particular embodiment;
[029] Figures 3A and 3B illustrate placement of a range detection unit for ground hazard detection on a mining truck according to one embodiment;
[030] Figure 4 illustrates a ground hazard detection system according to one embodiment;
[031] Figure 5 illustrates a ground hazard detection process that may be performed by the ground hazard detection system of Figure 4;
[032] Figure 6 illustrates a ground hazard map that may be generated by the ground hazard detection system of Figure 4;
[033] Figure 7 illustrates a hazard size chart for ground hazards detected by the ground hazard detection system of Figure 4;
[034] Figure 8 illustrates a falling object plane for a mining truck according to one embodiment.
Detailed Description
[035] The following description, given by way of example only, is provided in order to give a more precise understanding of one or more of the embodiments. In the figures, like reference numerals are used to identify like parts throughout the figures.
[036] The disclosed ground hazard detection systems and ground hazard detection processes may be used to locate objects on a driving surface, such as a road, that may pose a risk to a vehicle, in particular a risk of damage to a tyre. The ground hazard detection uses a point cloud, sometimes stored as a depth image, produced by a LiDAR unit, to locate objects on the driving surface. The located objects may be classified as a ground hazard according to one or more physical properties of the object. Once classified as a ground hazard, location and physical
property information may be recorded, by the ground hazard detection system, for the object. The information can be reported by the ground hazard detection system to a remote server for review by a site manager, and/or used to notify a driver of the vehicle so that the driver can avoid the ground hazard.
[037] Disclosed is a method, and system for implementing the method, of detecting one or more ground hazards. The method uses a depth mapping unit and receives a scan from the unit for a region around a vehicle as a three dimensional point cloud, the point cloud being formed using distance information from at least one sensor mounted to the vehicle. From the points in the point cloud, at least one surface in the point cloud is estimated. A plurality of objects in the scan data, located on the at least one surface, is determined before at least one of the plurality of objects is classified as a ground hazard according to a physical property of the at least one object.
[038] A particular embodiment of the present invention can be realised using a processing system, an example of which is shown in Fig. 1. In particular, the processing system 100 generally includes at least one processor 102, or processing unit or plurality of processors, memory 104, at least one input device 106 and at least one output device 108, coupled together via a bus or group of buses 110. In certain embodiments, input device 106 and output device 108 could be the same device. An interface 112 can also be provided for coupling the processing system 100 to one or more peripheral devices, for example interface 112 could be a PCI card or PC card. At least one storage device 114 which houses at least one database 116 can also be provided. The memory 104 can be any form of memory device, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc. The processor 102 could include more than one distinct processing device, for example to handle different functions within the processing system 100.
[039] Input device 106 receives input data 118 and can include, for example, a keyboard, a pointer device such as a pen-like device or a mouse, audio receiving device for voice controlled activation such as a microphone, data receiver or antenna such as a modem or wireless data adaptor, data acquisition card, etc. Input data 118 could come from different sources, for example keyboard instructions in conjunction with data received via a network. Output device 108 produces or generates output data 120 and can include, for example, a display device or monitor in which case output data 120 is visual, a printer in which case output data 120 is printed, a port for example a USB port, a peripheral component adaptor, a data transmitter or
antenna such as a modem or wireless network adaptor, etc. Output data 120 could be distinct and derived from different output devices, for example a visual display on a monitor in conjunction with data transmitted to a network. A user could view data output, or an interpretation of the data output, on, for example, a monitor or using a printer. The storage device 114 can be any form of data or information storage means, for example, volatile or nonvolatile memory, solid state storage devices, magnetic devices, etc.
[040] In use, the processing system 100 is adapted to allow data or information to be stored in and/or retrieved from, via wired or wireless communication means, the at least one database 116. The interface 112 may allow wired and/or wireless communication between the processing unit 102 and peripheral components that may serve a specialised purpose. The processor 102 receives instructions as input data 118 via input device 106 and can display processed results or other output to a user by utilising output device 108. More than one input device 106 and/or output device 108 can be provided. It should be appreciated that the processing system 100 may be any form of terminal, server, specialised hardware, or the like.
[041] The processing system 100 may be a part of a networked communications system 200, as shown in Fig. 2. Processing system 100 could connect to network 202, for example the Internet or a WAN. Input data 118 and output data 120 could be communicated to other devices via network 202. Other terminals, for example, thin client 204, further processing systems 206 and 208, notebook computer 210, mainframe computer 212, PDA 214, pen-based computer or tablet 216, server 218, etc., can be connected to network 202. A large variety of other types of terminals or configurations could be utilised. The transfer of information and/or data over network 202 can be achieved using wired communications means 220 or wireless communications means 222. Server 218 can facilitate the transfer of data between network 202 and one or more databases 224. Server 218 and one or more databases 224 provide an example of an information source.
[042] Other networks may communicate with network 202. For example, telecommunications network 230 could facilitate the transfer of data between network 202 and mobile, cellular telephone or smartphone 232 or a PDA-type device 234, by utilising wireless communication means 236 and receiving/transmitting station 238. Satellite communications network 240 could communicate with satellite signal receiver 242 which receives data signals from satellite 244 which in turn is in remote communication with satellite signal transmitter 246. Terminals, for example further processing system 248, notebook computer 250 or satellite
telephone 252, can thereby communicate with network 202. A local network 260, which for example may be a private network, LAN, etc., may also be connected to network 202. For example, network 202 could be connected with Ethernet 262 which connects terminals 264, server 266 which controls the transfer of data to and/or from database 268, and printer 270. Various other types of networks could be utilised.
[043] The processing system 100 is adapted to communicate with other terminals, for example further processing systems 206, 208, by sending and receiving data, 118, 120, to and from the network 202, thereby facilitating possible communication with other components of the networked communications system 200.
[044] Thus, for example, the networks 202, 230, 240 may form part of, or be connected to, the Internet, in which case, the terminals 206, 212, 218, for example, may be web servers, Internet terminals or the like. The networks 202, 230, 240, 260 may be or form part of other communication networks, such as LAN, WAN, Ethernet, token ring, FDDI ring, star, etc., networks, or mobile telephone networks, such as GSM, CDMA, 4G, 5G etc., networks, and may be wholly or partially wired, including for example optical fibre, or wireless networks, depending on a particular implementation.
Figure 3A shows a rear view of a mining truck 300. The mining truck 300 is an example of a wheeled surface vehicle configured to use a ground hazard detection system. While Figure 3A shows the mining truck 300, other surface vehicles may also use the ground hazard detection system, such as passenger vehicles, farm vehicles, military vehicles, other types of mining vehicles or vehicles for non-commercial use. The ground hazard detection system is typically used for a wheeled vehicle to prevent or reduce tyre damage from ground hazards, such as sharp rocks. The mining vehicle has tyres 310 and a dump bed 320 for carrying material, such as ore. The mining truck 300 also has a range detection unit, such as a rear mounted LiDAR unit 330 located at the rear of the vehicle between the tyres 310. As positioned, the rear mounted LiDAR unit 330 has a suitable view of the road behind the mining truck while being protected from rocks and other material by an overhang of the dump bed 320.
Figure 3B shows a top view 350 of a mining truck 360. The mining truck 360 moves forward in a direction of travel 365. The mining truck 360 has four LiDAR units attached. Each LiDAR unit provides a scanning area, such as rear LiDAR scanning area 370, left side LiDAR scanning area 375, right side LiDAR scanning area 380 and front LiDAR scanning area 385. The LiDAR
scanning areas may have different shapes and coverage depending on the type of LiDAR used and position of the LiDAR units. In some examples, the LiDAR scanning areas overlap to provide 360 degree coverage around the mining truck 360.
[045] Figure 4 shows a ground hazard detection system 400, which is an example ground hazard detection system. The ground hazard detection system 400 may be configured using a computer such as the processing system 100 described above and communicate over a network, such as network 202. The ground hazard detection system may be a standalone unit that is added to a vehicle, such as a wheeled surface vehicle like a mining vehicle, or built into the mining vehicle by the manufacturer. The ground hazard detection system 400 may have one or more LiDAR units attached.
[046] The ground hazard detection system 400 communicates using a wireless communications unit 402, such as a 4G or 5G mobile phone network modem or WiFi modem. The wireless communications unit 402 may communicate with cloud services 404 and use encrypted communications, such as a virtual private network. The ground hazard detection system 400 can communicate with a remote User Interface (UI) 406 using the wireless communications unit 402, either directly or via the cloud services 404. The wireless communications unit 402 connects to a local network 408 on the vehicle, such as an Ethernet network, that provides communication between units of the ground hazard detection system 400. The local network 408 provides communication between a vehicle processing unit 420 and various devices, such as the wireless communications unit 402. A GPS unit 410, peripherals 412, LiDAR unit 414, inertial measurement unit 416 and a visible light camera 418 also connect to the vehicle processing unit 420, either using the local network 408 or through other communication means, such as a USB connection or I/O connections on the vehicle processing unit 420. Operating system processes 430 are executed by the vehicle processing unit 420 to provide functions such as operating system updates, remote access to the operating system and access to the virtual private network for communications with the cloud services 404.
[047] The vehicle processing unit 420 has three data stores: a GPS data store 470 for storing co-ordinates and speed data from the GPS unit 410, a sensor store 472 where data from the peripherals 412 and/or the inertial measurement unit 416 may be stored, and a recording data store 474 for storing data from the LiDAR unit 414 and/or the visible light camera 418. Any
data or recording stored can also include timestamp data to allow data in any of the data stores to be cross-referenced based on time.
[048] The vehicle processing unit 420 includes sub-units of a circuit condition monitoring service 422 and a robotic operating unit 424. In one embodiment of the ground hazard detection system 400, the circuit condition monitoring service 422 and the robotic operating unit 424 run on top of an operating system executed by the vehicle processing unit 420. For example, the circuit condition monitoring service 422 and the robotic operating unit 424 may operate as services on the operating system. In one example, the robotic operating unit 424 uses the Robot Operating System (ROS) from the Open Source Robotics Foundation, Inc. In another example, the circuit condition monitoring service 422 and the robotic operating unit 424 may be further divided and executed as a number of services.
[049] The circuit condition monitoring service 422 receives data from the GPS unit 410 using a GPS interface 440 that provides information to a GPS data logger 442, which stores the GPS data in the GPS data store 470. A peripheral interface 444 provides data from the peripherals 412 to a peripheral mapper 446, which may be implemented as an SDK (Software Development Kit) interface. The peripherals 412 collect information about a state of the vehicle, as vehicle state values, on which the ground hazard detection system 400 operates. In one example, the peripherals 412 connect to a truck interface to determine the vehicle state values for a state of the truck from inputs such as an ignition state (on or off), what gear the truck is in (such as reverse), accelerator data, brake data, and steering wheel data. The data from the peripherals 412 is mapped from the inputs to a corresponding field, such as ignition state, by the peripheral mapper 446. The vehicle state values are sent to the sensor store 472 by a sensor logger 454, where on-board sensors, external sensors, and signal information are aggregated. The circuit condition monitoring service 422 also receives input from the LiDAR unit 414 using a LiDAR interface 448. A LiDAR control and logger 450 stores data from the LiDAR unit 414 and can also send configuration information to the LiDAR unit 414 to configure operation of the unit. In one example, the LiDAR unit 414 is activated within an area of operation, configured as a geo-fenced area. The area of operation for the LiDAR unit 414 may be downloaded to the vehicle processing unit 420 using the wireless communications unit 402. The circuit condition monitoring service 422 activates and deactivates the LiDAR unit 414 using the area of operation and the current location of the vehicle from the GPS unit 410 so that the LiDAR is only operating in desired areas within the area of operation.
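A minimal sketch of the geo-fenced activation just described, assuming the area of operation is supplied as a polygon of local map coordinates; the function names and the ray-casting point-in-polygon test are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of geo-fenced LiDAR activation (illustrative only).
from typing import List, Tuple

def point_in_polygon(x: float, y: float, polygon: List[Tuple[float, float]]) -> bool:
    """Ray-casting test: is (x, y) inside the polygon given as a vertex list?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count crossings of a horizontal ray extending to the right of (x, y).
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

def lidar_should_be_active(position: Tuple[float, float],
                           area_of_operation: List[Tuple[float, float]]) -> bool:
    """True when the GPS-derived position is inside the geo-fenced area."""
    return point_in_polygon(position[0], position[1], area_of_operation)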
[050] The circuit condition monitoring service 422 also has a status web server 452 for internal system control and diagnosis. The status web server 452 may include data from the GPS unit 410, peripherals 412 and the LiDAR unit 414 and be accessed using a monitor attached to the vehicle processing unit 420. The status web server 452 may also display additional information provided from the robotic operating unit 424, such as inertial measurement unit 416 information and visible light camera 418 data. Similar information is also provided to a UI updating unit 432, where relevant data is supplied to generate a user interface dashboard that may be displayed on the local monitor or on the remote UI 406, which is located at a remote location, not on the vehicle.
[051] The robotic operating unit 424 receives data from the LiDAR unit 414 using the LiDAR interface 456. Data from the inertial measurement unit 416, such as three-dimensional acceleration information, roll, pitch and yaw, are received by an IMU (Inertial Measurement Unit) interface 458. Data from the visible light camera 418 is received by a visible light camera interface 460 to allow video data to be captured and stored in the recording data store 474. In one example, the LiDAR interface 456, the IMU interface 458 and the visible light camera interface 460 are implemented as ROS (Robot Operating System) interfaces.
[052] The robotic operating unit 424 also has a camera LiDAR projection unit 462 where the point cloud data, or frame, from the LiDAR unit 414 is combined with the camera view so that the LiDAR data can be overlaid on the video at a visualisation unit 464. The camera LiDAR projection unit 462 combines the output of a falling rock detection unit 466 and the ground hazard detection unit 468. The combined LiDAR output of the falling rock detection unit 466 and the ground hazard detection unit 468 is then combined with the video received from the visible light camera 418 at the visualisation unit 464. The combined LiDAR and camera images may be stored in the recording data store 474 and can be displayed to the driver. In one embodiment, the output of the visualisation unit 464 is shown as part of the user interface of the status web server 452 or the remote UI 406.
[053] The robotic operating unit 424 also includes the ground hazard detection unit 468 and the falling rock detection unit 466. Details regarding the operation of the falling rock detection unit 466 and the ground hazard detection unit 468 are described below in relation to Figure 5. The outputs of the falling rock detection unit 466 and the ground hazard detection unit 468 are the ground hazard location, physical properties, such as size and shape information, and an estimated future ground hazard location. This information is passed to a hazard assessment 434 where
danger posed to the vehicle by any ground hazards can be predicted. If a ground hazard is determined to be likely to contact and damage a part of the vehicle, such as a tyre, then a driver may be notified. If a ground hazard is detected and determined to miss the vehicle, but is still considered hazardous to a vehicle, then the location of the ground hazard may be recorded and reported using the remote UI 406. In this way, driver notification is provided for objects that are predicted to contact and damage part of the vehicle, typically tyres of the vehicle, while non-contacting hazardous objects are reported to a central server for marking on a site map.
[054] Figure 5 shows an example of a ground hazard detection process 500 that may be executed on the ground hazard detection system 400 described above in relation to Figure 4 as the ground hazard detection unit 468. The ground hazard detection process 500 may detect ground hazards already present on the ground or conduct fall detection to detect hazards falling to the ground. As will be described, a modified version of the ground hazard detection process 500 can be used in the falling rock detection unit 466 to detect rocks falling from the back of a truck, when a LiDAR unit is mounted to the rear of the truck, or to detect objects falling from the truck in other directions for side or front mounted LiDAR units, such as the LiDAR units positioned to provide scanning for the left side LiDAR scanning area 375, the right side LiDAR scanning area 380 or the front LiDAR scanning area 385.
[055] The ground hazard detection process 500 uses a three dimensional point cloud, or frame, provided from a LiDAR unit such as LiDAR unit 414, to locate objects on the ground near the vehicle. A ground surface is determined by fitting a surface to a subset of points in the point cloud. Points in the point cloud that are not located on the surface may be grouped with adjacent points to form objects. One or more physical properties of the object may be compared to a predetermined value of a physical property to determine if the object may be classified as a ground hazard. Alternatively, the physical properties may be used as input to a classifier, or machine learning algorithm, to determine if the object is a hazard. Suitable examples of classifiers include support vector machines, neural networks, or decision trees. This approach may locate, or detect, below-surface ground hazards that project below the surface, such as potholes or ditches on the road, as well as above surface ground hazards that project above the surface, such as rocks. Any ground hazards that are identified may have their position recorded along with a ground hazard severity value. The ground hazard severity value is a measure of how much damage the ground hazard may cause based on one or more physical properties of the hazard. In one example the ground hazard severity value is a measure between zero and ten
with a value of zero indicating a very low likelihood of damage occurring if a vehicle tyre comes into contact with the ground hazard, while a value of ten indicates a very high likelihood of damage occurring if a vehicle tyre comes into contact with the ground hazard.
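As one illustration of the classifier option mentioned above, the following sketch trains a decision tree on object shape properties. The feature set, training values and labels are invented placeholders; the patent does not specify a particular feature encoding or training data.

```python
# Illustrative sketch only: one of the classifier options named above.
# Feature values and labels are invented placeholders, not patent data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [width_m, height_m, flatness, sharpness]; label 1 = ground hazard.
X_train = np.array([
    [0.10, 0.05, 0.9, 0.1],   # small flat stone -> not a hazard
    [0.45, 0.30, 0.2, 0.8],   # large sharp rock -> hazard
    [0.30, 0.25, 0.3, 0.7],
    [0.05, 0.03, 0.8, 0.2],
])
y_train = np.array([0, 1, 1, 0])

clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

def classify_object(width, height, flatness, sharpness) -> bool:
    """True if the object's physical properties indicate a ground hazard."""
    return bool(clf.predict([[width, height, flatness, sharpness]])[0])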
[056] The ground hazard detection process 500 starts with a pass-through filter 510 where one or more filters may be applied to the three dimensional point cloud data captured by the LiDAR unit. Two types of filters may be used. The first filter is a sensor noise filter where noise from the LiDAR may be removed, such as energy or spatial noise. The second type of filter is an environment filter where the LiDAR readings are correct but represent returns that require removal. An example of an environment filter includes a low intensity filter to remove dust from the LiDAR data.
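A minimal sketch of the two filter types, assuming each frame arrives as an (N, 4) array of [x, y, z, intensity] points; the threshold values are illustrative assumptions.

```python
# Sketch of the pass-through filter stage under the assumptions stated above.
import numpy as np

def pass_through_filter(points: np.ndarray,
                        max_range: float = 30.0,
                        min_intensity: float = 0.05) -> np.ndarray:
    """Remove returns beyond the usable range and low-intensity dust returns."""
    ranges = np.linalg.norm(points[:, :3], axis=1)
    range_mask = ranges <= max_range           # sensor noise filter (spatial)
    dust_mask = points[:, 3] >= min_intensity  # environment filter (low intensity)
    return points[range_mask & dust_mask]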
[057] The filtered data from the pass-through filter 510 is then processed by a ground filter 520 where a ground plane, or ground surface, is estimated from the LiDAR data. In one example, the ground plane is estimated as a flat surface, while in another example the ground surface is estimated as a non-flat surface. Data points that are identified as being located on the ground plane are removed from the LiDAR data.
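The patent does not specify the surface-fitting algorithm; the sketch below uses RANSAC plane fitting, a common choice for estimating a flat ground plane from LiDAR points, as an assumed implementation of the flat-surface example.

```python
# Assumed RANSAC implementation of the flat ground plane estimate.
import numpy as np

def fit_ground_plane(points: np.ndarray, iters: int = 200, tol: float = 0.1):
    """Return (plane (a, b, c, d) with ax+by+cz+d=0, mask of ground points)."""
    rng = np.random.default_rng(0)
    best_plane, best_inliers = None, np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False), :3]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:        # degenerate (collinear) sample, try again
            continue
        normal /= norm
        d = -normal.dot(p1)
        dist = np.abs(points[:, :3] @ normal + d)
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_plane, best_inliers = (*normal, d), inliers
    return best_plane, best_inliers

# Points not on the plane are passed to clustering: non_ground = frame[~mask]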
[058] A separate instance of ground segmentation, potentially with a different set of parameters and/or algorithm, can be performed to separate accessible or drivable regions from inaccessible regions, such as dirt piles. In one example, large regions of heavily sloped terrain are designated as inaccessible regions and any data points in such areas are excluded from being identified as hazards in the following stages.
[059] At distance clustering 530 the LiDAR data is received with the data points located on the ground plane, or ground surface, removed. The distance clustering 530 applies a clustering algorithm, such as Euclidean clustering, to group together data points in the point cloud data to form an object. Each of the remaining data points from the frame is processed to see if it can be grouped to form a larger object, where the object may be above the ground plane or below the ground plane. Once an object is identified, object information, consisting of the data points in the frame with annotations linking the data to the object, typically using an object ID, is passed to rock filters 540. The rock filters 540 process the objects to classify them as a ground hazard by comparing at least one predetermined threshold to a physical property of the object. The objects can be processed to determine physical properties such as width, height, flatness, sharpness, distance to the bench when on a mine site, and location
relative to the vehicle. The width, height, flatness and sharpness may be grouped as object shape properties, while distance to the bench and location relative to the vehicle may be grouped as location properties. The location relative to the vehicle may be combined with GPS data to determine co-ordinates for each object. The rock filters 540 may compare size properties of the object, either height, width or a combination of the two, to a predetermined threshold to classify the object as a ground hazard. In one example, the physical property may be an approximated area of the object determined by multiplying the height and width. An alternative approximated area can be determined using a convex hull enclosing the object. Alternatively, the height may be used as a physical property, with any object over a predetermined height being classified as a ground hazard.
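A sketch of the clustering and size-based classification stages, using scikit-learn's DBSCAN as a stand-in for the Euclidean clustering named above (both group points by a distance threshold); the thresholds, the bounding-box width approximation and the returned fields are illustrative assumptions.

```python
# Sketch only: DBSCAN substituted for Euclidean clustering; thresholds assumed.
import numpy as np
from sklearn.cluster import DBSCAN

def find_ground_hazards(non_ground: np.ndarray,
                        cluster_dist: float = 0.3,
                        min_points: int = 5,
                        height_threshold: float = 0.25) -> list:
    """Cluster non-ground points into objects; keep those over a height threshold."""
    labels = DBSCAN(eps=cluster_dist,
                    min_samples=min_points).fit_predict(non_ground[:, :3])
    hazards = []
    for obj_id in set(labels) - {-1}:          # -1 marks unclustered noise
        obj = non_ground[labels == obj_id, :3]
        height = obj[:, 2].max() - obj[:, 2].min()
        # Bounding-box diagonal in the xy plane as an approximate width.
        width = np.linalg.norm(obj[:, :2].max(axis=0) - obj[:, :2].min(axis=0))
        if height > height_threshold:          # physical-property test
            hazards.append({"id": int(obj_id), "height": height,
                            "width": width, "centroid": obj.mean(axis=0)})
    return hazards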
[060] At contour tracking 550 a Kalman filter based contour tracking algorithm is applied to track ground hazards identified by the rock filters 540. In addition to the object data for the current frame, the contour tracking 550 uses previous frames, already processed by the ground hazard detection process 500, to track ground hazards as they move between frames so that the location of the ground hazard is tracked between the frame of a current scan and the frame of a previous scan. The motion of the ground hazards may occur as the vehicle moves. The contour tracking 550 may use one or more previous frames of LiDAR data, but typically uses a previous frame to determine where ground hazards detected in the previous frame are located in the current frame. In one embodiment, the contour tracking 550 may use two seconds of data, which is 20 frames when receiving 10 frames a second.
[061] At motion prediction 560 the motion of the ground hazards identified in the frame is estimated using the location of the ground hazards in the current and previous frames. That is, a future location of the ground hazards is estimated using a motion of the ground hazard, determined at the contour tracking 550. This may be important for tracking ground hazards as they move into sensor blind spots. The motion of the ground hazards can be estimated using a motion estimation technique, such as extrapolation, to estimate a future location of the ground hazards. The motion prediction 560 may use current and historical data from the sensor store 472 and in some embodiments may also include current and historical GPS data from the GPS data store 470. In one example, the direction of travel of the vehicle may be determined using the current selected gear as the vehicle state value.
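A minimal constant-velocity Kalman filter sketch in the spirit of the contour tracking and motion prediction stages above: each hazard's ground-plane position is tracked between frames and extrapolated to a future location, for example into a sensor blind spot. The patent does not specify the tracker's state model, so this formulation is an assumption.

```python
# Assumed constant-velocity Kalman tracker for one hazard; values illustrative.
import numpy as np

DT = 0.1  # 10 frames per second, as in the example above

# State [x, y, vx, vy]; constant-velocity transition, position-only measurement.
F = np.array([[1, 0, DT, 0], [0, 1, 0, DT], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = np.eye(4) * 0.01   # process noise (illustrative)
R = np.eye(2) * 0.05   # measurement noise (illustrative)

class HazardTrack:
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])
        self.P = np.eye(4)

    def predict(self):
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, measured_xy):
        y = np.asarray(measured_xy) - H @ self.x   # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

    def future_position(self, seconds_ahead: float):
        """Extrapolate the track, e.g. into a sensor blind spot."""
        return self.x[:2] + self.x[2:] * seconds_ahead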
[062] The ground hazard detection process 500 outputs information for detected ground hazards at hazard reporting 570. The information may include a current location, size, speed,
an object ID, shape information, and estimated future location information or a location region. Such information may be used by the hazard assessment 434.
[063] The above description of the ground hazard detection process 500 relates to operation of the ground hazard detection unit 468. The ground hazard detection process 500 may also be used as a process executed by the falling rock detection unit 466 by making the following changes. The falling rock detection is used for vehicles that are carrying material, such as a mining truck hauling ore. The purpose of the falling rock detection is to detect hazardous objects that fall off the vehicle and to identify and locate them as ground hazards once they land on the ground. Once the falling rocks are on the ground they can be treated as ground hazards as described above in relation to the ground hazard detection process 500.
[064] The main change to the ground hazard detection process 500 is the modification of the distance clustering 530, the rock filters 540 and the contour tracking 550. The ground plane, or ground surface, is substantially horizontal and extends away from the vehicle when detecting road ground hazards, and is the surface on which the ground hazards typically come to rest. For falling rock detection, the non-ground output from the ground filter 520 is transformed to coincide with a falling object plane, or surface. The falling object surface is typically substantially vertical and is a surface or plane along which the objects will fall to the ground surface. The falling object surface may represent a general trajectory of falling objects. The translation of the falling object plane may be considered a matter of convenience, serving as a datum for parameterising a three-dimensional region of interest for distance clustering 530. The region of interest, for example, may be defined as localised within a radius, or within a predetermined distance, of the truck body (vehicle), or may be extended to allow for detection of falling objects from other machinery, such as objects falling from load unit buckets. In one example, two or more regions of interest are used, with each of the regions being associated with a vehicle, such as the vehicle or an excavator. In one example, one or more regions of interest may be associated with equipment, such as a loading station. The change in orientation and/or reference frame allows the contour tracking 550 to be applied to the tracking of objects in the vertical or substantially vertical direction. The falling rock detection may detect objects above and below the falling object surface when the point cloud data for the object falls within a clustering region-of-interest at distance clustering 530.
[065] The rock filters 540 may use customised filters with modified parameters to distinguish potentially hazardous objects from non-hazardous small objects and dust falling from other vehicles or equipment, such as the bench in a mining environment, and loading units. Paths, or trajectories, of objects tracked by the contour tracking 550 are used to characterise whether detected objects exhibit an expected falling object trajectory. A falling object that is characterised as hazardous is treated as a ground hazard, and the ground hazard detection process 500 continues. In one example, when the falling object plane is linked to the vehicle, the shape of the falling object plane may be modified based on the acceleration of the vehicle, with higher rates of acceleration implying a larger deviation from vertical for the falling object plane, or surface. When the falling object plane is linked to the vehicle and the vehicle is stationary, the falling object plane may be substantially vertical. The falling object plane may be linked to the vehicle when the falling object plane is used to detect objects falling off the vehicle, such as falling off the back of a dump truck. As the vehicle starts to accelerate, the deviation of the falling object plane from vertical increases. The direction of the change will depend on where the falling object plane is, relative to the vehicle, and the direction of acceleration of the vehicle. A falling object plane at the front of the vehicle will have the base of the plane, where the plane meets the ground, move towards the vehicle as the vehicle accelerates forwards, while the top of the falling object plane will either not move or move only a small amount, relative to the vehicle. A falling object plane at the rear of the vehicle will have the base of the falling object plane move away from the rear of the vehicle when the vehicle accelerates forwards, while the top of the falling object plane will either not move or move only a small amount, relative to the vehicle. The deviation of the front and rear falling object planes will be in the opposite direction when the vehicle accelerates backwards. Typically, the base of the falling object plane moves in a direction opposite to the direction of acceleration. An example of how the falling object plane moves is shown in relation to a mining truck 800 of Figure 8. The mining truck 800 has a cargo bed 820 where material, such as ore, is transported. When the mining truck 800 is stationary a falling object plane 860 may be detected. When the mining truck 800 accelerates forward, in the direction of arrow 830, the falling object plane will deviate from vertical, as shown by accelerating falling object plane 850. The falling object planes 860 and 850 are linked to the mining truck 800 as falling objects fall off the cargo bed 820.
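One way to realise the acceleration-dependent deviation described above is to tilt the plane by the apparent trajectory angle arctan(a/g) seen in the vehicle frame; the patent describes the behaviour only qualitatively, so this relationship is an assumption.

```python
# Assumed tilt model: apparent fall direction in the vehicle frame.
import math

G = 9.81  # gravitational acceleration, m/s^2

def falling_plane_tilt(vehicle_accel: float) -> float:
    """Deviation from vertical (radians) of the falling object plane.

    Higher vehicle acceleration gives a larger deviation, consistent with
    the qualitative behaviour described above; the base of the plane moves
    opposite to the direction of acceleration.
    """
    return math.atan2(abs(vehicle_accel), G)

# Example: at 2 m/s^2 the plane deviates about 11.5 degrees from vertical.
# math.degrees(falling_plane_tilt(2.0)) -> 11.52...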
[066] In one embodiment, trajectory filtering may be applied by approximating the parabolic trajectory of falling objects as linear over short distances. The approximation may incorporate
estimates of vehicle dynamics, such as acceleration, as well as vehicle state, such as a pose of the vehicle that may include one or more of roll, pitch and yaw.
[067] In one embodiment, the ground plane is also determined, in addition to the falling object plane, and any readings closer than the ground plane are considered to belong to falling objects. Once the falling object plane, or surface, is identified, the ground hazard detection process 500 continues. In this case, the ground plane and the falling object plane are used concurrently to allow objects to be identified as a falling object or an object located on the road. In one example where hazard detection operates for falling objects and ground objects, a falling object is located on the falling object plane and, once the object lands, transferred to the ground plane.
[068] In one embodiment, the falling rock detection implementation of the ground hazard detection process 500 may also skip the motion prediction 560. Motion of ground hazards detected in the point cloud can be ignored as the ground hazard falls to the ground. Once on the ground, the ground hazard will be identified and tracked as a ground hazard by the ground hazard detection unit 468. The ground hazard may be tracked by the falling rock detection unit 466 and then object information, such as object physical properties and an object ID, is sent to the ground hazard detection unit 468 so that the new object can be tracked on the ground.
[069] Figure 6 shows a heat map showing a ground hazard map 600 as detected by the ground hazard detection system. The ground hazard map 600 may form part of the remote UI 406 of the ground hazard detection system 400. The ground hazard map 600 shows a satellite image 680 of a mine site. Locations of ground hazards are placed on the map, with a density of the ground hazards being represented by colours on the ground hazard map 600. The density of ground hazards may be considered a ground hazard rating for a region of a ground surface that vehicles can operate on, the ground hazard rating being determined using the size and a location for each hazard. A high ground hazard zone 610 is shown in a region where 290 ground hazards have been detected. A moderate ground hazard zone 620 is shown with 39 ground hazards detected in the region. A low ground hazard zone 630 is also shown, without a display of a ground hazard count.
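A sketch of how hazard detections might be binned into a grid to produce the density-based ground hazard rating behind a map like Figure 6; the cell size and rating bands are illustrative assumptions, not values from the patent.

```python
# Sketch of a hazard density grid and zone rating; all thresholds assumed.
import numpy as np

def hazard_density_grid(hazard_xy: np.ndarray, extent, cell_size: float = 25.0):
    """Count hazards per grid cell over an (xmin, xmax, ymin, ymax) site extent."""
    xmin, xmax, ymin, ymax = extent
    xbins = np.arange(xmin, xmax + cell_size, cell_size)
    ybins = np.arange(ymin, ymax + cell_size, cell_size)
    counts, _, _ = np.histogram2d(hazard_xy[:, 0], hazard_xy[:, 1],
                                  bins=[xbins, ybins])
    return counts

def hazard_rating(count: float) -> str:
    """Map a per-cell hazard count to a zone rating (bands are illustrative)."""
    if count >= 100:
        return "high"
    if count >= 20:
        return "moderate"
    return "low"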
[070] The ground hazard map 600 also has ground hazard playback controls 640. The ground hazard playback controls 640 work in a similar manner to video playback controls. However, during the playback, the satellite image 680 may remain the same while the ground hazard zones are shown varying over time. The ground hazard playback controls 640 include a playback speed 650 where the number of frames per second can be varied. A playback timeline 660 shows a relative position of the current display of the ground hazard map 600, with the leftmost position of the playback timeline 660 being the oldest time of the playback and the rightmost position showing a most recent state of ground hazards. A user may move the playback timeline 660 to change the time of the ground hazard state displayed. A current playback time 665 shows a timestamp for the ground hazard state currently displayed. The playback may also be manipulated using the playback controls 670 that can include play, pause, increase playback speed, decrease playback speed, move to start, move to end, and select a new time and date for the ground hazard state.
[071] The ground hazard map 600 and ground hazard rating may be combined with site management policies to vary vehicle operation, such as speed limits or vehicle payloads, for regions of operation on a site. The speed limit may be lowered based on a ground hazard density. In one example, a maximum speed of vehicles may be reduced for any vehicles in a high ground hazard zone, such as the high ground hazard zone 610. The reduced maximum speed may be set using signs or may be transmitted to vehicles where a speed limiter is applied within a geo-fenced area. In one example, a high ground hazard zone is determined by a range of the ground hazard density, measured as ground hazards per square meter, with ranges set based on the density measure so that a medium ground hazard zone has a lower ground hazard density than the high ground hazard zone and a low ground hazard zone has an even lower ground hazard density.
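A sketch of how a site policy might map the zone rating above to a transmitted speed limit; the limits and the zone lookup are invented for illustration and are not taken from the patent.

```python
# Illustrative site policy table; limits and lookup are assumptions.
ZONE_SPEED_LIMITS_KMH = {"high": 20, "moderate": 40, "low": 60}

def speed_limit_for_position(position, zone_lookup) -> int:
    """Return the geo-fenced speed limit for a vehicle at `position`.

    `zone_lookup` is assumed to map a position to its ground hazard zone
    rating ("high", "moderate" or "low"), e.g. via the density grid above.
    """
    return ZONE_SPEED_LIMITS_KMH[zone_lookup(position)]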
[072] A hazard size distribution 700 will now be described in relation to Figure 7. The hazard size distribution 700 may be produced by the ground hazard detection system 400, executing the ground hazard detection process 500 as described above. The hazard size distribution 700 may form part of the user interface for the ground hazard detection system or may be produced from data collected by the ground hazard detection system 400.
[073] The hazard size distribution 700 shows a size 750 of the hazard on the x-axis and a count 760 of hazards on the y-axis. A smoothed hazard height distribution 710 is shown on top of a hazard height instance count 720 histogram. A smoothed hazard width distribution 730 is also shown on a hazard width instance count 740 histogram.
[074] The hazard size distribution 700 may be used to determine parameters for the rock filters 540 described above in relation to the ground hazard detection process 500 of Figure 5. In one example, the minimum height for an object to be classified as a ground hazard may be set to be greater than the average height from the hazard size distribution 700.
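Such a filter parameter could be derived from the distribution as in this sketch; the margin parameter and sample values are assumptions for illustration:

```python
import statistics

def min_height_threshold(observed_heights_m, margin_m=0.0):
    """Set the rock filter's minimum height just above the average
    height observed in the hazard size distribution."""
    return statistics.mean(observed_heights_m) + margin_m

# Illustrative sample of observed hazard heights, in metres.
heights = [0.08, 0.12, 0.15, 0.22, 0.30]
threshold = min_height_threshold(heights)
candidates = [0.05, 0.18, 0.40]
hazards = [h for h in candidates if h > threshold]  # keep only larger objects
```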
Variations
[075] The ground hazard detection system described above uses a LiDAR unit, such as the LiDAR unit 414 of the ground hazard detection system 400. LiDAR is an example of an active sensor, as the units supply their own illumination. Alternative detection systems may be used to detect ground hazards near a vehicle and may be active or passive sensors. One alternative is a passive sensor, such as stereo imaging cameras, or other depth sensing cameras capable of producing a suitable point cloud. Systems that are able to meet minimum requirements may be used in place of a LiDAR unit. An example of such minimum requirements may be a depth sensing device with an angular resolution of at least 0.2 degrees, a minimum frame rate of 5 Hz, a minimum range of 30 meters and a field of view wide enough to cover the rear of a vehicle.
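The minimum requirements quoted above could be checked against a candidate sensor as in this sketch; the rear field-of-view figure used here is an assumption, since the text only requires coverage of the rear of the vehicle:

```python
from dataclasses import dataclass

@dataclass
class DepthSensorSpec:
    angular_resolution_deg: float  # degrees between samples (smaller is finer)
    frame_rate_hz: float
    range_m: float
    rear_fov_deg: float

def meets_minimum_requirements(spec: DepthSensorSpec,
                               required_rear_fov_deg: float = 120.0) -> bool:
    """True if the sensor satisfies the stated minimums: 0.2 degree angular
    resolution, 5 Hz frame rate, 30 m range and rear coverage."""
    return (spec.angular_resolution_deg <= 0.2
            and spec.frame_rate_hz >= 5.0
            and spec.range_m >= 30.0
            and spec.rear_fov_deg >= required_rear_fov_deg)
```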
[076] The use of a visible light camera, such as the visible light camera 418, is optional and is not required for the ground hazard detection. The use of a camera can allow a user-friendly view of identified ground hazards to be presented to users of the system.
[077] While a local status user interface, such as the status web server 452 or the remote UI 406, has been described, a warning system may be provided to a driver of the vehicle. In one embodiment, the visualisation unit 464 output of the object data overlaid on the camera image may be presented to the driver as a form of reversing camera. Such a reversing camera allows the driver to have ground hazards identified when reversing the vehicle. An alternative, or additional, user interface may be an audible warning system that sounds when a ground hazard is approaching the vehicle.
[078] The above description refers to use of the ground hazard detection system and the ground hazard detection process on a vehicle such as a mining truck. The system may also be mounted on other vehicles and objects including an unmanned aerial vehicle, a static pole, other mining equipment such as an excavator, a bulldozer or a load haul dump, or autonomous vehicles or robots. While the falling rock detection unit 466 is described with the example of material falling from a truck, material may be detected falling from other vehicles as well. One example is material falling from the shovel bucket of an excavator or the bucket of a load haul dump.
[079] The ground hazard detection process 500 described above may also have additional steps for determining if a ground hazard that risks damaging a tyre will come into contact with a tyre of the vehicle. The current direction of motion of the vehicle can be determined from vehicle inputs, such as steering wheel position and direction of travel from the gear selection, as well as GPS information, and/or by successive LiDAR scan registrations using SLAM (Simultaneous Localisation and Mapping), potentially augmented by IMU (Inertial Measurement Unit) data. A path may be determined for each of the plurality of tyres of the vehicle, or in some embodiments a path may be determined for the left and/or the right side tyres. Next, a determination is made as to whether a ground hazard is on the path of any of the tyres of the vehicle. The driver of the vehicle may be notified if a ground hazard is on the path for one or more of the tyres. In one embodiment, the vehicle may be automatically stopped to prevent damage to the tyres. This process may be advantageous for vehicles where the tyre paths are much narrower than the width of the vehicle.
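One possible form of the tyre-path check is sketched below, simplified to a straight-line projection from position and heading (the text also allows steering, GPS, SLAM and IMU inputs); the dimensions and function names are illustrative assumptions:

```python
import math

def tyre_paths(position, heading_rad, track_width_m, lookahead_m, step_m=0.5):
    """Project straight-line paths for the left and right tyres from the
    vehicle position and heading (a simplification of the motion model)."""
    dx, dy = math.cos(heading_rad), math.sin(heading_rad)
    # Perpendicular offset from the centreline to each side's tyre.
    ox, oy = -dy * track_width_m / 2, dx * track_width_m / 2
    paths = {"left": [], "right": []}
    d = 0.0
    while d <= lookahead_m:
        cx, cy = position[0] + dx * d, position[1] + dy * d
        paths["left"].append((cx + ox, cy + oy))
        paths["right"].append((cx - ox, cy - oy))
        d += step_m
    return paths

def hazards_on_path(paths, hazards, tyre_width_m=1.0):
    """Return hazards lying within half a tyre width of either tyre path."""
    hits = []
    for hx, hy in hazards:
        for side, pts in paths.items():
            if any(math.hypot(hx - px, hy - py) <= tyre_width_m / 2
                   for px, py in pts):
                hits.append((side, (hx, hy)))
                break
    return hits
```

Any hit returned by `hazards_on_path` could then trigger the driver notification or automatic stop described above.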
[080] While the ground hazard detection system 400 has been described using the vehicle processing unit 420, the operation of the vehicle processing unit 420 may be distributed over one or more computers.
Advantages and Interpretations
[081] The disclosed ground hazard detection system and processes allow for the identification of ground hazards that could cause tyre damage and reduce the availability and utility of tyred vehicles. Detection of ground hazards is particularly useful in harsh environments such as mining, earthmoving, construction or industrial sites. By identifying ground hazards, action may be taken by individual drivers, or operators of a site, to modify driver behaviour or effect repair of a road surface. Maintenance of vehicles may also be scheduled based on information about the surface that the vehicles have been travelling on. In one example, vehicles may be scheduled for operation based on a quality of the tyres of the vehicle and the ground hazards present on the ground the vehicles will travel on. In this example, vehicles with tyres in good condition may be used on surfaces with one or more high ground hazard zones, while vehicles with tyres in poorer condition may be used on surfaces with moderate, or even only low, ground hazard zones. The condition of the tyres may be determined by the distance travelled on each tyre or, alternatively or in combination, based on tyre condition from maintenance reports. In such situations, vehicle scheduling is based on a ground hazard rating for the surface the vehicle will operate on. The detection of objects falling from a vehicle, using falling rock detection, also allows identification and tracking of new ground hazards as they fall to the road surface.
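Such a scheduling policy might be expressed as in this sketch, where the condition labels and the allowed pairings are assumptions for illustration:

```python
# Zones each tyre condition is permitted to operate in (illustrative policy).
ALLOWED_ZONES = {
    "good": {"low", "moderate", "high"},
    "poor": {"low", "moderate"},
}

def may_schedule(tyre_condition: str, surface_rating: str) -> bool:
    """True if a vehicle with the given tyre condition may be scheduled onto
    a surface whose worst ground hazard zone has `surface_rating`."""
    return surface_rating in ALLOWED_ZONES.get(tyre_condition, {"low"})
```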
[082] The ability to identify a location of ground hazards may allow drivers, or site operators, to identify high and low risk areas of a site. Operation of vehicles may then be adjusted to reduce possible negative impacts from ground hazards, such as reducing vehicle speeds or sending out work crews to remove the hazards.
[083] Falling rock detection may also be used to identify actions of drivers or operators of loading vehicles that produce an increase in ground hazards. For example, it may be that loading of mining trucks beyond a certain weight of material dramatically increases the number of rocks falling from the vehicle. In another example, the number of ground hazards falling from a mining truck may be higher in one region due to the nature of the material being loaded. In such circumstances, the frequency of maintenance may be increased to remove the ground hazards.
[084] Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
Claims
The claims defining the invention are as follows:
1. A method of detecting a ground hazard for a vehicle, the method comprising:
    receiving a three dimensional point cloud from a scan for a region around the vehicle, the point cloud being formed using distance information from at least one sensor mounted to the vehicle;
    estimating at least one surface in the point cloud;
    determining a plurality of objects in the scan data located on the at least one surface; and
    classifying at least one of the plurality of objects as a ground hazard according to a physical property of the at least one object, the ground hazard being classified as a ground hazard because of a risk of damage to a tyre.
2. The method according to claim 1, wherein the at least one sensor is a LiDAR unit.
3. The method according to either of claim 1 or 2, wherein the at least one surface is a ground surface that is substantially horizontal.
4. The method according to either of claim 1 or 2, wherein the at least one surface is a falling object surface that is substantially vertical.
5. The method according to claim 4, wherein the falling object surface is located within a region of interest of the vehicle.
6. The method according to claim 4, wherein the plurality of objects are located above and below the falling object surface.
7. The method according to either of claim 1 or 2, wherein the at least one surface includes a ground surface and a falling object surface.
8. The method according to any one of claims 4 to 7, wherein a deviation from vertical of the falling object surface is determined according to an acceleration of the vehicle.
9. The method according to any one of claims 1 to 8, wherein a future location of the ground hazards is estimated by a location of the ground hazard tracked between the scan and a previous scan.
10. The method according to any one of claims 1 to 9, wherein the physical property is selected from the set consisting of a shape property and a location property.
11. The method according to any one of claims 1 to 10, wherein the physical property is selected from the set consisting of width, height, flatness, sharpness, distance to a bench, and location relative to the vehicle.
12. The method according to claim 1, further comprising:
    determining a ground hazard rating for a region of a ground surface that vehicles can operate on, the ground hazard rating being determined using a size and a location for the hazard.
13. The method according to claim 12, wherein operation of the vehicle is modified based on the ground hazard rating.
14. The method according to claim 12, wherein a vehicle selected for operating in the region of the ground surface is selected based on the ground hazard rating.
15. The method according to claim 7, wherein the ground plane and falling object planes are used concurrently to allow objects to be identified as an object selected from the set consisting of a falling object and an object located on the road.
17. A system for detecting ground hazards, the system comprising:
    at least one depth sensor for generating a three dimensional point cloud from a scan for a region around a vehicle, the point cloud being formed using distance information from at least one sensor mounted to the vehicle; and
    a processor coupled to the at least one depth sensor, wherein a method performed by the processor comprises:
        determining at least one surface in the point cloud;
        determining a plurality of objects in the scan data located on the at least one surface; and
        classifying at least one of the plurality of objects as a hazard according to a size of the at least one object.
18. The system according to claim 17, wherein the at least one surface is a falling object surface.
19. The system according to claim 18, wherein the falling object surface is located within a region of interest of the vehicle.
20. The system according to claim 18, wherein the plurality of objects are located above and below the falling object surface.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2023900428A0 (en) | 2023-02-20 | | Ground Hazard Detection |
| AU2023900428 | 2023-02-20 | | |
| PCT/AU2024/050126 WO2024173984A1 (en) | 2023-02-20 | 2024-02-20 | Ground hazard detection |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| AU2024225595A1 (en) | 2025-09-11 |
Family
ID=92499978
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2024225595A (Pending) AU2024225595A1 (en) | Ground hazard detection | 2023-02-20 | 2024-02-20 |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2024225595A1 (en) |
| WO (1) | WO2024173984A1 (en) |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8564657B2 (en) * | 2009-05-29 | 2013-10-22 | Honda Research Institute Europe Gmbh | Object motion detection system based on combining 3D warping techniques and a proper object motion detection |
| US9711050B2 (en) * | 2015-06-05 | 2017-07-18 | Bao Tran | Smart vehicle |
2024
- 2024-02-20 WO PCT/AU2024/050126 patent/WO2024173984A1/en not_active Ceased
- 2024-02-20 AU AU2024225595A patent/AU2024225595A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024173984A1 (en) | 2024-08-29 |