WO2024186814A2 - Systems and methods for adjusting a driving path using occluded regions - Google Patents
- Publication number
- WO2024186814A2 (PCT/US2024/018516)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- occlusion
- map
- vehicle
- objects
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0011—Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Definitions
- Embodiments of the present disclosure relate to object tracking and, in particular, to tracking objects through occluded regions.
- Self-driving or otherwise autonomous vehicles (AVs) require the ability to detect one or more objects, obstacles, and/or road surface conditions within an environment of the AV.
- In order to safely navigate, the objects not only need to be detected, but also tracked, in order to maintain knowledge of the position of the moving objects.
- Occlusion occurs when there is a track on an object, but the vehicle then loses sight of the track at some point. Eventually, as a vehicle moves, many objects may become occluded. For example, objects may drop behind another object, move into or out of the AV’s field of view, or be lost around a bend, etc. Occlusion can also be caused by the vehicle itself. For example, if the vehicle cuts a corner, it may not be possible to see objects low on the ground on the corner. In these instances, the vehicle itself may occlude objects that were previously tracked. When an object is occluded, the object does not disappear. This means that the object could still pose a risk to the vehicle in the future. Therefore, for at least these reasons, systems and methods for tracking objects through occluded regions are needed to safely and efficiently navigate an environment and prevent possible collisions.
- a method to adjust a driving path in a planner-map may comprise: identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; detecting one or more occlusion areas from sensor data from the one or more sensors; tracking one or more tracks of one or more objects within the one or more occlusion areas; adding uncertainty to the one or more tracks within the one or more occlusion areas; and adjusting the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
- the uncertainty may be managed in different ways.
- the uncertainty of the one or more tracks is based on a time elapse of an object corresponding to the track that is within the one or more occlusion areas.
- the uncertainty of the one or more tracks increases as time elapses.
- the uncertainty is based on a size of the one or more occlusion areas.
- the uncertainty increases with the size of the one or more occlusion areas.
- the method may also include adjusting the path based on the one or more tracks within the one or more occlusion areas.
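The embodiments above combine two drivers of uncertainty: time elapsed in occlusion and the size of the occlusion area. The following is a minimal sketch of one way to combine them; the class and function names, growth constants, and the waypoint-pruning shortcut are illustrative assumptions, not the claimed implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class OccludedTrack:
    last_seen_xy: tuple       # last detected (x, y) position in meters
    seconds_occluded: float   # time since the track entered the occlusion area
    occlusion_area_m2: float  # size of the occlusion area containing the track

def track_uncertainty_radius(track, growth_mps=1.5, area_factor=0.05):
    # Uncertainty grows with elapsed time in occlusion and with the size of
    # the occlusion area, per the embodiments above; both rates are assumed.
    return (growth_mps * track.seconds_occluded
            + area_factor * math.sqrt(track.occlusion_area_m2))

def adjust_path(path_xy, tracks, clearance_m=1.0):
    # Keep only waypoints outside every track's inflated uncertainty region;
    # a real planner would re-route around such regions rather than prune.
    def safe(p):
        return all(math.dist(p, t.last_seen_xy)
                   > track_uncertainty_radius(t) + clearance_m
                   for t in tracks)
    return [p for p in path_xy if safe(p)]
```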
- a system may adjust a path, including: a planner-map configured to identify a driving surface and determine a path on the driving surface; one or more sensors configured to generate sensor data to identify one or more objects relative to the driving surface; an occlusion publisher configured to detect one or more occlusion areas from the sensor data from the one or more sensors; a tracker configured to track one or more tracks of one or more objects within the one or more occlusion areas, wherein the planner-map is configured to add uncertainty to the one or more tracks of one or more objects within the one or more occlusion areas, and adjust the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
- the system may include any combination of additional components and functions.
- the planner-map is configured to add the uncertainty of the one or more tracks based on a time elapse of an object corresponding to the track that is within the one or more occlusion areas.
- the planner-map is configured to increase the uncertainty of the one or more tracks as time elapses.
- the planner-map is configured to add the uncertainty based on a size of the one or more occlusion areas.
- the planner-map is configured to increase the uncertainty with the size of the one or more occlusion areas.
- the planner-map is configured to adjust the path based on the one or more tracks within the one or more occlusion areas.
- the planner-map, occlusion publisher, and tracker comprise one or more processors having non-transitory machine-readable instructions stored in one or more memories that, when executed by the one or more processors, cause the one or more processors to perform the described functions.
- a method may adjust a driving path in a planner-map, including identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; determining one or more occlusion areas created by the one or more objects; adding uncertainty of a potential additional object within the one or more occlusion areas; and adjusting the path on the driving surface based on the uncertainty of the potential additional object within the one or more occlusion areas.
- a method to remove tracks from a tracker when the tracks are within an occluded region may comprise: defining a map of a driving area; defining an occlusion area within the map; detecting an object using sensor data from one or more sensors; determining that the object entered the occlusion area; creating an estimated object location for the object within the occlusion area; dropping the estimated object location for the object when the occlusion area is cleared.
- the occlusion area may be managed in different ways. For example, the occlusion area is cleared when an occluding object creating the occlusion area is no longer occluding the one or more sensors. For example, the occlusion area is cleared when the object is detected by the one or more sensors after leaving the occlusion area.
- the method may also include other features or steps, or combinations thereof.
- the method may include increasing a precision of observation for objects leaving the occlusion area.
- the method may include increasing a precision of observation for objects entering the occlusion area.
- the method may include using the increased precision of observation for objects entering and leaving the occlusion area to confirm that the object entering the occlusion area is the same as the object leaving the occlusion area before dropping the estimated object location for the object when the occlusion area is cleared.
- the method may include updating the map of the driving area to include estimates of the driving area within the occlusion area.
- the method may include estimates of the driving area comprising an extension of lane detections.
- the method may include periodically updating the estimated object location to an updated estimated object location for the object while the object remains in the occlusion area.
- the method may include updating the updated estimated object location by increasing its area as time elapses from when the object entered the occlusion area.
- the method may include updating the updated estimated object location by increasing its size to fill the occlusion area.
- the method may include creating a keep alive duration for the object entering the occlusion area configured to retain a tracking of the object within the occlusion area for the keep alive duration.
- the method may include adjusting the keep alive duration based on an estimated trajectory of the object in the occlusion area, and a time elapsed from when the object entered the occluded area.
- the method may include dropping the estimated object location for the object when the occlusion area is cleared by updating the keep alive duration to zero.
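A minimal sketch of the keep alive bookkeeping described above, under assumed field names and an assumed default duration (the description elsewhere mentions, e.g., 1 second); dropping the estimated location when the occlusion area is cleared is expressed as setting the keep alive duration to zero, as in the embodiment just above.

```python
from dataclasses import dataclass

@dataclass
class EstimatedObject:
    track_id: int
    est_xy: tuple              # estimated location within the occlusion area
    keep_alive_s: float = 1.0  # assumed default keep alive duration; tunable

def on_occlusion_cleared(objects: dict, cleared_track_ids: set) -> None:
    # Drop the estimated object location by zeroing its keep alive duration.
    for tid in cleared_track_ids:
        if tid in objects:
            objects[tid].keep_alive_s = 0.0

def tick(objects: dict, dt_s: float) -> None:
    # Age every estimate; remove estimates whose keep alive has expired.
    expired = [t for t, o in objects.items() if o.keep_alive_s - dt_s <= 0.0]
    for tid in expired:
        del objects[tid]
    for obj in objects.values():
        obj.keep_alive_s -= dt_s
```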
- a method may manage an occluded region, including defining a map of a driving area; defining an occlusion area within the map; detecting an object using sensor data from one or more sensors; determining that the object entered the occlusion area; creating an estimated object location for the object within the occlusion area; dropping the estimated object location for the object when the occlusion area is cleared.
- a system may remove tracks from a tracker when the tracks are within an occluded region, including a planner-map configured to identify a driving surface and determine a path on the driving surface; one or more sensors configured to generate sensor data to identify one or more objects relative to the driving surface; an occlusion publisher configured to detect one or more occlusion areas from the sensor data from the one or more sensors; a tracker configured to determine that the object entered the occlusion area; wherein the planner-map is configured to create an estimated object location for the object within the occlusion area, and drop the estimated object location for the object when the occlusion area is cleared.
- the planner-map may be configured in different combinations.
- the planner-map may be configured to clear the occlusion area when an occluding object creating the occlusion area is no longer occluding the one or more sensors.
- the planner-map may be configured to clear the occlusion area when the object is detected by the one or more sensors after leaving the occlusion area.
- the planner-map, occlusion publisher, and tracker comprise one or more processors having non-transitory machine-readable instructions stored in one or more memories that, when executed by the one or more processors, cause the one or more processors to perform the described functions.
- a method to maintain tracking of an object that passes into an occluded area may comprise: defining a map of a driving area; defining one or more occlusion areas within the map; detecting an object using sensor data from one or more sensors; creating an object track for the object detected using sensor data; determining that the object track entered one of the one or more occlusion areas; maintaining the object track while the object track remains in the one or more occlusion areas.
- the method may include different combinations of additional features or steps.
- the object track may be maintained while the object remains undetected by the one or more sensors.
- the method may include dropping the object track when the one of the one or more occlusion areas is cleared.
- the method may include dropping the object track when the object exits the one of the one or more occlusion areas by detecting the object with the one or more sensors.
- the method may include dropping the object track when a probability that the object is no longer in the one or more occlusion areas has surpassed a threshold.
- the method may include updating a probability location of the object while the object remains undetected by the one or more sensors and within the one or more occlusion areas.
- the method may include a certainty of the probability location that manifests as an area of probability in which the object may be within the one or more occlusion areas along the object track.
- the method may include the probability location being determined with less certainty the longer the object remains in the one or more occlusion areas.
- the lower certainty results in a larger area of probability associated with the object along the object track.
- An exemplary embodiment may include a method to maintain tracking of an object that passes into an occluded area, including defining a map of a driving area having a driving surface including one or more drive lanes for a vehicle to traverse; detecting an object, separate from the vehicle, using sensor data from one or more sensors; defining one or more occlusion areas within the map, wherein an occlusion area is an area that is obstructed by another object, and/or is in an area not detectable by the one or more sensors, and/or is within a set proximity to the vehicle; determining that the object track entered one of the one or more occlusion areas; creating an object track for the object detected using sensor data, wherein the object track is determined from data received from the one or more sensors before the object enters the one or more occlusion areas and is configured to provide a predicted location as the object remains within the one or more occlusion areas; determining a predicted location of the object within the one or more occlusion areas, wherein the predicted location comprises
- the method may also include driving the vehicle and dropping the object track when the occlusion region is cleared.
- a system to maintain tracking of an object that passes into an occluded area may include a detection system configured to define a map of a driving area; one or more sensors configured to generate sensor data; a detector configured to detect an object from the sensor data; an occlusion publisher configured to define one or more occlusion areas within the map; and a tracker configured to create an object track for the object detected by the detector and configured to maintain the object track when the object is determined to enter the one or more occlusion areas.
- the system may include any combination of features.
- the system may include a tracker configured to drop the object track when the occlusion region is cleared.
- the system may include a tracker configured to drop the object track when the object exits the one of the one or more occlusion areas by detecting the object with the one or more sensors.
- the system may include a tracker configured to drop the object track when a probability that the object is no longer in the one or more occlusion areas has surpassed a threshold.
- the system may include a tracker configured to update a probability location of the object while the object remains undetected by the one or more sensors and within the one or more occlusion areas.
- the system may include a tracker configured to define a certainty of the probability location and manifest the certainty as an area of probability in which the object may be within the one or more occlusion areas along the object track.
- the system may include a tracker configured to determine the probability location with less certainty the longer the object remains in the one or more occlusion areas.
- the system may include a tracker configured to define a reduction in the certainty as a larger area of probability associated with the object along the object track.
- the detection system, detector, occlusion publisher, and tracker comprise one or more processors having non-transitory machine-readable instructions stored in one or more memories that, when executed by the one or more processors, cause the one or more processors to perform the described functions.
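The recited components can be pictured as a simple pipeline from sensors through the occlusion publisher and tracker to the planner-map. The sketch below is one possible wiring under assumed class and method names; it illustrates the data flow rather than the claimed architecture.

```python
class OcclusionPublisher:
    def detect_occlusion_areas(self, sensor_data):
        # Return regions (e.g., polygons) not observable by the sensors.
        return []

class Tracker:
    def __init__(self):
        self.tracks = {}

    def update(self, detections, occlusion_areas):
        # Maintain tracks; keep (rather than drop) tracks whose last known
        # position falls inside an occlusion area.
        return self.tracks

class PlannerMap:
    def plan(self, driving_surface, tracks, occlusion_areas):
        # Add uncertainty to occluded tracks, then adjust the path.
        return []

def pipeline_step(sensor_data, detections, driving_surface,
                  publisher, tracker, planner_map):
    # One cycle: publish occlusions, update tracks, re-plan the path.
    areas = publisher.detect_occlusion_areas(sensor_data)
    tracks = tracker.update(detections, areas)
    return planner_map.plan(driving_surface, tracks, areas)
```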
- FIG. 1 shows an example environment including a vehicle having a system for tracking objects through occluded regions according to embodiments described herein.
- FIG. 2 is an example view of an environment from a vehicle including a system for tracking objects through occluded regions according to embodiments described herein.
- FIGS. 3A-3B show an example flowchart of a method for tracking objects through an occluded area, according to various embodiments of the present disclosure.
- FIG. 4 illustrates an exemplary obstruction map according to exemplary embodiments of the present disclosure overlaid on the view of the environment of FIG. 2.
- FIGS. 5-6 illustrate example architectures for tracking objects through an occluded area according to various embodiments of the present disclosure.
- FIG. 7 illustrates example elements of a computing device, according to various embodiments of the present disclosure.
- FIG. 8 shows example architecture of a vehicle, according to various embodiments of the present disclosure.
- “unit” means a unit for processing at least one function or operation, and can be implemented by hardware components or software components and combinations thereof.
- An “electronic device” or a “computing device” refers to a device that includes a processor and memory. Each device may have its own processor and/or memory, or the processor and/or memory may be shared with other devices as in a virtual machine or container arrangement.
- the memory may contain or receive programming instructions that, when executed by the processor, cause the electronic device to perform one or more operations according to the programming instructions.
- “memory” refers to a non-transitory device on which computer-readable data, programming instructions or both are stored. Except where specifically stated otherwise, the terms “memory,” “memory device,” “computer-readable storage medium,” “data store,” “data storage facility” and the like are intended to include single device embodiments, embodiments in which multiple memory devices together or collectively store a set of data or instructions, as well as individual sectors within such devices.
- “processor” and “processing device” refer to a hardware component of an electronic device that is configured to execute programming instructions. Except where specifically stated otherwise, the singular term “processor” or “processing device” is intended to include both single-processing device embodiments and embodiments in which multiple processing devices together or collectively perform a process.
- the terms “instructions” and “programs” may be used interchangeably herein.
- the instructions may be stored in object code format for direct processing by the processor, or in any other computing device language, including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods, and routines of the instructions are explained in more detail below.
- the instructions may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor.
- the instructions may be stored as computing device code on the computing device-readable medium.
- data may be retrieved, stored or modified by processors in accordance with a set of instructions.
- the data may be stored in computing device registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files.
- the data may also be formatted in any computing device-readable format.
- “module” refers to a set of computer-readable programming instructions, as executed by a processor, that cause the processor to perform one or more specified function(s).
- “vehicle” refers to any motor vehicle, powered by any suitable power source, capable of transporting one or more passengers and/or cargo.
- “vehicle” includes, but is not limited to, autonomous vehicles (i.e., vehicles not requiring a human operator and/or requiring limited operation by a human operator, either onboard or remotely), automobiles (e.g., cars, trucks, sports utility vehicles, vans, buses, commercial vehicles, class 8 trucks, etc.), boats, drones, trains, and the like.
- “autonomous vehicle” refers to a vehicle capable of implementing at least one navigational change without driver input.
- a “navigational change” refers to a change in one or more of steering, braking, or acceleration of the vehicle.
- a vehicle need not be fully automatic (e.g., fully operational without a driver or without driver input). Rather, an autonomous vehicle includes those that can operate under driver control during certain time periods and without driver control during other time periods.
- Autonomous vehicles may also include vehicles that control only some aspects of vehicle navigation, such as steering (e.g., to maintain a vehicle course between vehicle lane constraints), but may leave other aspects to the driver (e.g., braking). In some cases, autonomous vehicles may handle some or all aspects of braking, speed control, and/or steering of the vehicle. Autonomous vehicles may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, busses, recreational vehicles, agricultural vehicles, construction vehicles etc. According to various embodiments, autonomous vehicles may include a throttle control system and a braking system. Autonomous vehicles may include one or more engines and/or one or more computing devices. The one or more computing devices may be separate from the automated speed control system or the braking system.
- the computing device may include a processor and/or a memory.
- the memory may be configured to store programming instructions that, when executed by the processor, are configured to cause the processor to perform one or more tasks.
- autonomous vehicles may include a receiver configured to process the communication between autonomous vehicles and a teleoperation system.
- the term “trajectory” or “map” is used broadly to include, for example, a motion plan or any path or route from one place to another; for instance, a path of travel of an anticipated object such as a pedestrian as she/he crosses a street from one side to the other side of a street.
- “controller/control unit” refers to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein.
- the memory is configured to store the modules and the processor is specifically configured to execute these modules to perform one or more processes that are described further below.
- control logic of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable programming instructions executed by a processor, controller, or the like.
- Examples of computer readable media include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices.
- the computer readable medium can also be distributed in network-coupled computer systems so that the computer readable media may be stored and executed in a distributed fashion such as, e.g., by a telematics server or a Controller Area Network (CAN).
- the term “about” is understood as within a range of normal tolerance in the art.
- the function of the object described having an approximate feature (“about”) can be determined by a person of skill in the art based on the normal tolerances for such part, the object or function of the part, the position and relation to other objects or parts of the invention, and other information as would be used by a person of skill in the art. As an example, about may be within 2 standard deviations of the mean. About can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value.
- Exemplary embodiments include methods and systems to adjust a driving path in a planner-map, including: identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; detecting one or more occlusion areas from sensor data from the one or more sensors; tracking one or more tracks of one or more objects within the one or more occlusion areas; adding uncertainty to the one or more tracks within the one or more occlusion areas; and adjusting the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
- Referring to FIG. 1, an example sensor-equipped vehicle 105 on a roadway 110 is provided, in accordance with various embodiments of the present disclosure.
- the vehicle 105 may comprise one or more sensors such as, for example, one or more LiDAR sensors 115, one or more radio detection and ranging (RADAR) sensors 120, one or more cameras 125, and/or one or more ultrasonic transducers 145, among other suitable sensors.
- the one or more sensors may be in electronic communication with one or more computing devices 130.
- the one or more computing devices 130 may be separate from the one or more sensors and/or may be incorporated into the one or more sensors.
- the vehicle 105 may comprise a LiDAR system which may comprise one or more LiDAR sensors 115 and/or one or more computing devices 130.
- the vehicle 105 may comprise a camera system which may comprise one or more cameras 125 and/or one or more computing devices 130.
- the LiDAR sensor 115 may be configured to emit light directed to strike a surface (e.g., the roadway 110, one or more obstacles 150, rain, snow, etc.) within the environment of the vehicle 105.
- the one or more obstacles 150 may comprise one or more objects, one or more geographic hindrances to travel, and/or one or more other suitable obstacles 150.
- the one or more obstacles 150 may comprise one or more pedestrians 155, one or more vehicles 160, 162, one or more pieces of vegetation 165, and/or one or more other suitable obstacles 150.
- When the light emitted from the LiDAR sensor 115 comes into contact with a surface, the light is deflected. Some of the deflected light may be reflected to bounce back to the LiDAR sensor 115.
- the LiDAR sensor 115 may be configured to measure data pertaining to the light bounced back (for example, the distance traveled by the light, the length of time it took for the light to travel from and to the LiDAR sensor 115, the intensity of the light returning to the LiDAR sensor 115, and so on as understood by a person of ordinary skill in the art).
- This data may then be used to generate a point cloud (i.e., data points, in a coordinate system, that represent locations of obstacles within an environment) of some or all of the environment around the vehicle 105, generally recreating an object map of the road surface of the roadway 110, obstacles 150 within the environment, and so on.
- the LiDAR sensor 115 may be coupled to the vehicle 105 and/or may be configured to generate one or more point clouds of an environment surrounding the vehicle 105.
- the environment may fully surround the vehicle or may encompass a portion of the surroundings of the vehicle 105.
- the LiDAR sensor 115 may be in electronic communication and/or coupled to the one or more cameras 125.
- one or more obstacles 150 may be occluded from view of vehicle 105 by one or more other obstacles 150.
- pedestrian 155 may be occluded from view by vegetation 165
- vehicle 162 may be occluded from view by vehicle 160.
- Occlusion may occur in a variety of scenarios after an obstacle has been detected by the one or more sensors of the vehicle 105.
- a smaller obstacle may be occluded by a larger obstacle.
- smaller obstacles such as, e.g., pedestrians 155, bicycles, motorcycles, etc. may move behind a larger obstacle, in relation to the vehicle 105; a larger obstacle may move in front of the smaller obstacle, in relation to the vehicle 105; and/or the vehicle 105 may move, causing a larger obstacle to be positioned between the smaller obstacle and the vehicle 105.
- the one or more sensors coupled to the vehicle 105 may no longer be able to track the occluded obstacle.
- An occlusion may also occur when an object is no longer within a field of view of one or more sensors of the vehicle.
- the sensors 115, 120, 125, 145 may define a field of view 170 of detection and tracking for the vehicle 105.
- the vehicle 105 may be able to detect the vegetation 165 until the vehicle 105 passes the vegetation and the vegetation 165 passes out of the field of view 170 of the sensors. Therefore, an occlusion may occur whenever an object can no longer be tracked by a vehicle or may be harder to be tracked by a vehicle, such as in areas of no or weak perception by the one or more sensors of the vehicle.
- the occluded object may still be relevant in safely and efficiently navigating an environment and preventing possible collision with the occluded obstacle and/or one or more other obstacles.
- the vehicle 105 may include an object detection and analysis system 175 for detecting and classifying objects in a vicinity of the vehicle.
- Object detection and analysis may be performed by methods such as those shown and described by applications of Applicant, co-pending herewith, including, for example: US Patent Application Numbers 18/065,417; 18/065,419; and 18/065,421, filed December 13, 2022; and US Patent Application Number 18/062,228, filed December 6, 2022, each of which is incorporated herein in its entirety.
- the object may be tracked as it remains in a field of view of one or more sensors 115, 120, 125, 145 of vehicle 105.
- Exemplary embodiments of the system 175 of tracking occluded objects for path planning may be configured to continue to track or estimate a trajectory of the occluded object in order to continue to account for the object in the path planning of the autonomous vehicle.
- Exemplary embodiments of the system 175 may also or alternatively identify occlusions created by areas not observable by sensors (whether because of an obstruction caused by an object or by the positional configuration of the one or more sensors).
- the computing device 130 may comprise a processor 135 and/or a memory 140.
- the memory 140 may be configured to store programming instructions that, when executed by the processor 135, are configured to cause the processor 135 to perform one or more tasks such as, e.g., defining a map of a driving area, identifying a driving surface, determining a path on the driving surface, determining a driving path on the map of the driving area, identifying driving lanes on the map of the driving area, detecting one or more obstacles 150, tracking the one or more obstacles 150, determining one or more occlusion areas on the map and/or in relation to the vehicle or other reference frame, determining whether the one or more obstacles 150 are occluded, creating one or more object tracks for the one or more objects, determining whether the one or more object tracks enter the one or more occlusion areas, maintaining the one or more object tracks in the one or more occlusion areas, and dropping the one or more object tracks, among other tasks.
- a planner-map may be configured to identify a driving surface and determine a path on the driving surface; sensor(s) may generate sensor data for object and driving-area detection; detector(s) may identify one or more objects, driving areas, lanes, or other attributes from the sensor data generated by the sensor(s); an occlusion publisher may detect one or more occlusion areas using information from the sensor(s) and/or detector(s); a tracker may maintain one or more tracks/trajectories for the one or more objects; and a planner (or planner-map) may be configured to determine a driving path of a vehicle, change or update the driving path based on the occlusion areas and/or the tracks maintained within the occlusion areas, and/or determine one or more vehicle actions.
- the functional parts are shown and described as separate components as an exemplary embodiment for ease of explanation.
- the functional parts may be separate combinations of components, such as separate processor(s), memor(y/ies), and programming instructions. However, the functional parts may be integrated into the same combination of processor(s), memor(y/ies), and programming instructions. Therefore, the present disclosure covers the combinations of system components in which these functions are integrated and/or separated into any number of component parts, from a single processing system to multiple processing systems that communicate with each other to perform the system functions described herein.
- Where a claimed component is recited as performing more than one function, the system may actually have more than one component part performing the different functions, and the group of component parts is understood to be the claimed component performing the claimed functions.
- a planner-map may be configured to generate a map of the driving surface, as well as determine a driving path through the map.
- the planner-map may actually be configured as a mapping device to generate the map, and a planner to generate the driving path.
- the combination of the planner and the map device are considered in combination to be the planner-map.
- FIG. 2 illustrates an exemplary image of an environment in front of a vehicle 205.
- the environment may be analyzed to detect one or more obstacles 250 within the field of view 270 of one or more sensors.
- the detected objects 250 are illustrated as cubes covering the volumes that the obstacles encompass.
- the obstacles may, for example, be other vehicles.
- the motorcycle 280 may have been detected in a field of view of one or more sensors, such as from a rear or side of the vehicle 205, but is not yet in the field of view 270 of the forward facing sensors. Accordingly, the motorcycle 280 may be within a zone that is not detected or may only be weakly detected by the one or more sensors. The motorcycle 280 may, therefore, be considered as occluded, or otherwise outside the detection of one or more sensors. However, the motorcycle 280 still exists and should be relevant to the path planning of the vehicle 205 during any autonomous vehicle path implementations.
- the motorcycle may enter the field of view 270 but be occluded by one or more of the other objects 250 if their relative speeds put the motorcycle 280 behind another object.
- Other field of view limits of the one or more sensors may also cause an object to be occluded.
- a fender 285 from an accident, or other debris on the road, may end up being dropped from detection as it approaches the vehicle 205 because of limitations of the near-field field of view.
- an occlusion is understood to occur when an object cannot be detected by the combination of sensors because it is obscured from the sensor(s) by another object that is detected by the combination of sensors.
- An occlusion is understood to also or alternatively occur when an object cannot be detected by the combination of sensors because it is outside the field of view of the combination of sensors, either in the far field or the near field.
- An occlusion is understood to also or alternatively occur when an object may pass between regions of sensors and is within a dead zone or area proximate to the vehicle that is not detected by the combination of one or more sensors of the vehicle.
- the obstruction occurs when the obstructed object is within a proximity to the vehicle 205 so that its presence is relevant to the determinations of path planning of an autonomous vehicle.
- the method may start by detecting and analyzing objects to determine relevant objects within the field of one or more sensors of the vehicle.
- Relevant objects may be those that are in proximity of the vehicle that a person of skill in the art would consider relevant to the path planning of the vehicle during autonomous control.
- the objects may be classified and tracked for analysis and estimations of any path planning for the vehicle when autonomously driven.
- an occlusion map may be defined.
- the occlusion map may be defined based on an area of interest around the vehicle in which objects should continue to be tracked even if occluded.
- the occlusion map may be defined based on limitations of the field of view of the combination of sensors, and/or other limitations of the detected area around the vehicle based on the sensor(s) position and orientation around the vehicle.
- an occlusion map may include blind spots created by the combination of sensors and/or may include additional space from the vehicle in areas that are relevant to path determinations of the vehicle.
- the occlusion map may be pre-defined and coded into the system. The occlusion map may therefore be static in time.
- the occlusion map may be dynamically determined based on the environment, detected scene, and/or detected objects. For example, the occlusion map may be increased to encompass curves in the road or objects in a proximity of the vehicle anticipating potential obstructions caused by the detected environment and/or conditions. The occlusion map may therefore be updated to account for the detected objects and/or environment that do or could cause obstructions within a proximity of the vehicle.
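As one way to picture a dynamically determined occlusion map (the description later notes the map may be a two-dimensional pixel grid or three-dimensional voxel grid), the sketch below shadows grid cells behind detected obstacles by casting rays from the vehicle. Grid extent, resolution, obstacle footprint, and ray count are all illustrative assumptions.

```python
import numpy as np

def occlusion_grid(obstacles_xy, grid_m=60.0, res_m=0.5, n_rays=720):
    # Mark cells behind any obstacle (as seen from the vehicle at the origin)
    # as occluded by marching along each ray and shadowing past the first hit.
    n = int(grid_m / res_m)
    occluded = np.zeros((n, n), dtype=bool)
    half = grid_m / 2.0
    for k in range(n_rays):
        theta = 2.0 * np.pi * k / n_rays
        blocked = False
        for r in np.arange(0.0, half, res_m):
            x, y = r * np.cos(theta), r * np.sin(theta)
            i, j = int((x + half) / res_m), int((y + half) / res_m)
            if not (0 <= i < n and 0 <= j < n):
                break
            if blocked:
                occluded[i, j] = True  # cell lies in an obstacle's shadow
            elif any(abs(x - ox) < 1.0 and abs(y - oy) < 1.0
                     for ox, oy in obstacles_xy):
                blocked = True         # assumed ~2 m square obstacle footprint
    return occluded
```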
- the method may include setting an initial keep alive time.
- the system may start with a predefined keep alive duration.
- the keep alive duration may be a defined amount of time in which an object is tracked once it is no longer detected by the sensor(s) of the vehicle.
- the keep alive time may be, for example, 1 second. Other durations are also possible, such as from more than zero seconds to up to 10 seconds, 30 seconds, or more.
- the method may then analyze conditions of the occlusion and/or of the occluded objects to determine whether the keep alive time should be adjusted. For example, the system may determine when an object is occluded. The system may detect whether conditions are present related to the occlusion to determine if the keep alive time should be modified to keep the object track for a longer amount of time. For example, the system may determine a trajectory of the occluded object based on previous positions of the detected object. If the determined trajectory of the occluded object is within the occlusion map, the system may be configured to update the keep alive time for that occluded object.
- the system may consider other factors, such as whether the object is occluded by another object, the proximity of the occluded object to the vehicle when it was occluded, the direction of travel of the occluded object when it was occluded, the relative position of the occluded object to the vehicle (such as whether the occluded object is behind, to the side of, or in front of the vehicle), the anticipated or expected trajectory of the occluded object, whether the occluded object is stationary or moving, the current estimated position relative to the vehicle, or any combination thereof, to determine whether the occluded object’s keep alive time should be increased or whether the object should continue to be tracked.
- the method may optionally re-evaluate the keep alive time of the occluded object. For example, the system may detect whether the object has been un-occluded, such as being detected by one or more sensors of the vehicle. The keep alive for the occluded object may be terminated, as the object is being directly tracked again. The system may also detect whether other conditions may exist in which the occluded object may no longer be tracked.
- the system may determine that the occluded object is no longer relevant to the path planning of the autonomous vehicle and may terminate the keep alive of the occluded object. If the obscuring object is still detected, but the occluded object is not, the system may increase the keep alive time or may maintain the obstructed object to continue to track an expected trajectory.
- the system may also employ a confidence evaluation to assess whether the occluded object is still relevant to the path planning of the vehicle for autonomous control.
- the confidence level of the occluded object may diminish over time.
- the confidence level described herein may also be considered an uncertainty because the object is not directly detected by the one or more sensors.
- the system may determine an expected position of the obstructed object based on the trajectory of the obstructed object when it was detected.
- the system may determine an expected trajectory of the obstructed object (track) based on the trajectory of the obstructed object when it was detected.
- the system may determine a confidence level (uncertainty) based on the passage of time and/or on the determinations of the expected position and trajectory based on the detected data of the obstructed object when it was not occluded.
- the system may determine based on these factors an updated keep alive time for the occluded object. For example, if the expected trajectory of the occluded object is expected to take the occluded object far enough away from the vehicle within an estimated time, the keep alive time may be updated to an amount of time at or beyond the estimated time (or may be set to zero once the amount of time has elapsed).
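One way to read the re-evaluation just described: if the expected trajectory carries the occluded object out of a radius of relevance within an estimated time, the keep alive time can be capped at or beyond that time (or set to zero once the time has elapsed). The sketch below uses a crude radial approximation; the radius, speed threshold, and names are assumptions.

```python
import math

def reevaluate_keep_alive(last_xy, velocity_xy,
                          relevance_radius_m=50.0, current_keep_alive_s=1.0):
    speed = math.hypot(*velocity_xy)
    if speed < 0.1:
        # Effectively stationary: no basis to shorten tracking.
        return current_keep_alive_s
    # Crude radial approximation: time until the predicted position leaves
    # the radius of relevance around the vehicle (taken as the origin).
    t_exit = (relevance_radius_m - math.hypot(*last_xy)) / speed
    if t_exit <= 0.0:
        return 0.0  # already predicted to be outside the relevant range
    # Keep alive at or beyond the estimated exit time, per the description.
    return max(current_keep_alive_s, t_exit)
```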
- an occluded object may still be behind another object (an obscuring object)
- the occluded object keep alive time may be updated.
- the confidence that the occluded object is still behind the obscuring object may diminish, and eventually, the system may determine that the object should no longer be tracked so that the keep alive time is reduced or terminated.
- the method may terminate or may continue with other occluded objects.
- the occluded object may then be tracked for the duration of the keep alive time or the duration of the updated keep alive time if/when it is re-evaluated.
- the system is not limited to only a count-down timer, starting from the given time, before a track is cleared from the occlusion areas.
- the “keep alive time” may also be considered a presumption in which the tracks are simply maintained in the system until certain conditions are met, including, for example, any combination of: determining the object creating the track has left the occlusion area; the object is detected again by one or more sensors; the occlusion area is no longer present in the sensors’ field of view; the occlusion area is no longer in a proximity to the vehicle in which the tracks within the occlusion area are relevant (or as relevant) to vehicle path planning; etc. Therefore, a “keep alive time” essentially is infinite and terminates or is set to zero when the condition is met, even though an actual data structure for “time” is not stored in the system.
- the system may not include or adjust a keep alive time. Instead, the system may simply analyze whether the object should continue to be tracked, even as it resides (or is estimated to reside) in the occlusion areas. In this case, the system may still consider the same conditions, but instead of adjusting the keep alive time, determine whether the occluded object should continue to be tracked or not.
- the keep alive time may be set to an adjusted keep alive duration.
- the adjusted keep alive duration may be infinite (or until the vehicle is no longer driving or on), may be infinite until the obstructed object is detected again, or may be set to a maximum value, such as 10 minutes, 15 minutes, 30 minutes, an hour, etc.
- the method may proceed as illustrated in FIG. 3B.
- the steps shown and described with respect to FIG. 3B may be used in any combination with the features described with respect to FIG. 3A.
- for example, step 312, which includes increasing the precision of observation for objects leaving an area corresponding to the occlusion map, may be implemented while the other steps are not.
- Multiple steps may be implemented together as well, such as, for example, including steps 312, 314, and 316 with the steps of FIG. 3A.
- the method may optionally proceed by implementing occluded region updates. Therefore, for occluded objects (objects in the areas of the occlusion map), the system may be configured to adjust the tracks of the objects and/or account for the uncertainties that relate to the continued existence of the track as it remains in the occlusion areas and is not directly tracked by the one or more sensors. For example, the system may analyze other information about the occlusion area and/or events related to or around the occlusion area. The tracks of objects within the occlusion area may then be updated if the tracks are likely influenced by something else, such as another object entering the occlusion area. As another example, the system may incorporate an uncertainty into the track of an object within the occlusion area.
- the uncertainty may be used to account for possible variability of the track of the occluded object as the occluded object remains in the occlusion area. For example, as the object enters an occlusion area, the track of the object close to the time of entry into the occlusion area will be known with more certainty as the object is likely to continue based on the information just before entering the occlusion area. However, as the object remains in the occlusion area, the position becomes less known because the object may change directions, change speed, etc. The system may therefore track an uncertainty associated with the object so that the variability in its predicted location may be taken into account. In an exemplary embodiment, the variability or uncertainty may be accounted for by widening or increasing an area associated with a predicted location of the object.
- When the object enters the occlusion area, its predicted location may be determined with a high level of certainty. At time 10, however, the object may have stopped, changed direction, or changed speed.
- the predicted location or track of the occluded object may therefore include a band or region in which the object is considered to be within.
- the predicted location may include anywhere from where the object entered, including anywhere along the trajectory generated from when the object entered the occlusion area, or may include anywhere radiated out from such a location (to account for variations in direction).
- the system may therefore maintain a track of the occluded object in which the track is not a narrow trajectory (or may start as a narrow trajectory but does not stay one), but rather an area that propagates out from the point at which the object entered the occlusion area and, over time, expands across the occlusion region from the entry location.
- the system may define a measurement or path in which the occluded object is considered to be uniformly present within the occlusion area. Essentially, the system no longer knows where in the occlusion area the object may be.
- the occluded object may be considered trapped in the occlusion area. Therefore, if an object enters an occluded region and is not observed to exit the occluded region, then every update on a periodic basis may evidence that the object is still in the occluded region. Over time, the system may not know where it is in the occluded region.
- the object may be “measured” anywhere in that contiguous region of the occlusion area.
- the posterior state estimate may be a mixture of a prior position with an uncertainty measurement.
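The “posterior state estimate as a mixture of a prior position with an uncertainty measurement” can be sketched as a constant-velocity predict step (covariance only grows while no measurement arrives) blended with a flat “somewhere in the occlusion area” pseudo-measurement via Gaussian moment matching. The state layout, noise levels, and blend weight below are assumptions for illustration.

```python
import numpy as np

def predict_occluded(x, P, dt, q=0.5):
    # Constant-velocity predict for state [px, py, vx, vy]; with no update,
    # the covariance P only grows, widening the band where the object may be.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * dt * np.eye(4)
    return F @ x, F @ P @ F.T + Q

def mix_with_area(x, P, area_center, area_cov, w=0.1):
    # Blend the propagated estimate with a flat "object is somewhere in the
    # occlusion area" component (moment-matched two-component mixture;
    # velocity cross-terms are ignored for brevity).
    x_new = (1 - w) * x[:2] + w * np.asarray(area_center)
    d = x[:2] - x_new
    c = np.asarray(area_center) - x_new
    P_new = ((1 - w) * (P[:2, :2] + np.outer(d, d))
             + w * (np.asarray(area_cov) + np.outer(c, c)))
    x_out = x.copy(); x_out[:2] = x_new
    P_out = P.copy(); P_out[:2, :2] = P_new
    return x_out, P_out
```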
- the method may include tracking, with a higher precision, objects that are detected entering, leaving, and/or within a certain proximity to areas associated with the occlusion map. The objects in proximity or adjacent to areas of the occlusion map may be observed with higher precision.
- These objects adjacent to the area of the occlusion map may be more precisely detected so that information about an object going into or out of the occlusion map may be known with greater details.
- the higher precision may be used to compare objects going into and out of the areas of the occlusion map to determine whether the object(s) are the same object or different.
- the higher precision may be used to reduce false positives of tracks moving out of areas of the occlusion map so that obstructed objects are not removed from tracking (their keep alive duration terminated) when they are still obstructed.
- Higher precision may include considering additional details of the object, using faster sampling rates, higher image processing to determine greater details, obtaining more information to create more accurate trajectory estimations, or other characteristics in analyzing the detected objects from the one or more sensors.
- the system may include tracking hair color, hair length, clothes, clothing colors, gender, height, weight, etc. when the person is going into or leaving an area associated with the occlusion map or while in an area adjacent to areas of the occlusion map.
- When the person is away from areas of the occlusion map, the person may be identified only as an object, a person, or an object to be avoided, and the specific attributes of the person are not determined or analyzed. “Higher” is understood to be a relative term comparing the processing of the images and/or data from the one or more sensors between two areas, such as areas adjacent to the areas of the occlusion map and areas away from the areas of the occlusion map.
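A toy illustration of using the higher-precision attributes captured at the occlusion boundary to confirm that an entering and an exiting object are the same before dropping the stored estimate; the attribute set and match threshold are invented for the example.

```python
def same_object(entering: dict, exiting: dict, threshold: float = 0.7) -> bool:
    # Compare the attributes observed on entry against those observed on exit.
    keys = set(entering) & set(exiting)
    if not keys:
        return False
    matches = sum(1 for k in keys if entering[k] == exiting[k])
    return matches / len(keys) >= threshold

entered = {"class": "pedestrian", "height_m": 1.7, "shirt": "red"}
exited = {"class": "pedestrian", "height_m": 1.7, "shirt": "red"}
assert same_object(entered, exited)  # only then drop the stored estimate
```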
- the method may optionally include analyzing different occlusion maps or integrating occlusion maps into a single occlusion map. For example, there may be obstructions created by the sensor configurations, the conditions of the sensors, the environment (or scenery), other objects, etc.
- the method may include analyzing occluded objects based on the different occlusion maps or by integrating the occlusion maps before analyzing the occluded object.
- the system may first analyze a first occlusion map based on the vehicle (and/or sensor/detector) setup, including unmonitored areas and/or sensor/detector limitations.
- the system may be configured to then analyze a second occlusion map based on the environment, such as caused by scenery and/or other detected objects within the environment of the vehicle.
- the method may include using other sensors and/or detectors to determine a status of an occluded object and/or to update the keep alive duration of the occluded object.
- the system may be configured to determine that an object went into an area associated with the occlusion map. After a predefined duration, the system may determine that the occluded object has not left the area of the occlusion map.
- the system may interrogate additional sensors or receive additional data in order to determine whether the occluded object is still in or likely still in or out or likely out of the area of the occlusion map.
- the system may communicate with other systems or external detectors to receive additional data about objects in the area of the occlusion map to confirm the presence or absence of the occluded object within the area of the occlusion map.
- the occlusion map may be shared with the planner.
- the planner may be configured to perform the path planning as described herein for the autonomous vehicle.
- the planner may therefore use the occlusion map in order to make decisions and/or updates about a path of the vehicle.
- the occlusion map may be used to inform the planner that information within the occlusion map is unknown or known with certain levels of uncertainty or with less certainty.
- the planner may therefore use this information to make decisions that avoid traveling into areas of the occlusion map.
- the planner may therefore consider or optimize interactions of an entire environment in which areas of the scene may be hidden by occlusions occurring within the occlusion map.
- the planner may also consider areas in which objects are likely to exit an occlusion area, in order to avoid those areas and/or be ready to react to those situations as they may happen.
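In cost-map terms, the planner behavior described above might look like adding penalties for cells inside occlusion areas and near likely exit points; the weights here are illustrative assumptions.

```python
def cell_cost(base_cost: float, in_occlusion: bool, near_likely_exit: bool,
              occlusion_penalty: float = 5.0, exit_penalty: float = 2.0) -> float:
    # Higher cost steers the planned path away from unobservable space and
    # from areas where occluded objects are likely to emerge.
    cost = base_cost
    if in_occlusion:
        cost += occlusion_penalty
    if near_likely_exit:
        cost += exit_penalty
    return cost
```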
- Exemplary embodiments of the methods described herein may result in a tunable keep alive time that permits the system to remember or track an occluded object for a longer amount of time.
- Exemplary embodiments may be used to track objects continuously around the vehicle more easily.
- Exemplary embodiments may be used to consider occlusion areas as defined by the occlusion map. Creating and/or tracking an occlusion map may be used for understanding what parts of the environment are positively seen or directly detected, and which areas of the environment create corridor problems or where objects may be hidden behind other objects.
- Exemplary embodiments may be used to keep track of objects that follow a vehicle but may be hidden from the sensors of the vehicle, such as when a trailer is being pulled by the vehicle or other occlusion scenario that is not currently accounted for in detecting objects around a vehicle.
- Exemplary embodiments may be used to define occlusion areas in any proximity or relative position to the vehicle.
- occlusion areas may include obstructions in the vertical direction above or about the vehicle.
- an occlusion may be generated from an on-ramp, off-ramp, overpass, overhanging items, etc.
- the occlusion map can be a voxel grid in three dimensions or a pixel grid in two dimensions. Exemplary embodiments described herein may be used so that the planner can generate worst case planning or actuate cautiously to preserve the ability to detect and avoid obstacles if such a worst case occurs when an occluded object is exposed by exiting the occlusion.
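Worst-case planning can be made concrete as a speed cap: travel no faster than a speed from which the vehicle can stop within the distance to the nearest occluded region an object could emerge from. The deceleration and reaction-time values below are illustrative assumptions.

```python
import math

def cautious_speed(dist_to_occlusion_m: float,
                   max_decel_mps2: float = 4.0, reaction_s: float = 0.5) -> float:
    # Largest v satisfying v*reaction + v^2 / (2*decel) <= distance,
    # i.e., the positive root of v^2 + 2*a*t*v - 2*a*d = 0.
    a, t = max_decel_mps2, reaction_s
    d = max(dist_to_occlusion_m, 0.0)
    return max(0.0, -a * t + math.sqrt((a * t) ** 2 + 2.0 * a * d))
```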
- a method may maintain tracking of an object that passes into an occluded area, including: defining a map of a driving area; defining one or more occlusion areas within the map; detecting an object using sensor data from one or more sensors; creating an object track for the detected object; determining that the object track entered one of the one or more occlusion areas; and maintaining the object track while the object track remains in the one or more occlusion areas.
- the method may include maintaining and/or dropping the object tracks based on one or more criteria, including, for example: maintaining the object track while the object remains undetected by the one or more sensors; dropping the object track when the one of the one or more occlusion areas is cleared; dropping the object track when the object exits the one of the one or more occlusion areas and is detected by the one or more sensors; and dropping the object track when a probability that the object is no longer in the one or more occlusion areas has surpassed a threshold.
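A minimal sketch of these maintain/drop criteria, assuming illustrative names such as `OccludedTrack` and simple boolean inputs for the clearing and redetection signals:

```python
import time
from dataclasses import dataclass, field

@dataclass
class OccludedTrack:
    """Illustrative track state while its object sits inside an occlusion area."""
    track_id: int
    keep_alive_s: float                  # tunable keep-alive duration
    entered_at: float = field(default_factory=time.monotonic)
    exit_probability: float = 0.0        # grows as plausible exits accumulate

def should_drop(track: OccludedTrack,
                occlusion_cleared: bool,
                redetected_outside: bool,
                exit_threshold: float = 0.9,
                now: float | None = None) -> bool:
    """Apply the drop criteria described above, in order."""
    now = time.monotonic() if now is None else now
    if occlusion_cleared:                 # the area can now be seen and is empty
        return True
    if redetected_outside:                # object observed exiting the area
        return True
    if track.exit_probability > exit_threshold:
        return True                       # probably no longer in the area
    return now - track.entered_at > track.keep_alive_s

track = OccludedTrack(track_id=1, keep_alive_s=5.0)
print(should_drop(track, occlusion_cleared=False, redetected_outside=False))  # False
```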
- the method may include maintaining and/or updating a probability location of the one or more tracked objects in the occlusion areas (including the one or more object tracks).
- the method may include any combination of: determining a probability location of an object tracked in the occlusion area; determining a certainty of the probability location that manifests as an area of probability in which the object may be within the one or more occlusion areas along the object track; and determining the probability location with less certainty the longer the object remains in the one or more occlusion areas, wherein the reduced certainty results in a larger area of probability associated with the object along the object track.
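One simple way to realize a growing area of probability is a reachability bound: the possible positions form a disc around the entry point whose radius grows with dwell time. The sketch below is illustrative only (the maximum-speed bound and parameter values are assumptions); a fuller implementation would intersect this area with the occlusion region itself:

```python
import math

def probability_area(dwell_time_s: float,
                     entry_xy: tuple[float, float],
                     max_speed_mps: float = 2.0,
                     min_radius_m: float = 0.5) -> tuple[tuple[float, float], float]:
    """Return (center, radius) of the area the occluded object may occupy.

    The longer the object has been occluded, the farther it could have
    traveled from its entry point, so certainty in any single location
    drops and the area of probability grows.
    """
    radius = min_radius_m + max_speed_mps * dwell_time_s
    return entry_xy, radius

for t in (0.0, 1.0, 3.0):
    center, r = probability_area(t, (12.0, 3.0))
    print(f"t={t:.0f}s: radius {r:.1f} m, area {math.pi * r * r:.1f} m^2")
```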
- the method may adjust a driving path in a planner-map, and may include identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; detecting one or more occlusion areas from sensor data from the one or more sensors; tracking one or more tracks of one or more objects within the one or more occlusion areas; adding uncertainty to the one or more tracks within the one or more occlusion areas; adjusting the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
- the method may include managing track(s) of objects within the occlusion areas including updating the tracks with uncertainty to account for the unknown position of the object within the occlusion area.
- the method may include any combination of: the uncertainty of the one or more tracks is based on a time elapse of an object corresponding to the track that is within the one or more occlusion areas, the uncertainty of the one or more tracks increases as time elapses, the uncertainty is based on a size of the one or more occlusion areas, the uncertainty increases with the size of the one or more occlusion areas, the path is adjusted based on the one or more tracks within the one or more occlusion areas.
- the method may adjust a driving path of a vehicle, including identifying a driving surface; determining a path on the driving surface; identifying one or more objects relative to the driving surface using one or more sensors; determining one or more occlusion areas created by the one or more objects; adding uncertainty of a potential additional object within the one or more occlusion areas; adjusting the path on the driving surface based on the uncertainty of the potential additional object within the one or more occlusion areas.
- the method may remove tracks from a tracker when the tracks are within an occluded region.
- the method may include defining a map of a driving area; defining an occlusion area within the map; detecting an object using sensor data from one or more sensors; determining that the object entered the occlusion area; creating an estimated object location for the object within the occlusion area.
- the method may include many combinations of additional or alternative features.
- the method may include any combination of: dropping the estimated object location for the object when the occlusion area is cleared; the occlusion area is cleared when an occluding object creating the occlusion area is no longer occluding the one or more sensors; the occlusion area is cleared when the object is detected by the one or more sensors after leaving the occlusion area; increasing a precision of observation for objects leaving the occlusion area; increasing a precision of observation for objects entering the occlusion area; using the increased precision of observation for objects entering and leaving the occlusion area to confirm that the object entering the occlusion area is the same as the object leaving the occlusion area before dropping the estimated object location for the object when the occlusion area is cleared; updating the map of the driving area to include estimates of the driving area within the occlusion area; the estimates of the driving area may be an extension of lane detections; and periodically updating the estimated object location to an updated estimated object location.
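As a toy illustration of confirming that the entering and leaving objects match before dropping the estimate, the sketch below compares simple appearance signatures; the signature contents (size plus mean color) and tolerance are assumptions made for illustration:

```python
import numpy as np

def appearance_signature(size_m: float,
                         mean_rgb: tuple[float, float, float]) -> np.ndarray:
    """Toy signature: object size plus mean color, stacked into one vector."""
    return np.array([size_m, *mean_rgb], dtype=float)

def same_object(entering: np.ndarray, leaving: np.ndarray, tol: float = 0.15) -> bool:
    """Confirm identity before dropping the estimated object location."""
    rel_diff = np.linalg.norm(entering - leaving) / (np.linalg.norm(entering) + 1e-9)
    return bool(rel_diff < tol)

entered = appearance_signature(4.5, (0.8, 0.1, 0.1))   # red car going in
left = appearance_signature(4.4, (0.78, 0.12, 0.1))    # very similar object coming out
print(same_object(entered, left))  # True -> safe to drop the estimated location
```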
- FIG. 4 illustrates an exemplary occlusion map according to embodiments described herein.
- an exemplary occlusion map 490 may be determined based on areas that are outside of the detection of the sensor(s) of the vehicle, i.e., dead zones 492, may include areas of reduced visibility or detection, i.e., near field 494, or may be occluded by one or more objects, i.e., obstructions 496.
- An occlusion map may be defined as the areas of an environment that are not detected by or not well detected by one or more sensors of the vehicle. The far field limitations of the combination of sensors may be excluded from the occlusion map as objects far enough away from the vehicle may be of little relevance to the path planning of the autonomous vehicle, and therefore do not require objects within this space to continually be tracked.
- the occlusion map may be defined as the areas within a proximity of the vehicle that are also without or with limited detection by one or more combination of sensors for detecting objects around the vehicle.
- the occlusion map may be defined based on an area of interest around the vehicle in which objects should continue to be tracked even if occluded.
- the occlusion map may be defined based on limitations of the field of view of the combination of sensors, or other limitations of the detected area around the vehicle based on the sensor(s) position and orientation around the vehicle.
- an occlusion map may include blind spots created by the combination of sensors and/or may include additional space from the vehicle in areas that are relevant to path determinations of the vehicle.
- the occlusion map may be pre-defined and coded into the system. The occlusion map may therefore be static in time.
- the occlusion map may be dynamically determined based on the environment, detected scene, and/or detected objects.
- Exemplary embodiments of the occlusion publisher 515 may be configured to receive and/or handle the track states of obstructed objects to manage keep alive times of those obstructed objects.
- the occlusion publisher 515 may receive track states of obstructed objects from the tracker at the end of every tracker cycle and manage the keep alive times for each of the obstructed objects and their associated tracks.
- the occlusion publisher 515 may be configured to communicate with the tracker 520 to provide information to the tracker 520 including one or more of the occlusion maps, keep alive times, and/or tracks of occluded objects.
- the system may see this occlusion at t0 and the occlusion at t1 so it is tracked over time.
- the tracker then consumes this occlusion temporal information in order to reason about object track updates.
- the tracker may be configured to extinguish tracks that have exceeded their keep alive times.
- a first detector/sensor 610 and a second detector/sensor 615 are illustrated. Although two detectors/sensors are shown, any combination of detectors/sensors may be used.
- the detectors may receive information 605.
- Each detector 610, 615 may include an occlusion updater 620, 625.
- the detector occlusion updater 620, 625 may be configured to create an occlusion map based on its detector 610, 615.
- Each detector may therefore determine a field of view of the detector/sensor and/or detect one or more objects within the field of view of the detector/sensor.
- the detector may then be configured through the detector occlusion updater to define an occlusion map based on the range of the detector/sensor and/or what is detected by the detector.
- each detector 610, 615 may optionally track and/or update track states within its own occlusion regions.
- Each detector 610, 615 may optionally track and/or update track states within its own occlusion region that is created by objects detected by the detector.
- each detector determines its own area of occlusion based on a world state.
- Exemplary embodiments may be configured so that each detector 610, 615 is aware of its world state, such as by receiving information 605.
- the received information 605 may include the vehicle pose, sensor extrinsics/intrinsics, tracks, and obstacles.
- Each detector 610, 615 may be configured to associate objects detected by that detector to a track. If an object is detected without a track, then the detector may create a new track. The detectors may determine which tracks no longer have a detection (i.e., a previously identified/detected object is no longer detected/identified).
- the detector may be configured to determine whether the track entered an occlusion region. If the track is determined to have entered an occlusion region, then an occlusion region may be assigned to the track, and an elapsed-time counter against a keep alive time may be started for the track in the occlusion.
- the detector may compute the intersection of the occlusion region over a track's area.
- the detector may be configured for each track to determine a predicted location or trajectory within the occlusion area.
- the predicted location or trajectory may include an uncertainty to include a possible variability of the possible location of the object as it remains in the occlusion area.
- the detector may therefore maintain a track of the occluded object in which the track is not a narrow trajectory (or may start as a narrow trajectory but does not stay one) but an area that propagates out from the point at which the object entered the occlusion area and, over time, expands across the occlusion region from the entry location.
- the detector may define a measurement or path in which the occluded object is considered to be uniformly present within the occlusion area. Essentially, the detector no longer knows where in the occlusion area the object may be. The occluded object may be considered anywhere in the occlusion area. Therefore, if an object enters an occluded region and is not observed to exit the occluded region, then every update on a periodic basis may evidence that the object is still in the occluded region. Over time, the detector may not know where it is in the occluded region. The object may be “measured” as anywhere in that contiguous region of the occlusion area.
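The following sketch illustrates this "anywhere in the contiguous region" behavior on a toy BEV grid: the set of possible cells is dilated once per cycle but constrained to the occluded region, so it eventually saturates to the whole region. Grid sizes and cycle counts are arbitrary assumptions:

```python
import numpy as np
from scipy.ndimage import binary_dilation

# Toy 20x20 BEV occlusion mask with one occluded rectangle.
occluded = np.zeros((20, 20), dtype=bool)
occluded[5:15, 8:16] = True

# The object entered the occlusion at one cell; its possible-location set
# propagates outward each cycle but never leaves the occluded region.
possible = np.zeros_like(occluded)
possible[10, 8] = True  # entry cell

for _ in range(12):  # each tracker cycle widens the possible-location set
    possible = binary_dilation(possible, mask=occluded)

# Once the set saturates, the object is "measured" as uniformly anywhere
# in the contiguous occluded region.
uniform = possible.sum() == occluded.sum()
print(f"cells possible: {possible.sum()} / {occluded.sum()} (uniform: {uniform})")
```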
- the occlusion map may be a bird’s eye view (BEV) binary image.
- the computation for a given track may be performed on a pixel image.
- the intersection measurement of each track may be made when the detector has information to add, such as when the track is near the interface of an occlusion according to the detector.
- the distance may be computed by a distance transform on the occlusion map.
- An occluded region may be defined by identifying all areas of a detector that are occluded. For example, a detector may define an area within its field of view. All bird's-eye-view (BEV) cells inside the field of view (FOV) of the detector can be coded as un-occluded; then, regions behind objects as detected by the detector can be defined as occluded within a margin around the object.
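A minimal sketch of this construction on a BEV grid, assuming a single-ray shadow model behind each detected object (a real implementation would cast the full angular shadow) and using a Euclidean distance transform for the distance computation mentioned above:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

N, RES = 200, 0.5                      # 100 m x 100 m BEV grid, 0.5 m cells
sensor = np.array([N // 2, N // 2])    # sensor at the grid center (pixel coords)

occluded = np.ones((N, N), dtype=bool) # start fully occluded ...
yy, xx = np.mgrid[0:N, 0:N]
in_fov = np.hypot(yy - sensor[0], xx - sensor[1]) * RES <= 40.0
occluded[in_fov] = False               # ... then code the FOV as un-occluded

def shadow_behind(obj_px: np.ndarray, margin_px: int = 2) -> None:
    """Mark cells behind a detected object (relative to the sensor) occluded."""
    direction = (obj_px - sensor) / np.linalg.norm(obj_px - sensor)
    step = obj_px.astype(float)
    while 0 <= step[0] < N and 0 <= step[1] < N:      # walk the shadow ray outward
        r, c = int(step[0]), int(step[1])
        occluded[max(r - margin_px, 0):r + margin_px + 1,
                 max(c - margin_px, 0):c + margin_px + 1] = True
        step += direction

shadow_behind(np.array([100, 140]))    # one detected object east of the sensor

# Distance (in meters) from every un-occluded cell to the nearest occluded
# cell, usable e.g. to decide when a track is near an occlusion interface.
dist_to_occlusion = distance_transform_edt(~occluded) * RES
print(f"occluded cells: {occluded.sum()}, max clearance: {dist_to_occlusion.max():.1f} m")
```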
- BEV: bird's eye view
- the system 600 may be configured such that the one or more detectors 610, 615 communicate with a tracker 630.
- the tracker 630 may be configured to determine and/or maintain position of objects and/or determine expected trajectories of objects (tracks 635).
- the tracker 630 may also and/or alternatively be configured to maintain keep alive times for respective tracks 635 according to embodiments described herein.
- the tracker may therefore be configured with instructions for determining keep alive times or updating keep alive times as described herein.
- the tracker may also or alternatively be configured to track the elapsed time of an occluded object from the time the occluded object became occluded.
- the tracker may also or alternatively be configured to determine a confidence score in the location, trajectory, and/or presence of an occluded object.
- the tracker 630 may also or alternatively be configured to receive the information from the one or more detectors 610/615 to generate a unified occlusion map 640.
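One plausible fusion rule for the unified occlusion map, assuming the per-detector maps are boolean BEV masks on a shared grid: a cell remains occluded only if no detector positively observes it (logical AND). The fusion logic below is an assumption, sketched for illustration:

```python
import numpy as np

def unify_occlusion_maps(per_detector_occluded: list[np.ndarray]) -> np.ndarray:
    """Fuse per-detector occlusion masks into one unified occlusion map.

    A cell counts as occluded for the vehicle only when it is occluded for
    every detector, i.e. no sensor positively observes it.
    """
    unified = per_detector_occluded[0].copy()
    for mask in per_detector_occluded[1:]:
        unified &= mask
    return unified

lidar_occ = np.array([[True, True], [False, True]])
camera_occ = np.array([[True, False], [False, True]])
print(unify_occlusion_maps([lidar_occ, camera_occ]))
# [[ True False]
#  [False  True]] -> only cells hidden from both sensors stay occluded
```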
- the detector is shown and described as maintaining the tracks of detected objects from a sensor.
- the tracker may also keep or maintain this information.
- the tracker may receive information from the detectors and maintain an amalgamated system of tracks.
- the received information may be raw data of a sensor or analyzed information from a detector.
- the detector may provide any or all of the track functions described herein and/or the tracker may perform any or all of the track functions described herein.
- the tracker and/or detector may therefore work together or replace one or the other.
- obstacles 645 are also detected in order to identify possible occlusions and generate an occlusion map 640.
- the system 600 may be configured such that the tracker communicates with consumers 650.
- the consumers 650 may include one or more separate and/or integrated objects that receive information from the tracker and use it to perform its own function.
- the planner 655 may be used to determine a path for controlling the autonomous vehicle.
- the viewer 660 may display information about the vehicle, including blind spots.
- the localizer 665 may be configured to track the places detected by the one or more detectors/sensors.
- the localizer 665 may be configured to identify areas of uncertainty where a road may be located.
- the localizer may be configured to use the occlusion(s) or occlusion map to reason about a vehicle’s corridor state, and its update.
- When the localizer cannot see a region and thus receives no lane detections in that region, the localizer is configured not to update the lanes as "not there"; instead, the localizer is configured to reason about them in the same way as objects and to maintain their existence as if they went into an occlusion. Uncertainty grows, but the estimated location of the lanes does not move, since the lanes are static and the last time the system detected them was when the occlusion started.
- the localizer may also have a similar function as the occlusion to predict a situation in which the system “has not seen this region ever” so the system can plan cautiously in recognition of this fact.
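A minimal sketch of this lane-persistence behavior, with illustrative names and growth rates (all values are assumptions): the lane estimate is frozen while occluded, and only its uncertainty grows:

```python
import time
from dataclasses import dataclass

@dataclass
class LaneEstimate:
    """Lane boundary kept alive through an occlusion (illustrative only)."""
    offset_m: float              # lateral position, frozen while occluded
    sigma_m: float               # positional uncertainty, 1-sigma
    occluded_since: float | None = None

    def on_no_detection(self, growth_mps: float = 0.05) -> None:
        # No detections in the region: do NOT mark the lane "not there".
        now = time.monotonic()
        if self.occluded_since is None:
            self.occluded_since = now
        # Lanes are static, so the estimate stays put; only confidence decays.
        self.sigma_m = 0.2 + growth_mps * (now - self.occluded_since)

    def on_detection(self, offset_m: float) -> None:
        self.offset_m, self.sigma_m, self.occluded_since = offset_m, 0.2, None

lane = LaneEstimate(offset_m=1.75, sigma_m=0.2)
lane.on_no_detection()
print(f"lane still at {lane.offset_m} m, sigma {lane.sigma_m:.2f} m")
```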
- Exemplary embodiments described herein may be used by the planner 655 for predicting behavior of occluded objects. Exemplary embodiments described herein may be used by the planner for planning, such as in planning a path or speed of the vehicle in response to objects that are no longer observable.
- An exemplary embodiment of the system and method for tracking objects through occluded regions may include:
- Keepalive: set a keep alive duration in each track state.
- Empty occlusion grid: publish an empty occlusion grid from an occlusion generator inside the tracker, in local coordinates, that may be similar to an obstacle grid.
- Grid visualizer: visualize this grid in the viewer.
- Use grid: use this empty grid to update keepalive durations near the end of each tracker cycle.
- an occlusion grid may be generated in the obstacle detector and maintain memory of the last cycle's obstacles. The grid may be used to keep obstacles alive while the obstacles reside in the occluded region around the vehicle.
- the occlusion grid in the tracker may be used to decide on different keep alive durations.
- the occlusion grid may be used to compute the keep alive duration for every track.
- the keep alive duration may be referred to as a time to live (TTL).
- the keep alive time is increased when an object enters an occlusion region.
- the keep alive time may be based on the classification of the object, such as for pedestrians verses vehicles.
- the keep alive time may be reset, such as lowered to a predetermined amount (such as 1 second, 2 seconds, or more).
- current time and last detection time may also or alternatively be used to compute a keep alive duration.
- the keep alive time may be set for all tracks and updated every cycle depending on where the track is, what class it is, and when it was last observed.
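A minimal sketch of such a per-cycle keep-alive computation, with illustrative per-class baselines and an occlusion bonus (all durations are assumptions, not values from this disclosure):

```python
import time

# Illustrative per-class keep-alive baselines (seconds); values are assumptions.
KEEP_ALIVE_BASE_S = {"pedestrian": 8.0, "vehicle": 4.0, "unknown": 2.0}

def keep_alive_duration(object_class: str,
                        in_occlusion: bool,
                        last_detection_t: float,
                        now: float | None = None,
                        occlusion_bonus_s: float = 6.0) -> float:
    """Compute the remaining time-to-live (TTL) for a track.

    The duration depends on where the object is (inside an occlusion region
    or not), what class it is, and when it was last observed, mirroring the
    per-cycle update described above.
    """
    now = time.monotonic() if now is None else now
    ttl = KEEP_ALIVE_BASE_S.get(object_class, KEEP_ALIVE_BASE_S["unknown"])
    if in_occlusion:
        ttl += occlusion_bonus_s        # increased when the object enters an occlusion
    return ttl - (now - last_detection_t)

t0 = time.monotonic()
print(f"pedestrian in occlusion: {keep_alive_duration('pedestrian', True, t0):.1f} s left")
print(f"vehicle in the open:     {keep_alive_duration('vehicle', False, t0):.1f} s left")
```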
- the vehicle may include a planner.
- the planner may be configured to identify one or more objects (such as pedestrians, vegetation, vehicles, or other obstacles) within the environment of the vehicle.
- the planner may be configured to determine a control attribute of the vehicle for autonomous driving.
- the control attribute may be any of a speed, a change in speed, a direction, a path, or other control of the vehicle.
- the planner may comprise one or more computing devices and/or may be a component of one or more computing devices.
- the system described herein may comprise one or more sensors including, for example, LiDAR sensors, RADAR sensors, camera sensors, etc.
- the one or more sensors may be coupled to the vehicle and/or may be positioned at any suitable position or positions along a vehicle (e.g., the front, sides, back, top, bottom, etc.).
- the camera sensor(s) may be in electronic communication with one or more image processors configured to detect objects within the image and/or classify the object, for example, as a pedestrian, a vehicle, vegetation, etc.
- a combination of shape and color may be used in the classification of one or more obstacles.
- one or more of the image detectors may be configured to query one or more color features for each point of a patched and/or full image.
- color features may provide strong representations that may be used to distinguish obstacle classes.
- the classified obstacles may be sent downstream to one or more planning modules.
- the one or more planning modules may be configured to plan a trajectory of the vehicle, including any changes in direction, velocity, etc.
- the one or more planning modules may incorporate high precision data from the environment of the vehicle. The high precision data from the environment of the vehicle may be gathered, calculated, and/or determined via one or more high precision perceived environment modules.
- An exemplary system described herein may be configured to maintain tracking of an object that passes into an occluded area.
- the system may include a detection system configured to define a map of a driving area; one or more sensors configured to generate sensor data; a detector configured to detect an object from the sensor data; an occlusion publisher configured to define one or more occlusion areas within the map; and a tracker configured to create an object track for the object detected by the detector and configured to maintain the object track when the object is determined to enter the one or more occlusion areas.
- Exemplary embodiments of the system may include any combination of the following features: a tracker configured to drop the object track when the occlusion region is cleared; a tracker configured to drop the object track when the object exits the one of the one or more occlusion areas by detecting the object with the one or more sensors; a tracker configured to drop the object track when a probability that the object is no longer in the one or more occlusion areas has surpassed a threshold; a tracker configured to update a probability location of the object while the object remains undetected by the one or more sensors and within the one or more occlusion areas; a tracker configured to define a certainty of the probability location and manifest the certainty as an area of probability in which the object may be within the one or more occlusion areas along the object track; a tracker configured to determine the probability location with less certainty the longer the object remains in the one or more occlusion areas; and a tracker configured to define a reduction in the certainty as a larger area of probability associated with the object along the object track.
- Exemplary embodiments described herein include a system to adjust a driving path of a vehicle.
- the system may include a planner-map configured to identify a driving surface and determine a path on the driving surface; one or more sensors configured to generate sensor data to identify one or more objects relative to the driving surface; an occlusion publisher configured to detect one or more occlusion areas from the sensor data from the one or more sensors; a tracker configured to track one or more tracks of one or more objects within the one or more occlusion areas, wherein the planner-map is configured to add uncertainty of the one or more tracks of one or more objects within the one or more occlusion areas, and adjust the path on the driving surface based on the uncertainty of the one or more tracks within the one or more occlusion areas.
- the system may also or alternatively include any combination of additional features, such as, for example: a planner-map configured to add the uncertainty of the one or more tracks based on a time elapse of an object corresponding to the track that is within the one or more occlusion areas; a planner-map configured to increase the uncertainty of the one or more tracks as time elapses; a planner-map is configured to add the uncertainty based on a size of the one or more occlusion areas; a planner-map is configured to increase the uncertainty with the size of the one or more occlusion areas; a planner-map is configured to adjust the path based on the one or more tracks within the one or more occlusion areas.
- Exemplary embodiments described herein may include a system to remove tracks from a tracker when the tracks are within an occluded region.
- the system may include a planner-map configured to identify a driving surface and determine a path on the driving surface; one or more sensors configured to generate sensor data to identify one or more objects relative to the driving surface; an occlusion publisher configured to detect one or more occlusion areas from the sensor data from the one or more sensors; and a tracker configured to determine that the object entered the occlusion area; wherein the planner-map is configured to create an estimated object location for the object within the occlusion area, and drop the estimated object location for the object when the occlusion area is cleared.
- the system may also or alternatively be configured to clear the occlusion area when an occluding object creating the occlusion area is no longer occluding the one or more sensors, and/or clear the occlusion area when the object is detected by the one or more sensors after leaving the occlusion area.
- FIG. 7 provides an illustration of an example architecture for a computing device 700.
- the computing device 130 of FIG. 1 may be the same as or similar to computing device 700. As such, the discussion of computing device 700 is sufficient for understanding the computing device 130 of FIG. 1, for example.
- Computing device 700 may comprise more or fewer components than those shown in FIG. 7.
- the hardware architecture of FIG. 7 represents one example implementation of a representative computing device configured to perform one or more methods and means for tracking objects through an occluded area within a vehicle environment, and determining a course of action for the vehicle, as described herein.
- the computing device 700 of FIG. 7 implements at least a portion of the method(s) described herein (for example, method 300 of FIGS. 3A-3B).
- the hardware includes, but is not limited to, one or more electronic circuits.
- the electronic circuits can include, but are not limited to, passive components (e.g., resistors and capacitors) and/or active components (e.g., amplifiers and/or microprocessors).
- the passive and/or active components can be adapted to, arranged to and/or programmed to perform one or more of the methodologies, procedures, or functions described herein.
- the computing device 700 comprises a user interface 702, a Central Processing Unit (“CPU”) 706, a system bus 710, a memory 712 connected to and accessible by other portions of computing device 700 through system bus 710, and hardware entities 714 connected to system bus 710.
- the user interface can include input devices and output devices, which facilitate user-software interactions for controlling operations of the computing device 700.
- the input devices include, but are not limited to, a physical and/or touch keyboard 750.
- the input devices can be connected to the computing device 700 via a wired or wireless connection (e.g., a Bluetooth® connection).
- the output devices include, but are not limited to, a speaker 752, a display 754, and/or light emitting diodes 756.
- Hardware entities 714 perform actions involving access to and use of memory 712, which can be a Random Access Memory (RAM), a disk drive and/or a Compact Disc Read Only Memory (CD-ROM), among other suitable memory types.
- Hardware entities 714 can include a disk drive unit 716 comprising a computer-readable storage medium 718 on which is stored one or more sets of instructions 720 (e.g., programming instructions such as, but not limited to, software code) configured to implement one or more of the methodologies, procedures, or functions described herein.
- the instructions 720 can also reside, completely or at least partially, within the memory 712 and/or within the CPU 706 during execution thereof by the computing device 700.
- the memory 712 and the CPU 706 also can constitute machine-readable media.
- machine-readable media refers to a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 720.
- machine-readable media also refers to any medium that is capable of storing, encoding or carrying a set of instructions 720 for execution by the computing device 700 and that cause the computing device 700 to perform any one or more of the methodologies of the present disclosure.
- one or more computer applications 724 may be stored on the memory 712.
- an example vehicle system architecture 800 for a vehicle is provided, in accordance with various embodiments of the present disclosure.
- Vehicle 105 of FIG. 1 can have the same or similar system architecture as that shown in FIG. 8. Thus, the following discussion of vehicle system architecture 800 is sufficient for understanding vehicle 105 of FIG. 1.
- the vehicle system architecture 800 includes an engine, motor or propulsive device (e.g., a thruster) 802 and various sensors 804-818 for measuring various parameters of the vehicle system architecture 800.
- the sensors 804-818 may include, for example, an engine temperature sensor 804, a battery voltage sensor 806, an engine Rotations Per Minute (RPM) sensor 808, and/or a throttle position sensor 810.
- RPM: Rotations Per Minute
- the vehicle may have an electric motor, and accordingly will have sensors such as a battery monitoring system 812 (to measure current, voltage and/or temperature of the battery), motor current 814 and voltage 816 sensors, and motor position sensors such as resolvers and encoders 818.
- Operational parameter sensors that are common to both types of vehicles include, for example: a position sensor 834 such as an accelerometer, gyroscope and/or inertial measurement unit; a speed sensor 836; and/or an odometer sensor 838.
- the vehicle system architecture 800 also may have a clock 842 that the system uses to determine vehicle time during operation.
- the clock 842 may be encoded into the vehicle on-board computing device 820, it may be a separate device, or multiple clocks may be available.
- the vehicle system architecture 800 also may comprise various sensors that operate to gather information about the environment in which the vehicle is traveling. These sensors may include, for example: a location sensor 844 (for example, a Global Positioning System (GPS) device); object detection sensors such as one or more cameras 846; a LiDAR sensor system 848; and/or a radar and/or a sonar system 850.
- the sensors also may comprise environmental sensors 852 such as a precipitation sensor and/or ambient temperature sensor.
- the object detection sensors may enable the vehicle system architecture 800 to detect objects that are within a given distance range of the vehicle 800 in any direction, while the environmental sensors 852 collect data about environmental conditions within the vehicle's area of travel.
- the on-board computing device 820 may be configured to analyze the data captured by the sensors and/or data received from data providers, and may be configured to optionally control operations of the vehicle system architecture 800 based on results of the analysis.
- the on-board computing device 820 may be configured to control: braking via a brake controller 822; direction via a steering controller 824; speed and acceleration via a throttle controller 826 (in a gas-powered vehicle) or a motor speed controller 828 (such as a current level controller in an electric vehicle); a differential gear controller 830 (in vehicles with transmissions); and/or other controllers.
- Geographic location information may be communicated from the location sensor 844 to the on-board computing device 820, which may then access a map of the environment that corresponds to the location information to determine known fixed features of the environment such as streets, buildings, stop signs and/or stop/go signals. Captured images from the cameras 846 and/or object detection information captured from sensors such as LiDAR 848 are communicated from those sensors to the on-board computing device 820. The object detection information and/or captured images are processed by the on-board computing device 820 to detect objects in proximity to the vehicle. Any known or to-be-known technique for making an object detection based on sensor data and/or captured images may be used in the embodiments disclosed in this document.
- Exemplary embodiments described herein include a vehicle having a system for tracking objects through an occlusion as described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
- Optical Communication System (AREA)
- Traffic Control Systems (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2024232545A AU2024232545A1 (en) | 2023-03-06 | 2024-03-05 | Systems and methods for adjusting a driving path using occluded regions |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/179,165 | 2023-03-06 | ||
| US18/179,176 | 2023-03-06 | ||
| US18/179,193 US20240300533A1 (en) | 2023-03-06 | 2023-03-06 | Systems and Methods to Manage Tracking of Objects Through Occluded Regions |
| US18/179,165 US20240300486A1 (en) | 2023-03-06 | 2023-03-06 | Systems and Methods for Managing Tracks Within an Occluded Region |
| US18/179,193 | 2023-03-06 | ||
| US18/179,176 US20240300487A1 (en) | 2023-03-06 | 2023-03-06 | Systems and Methods for Adjusting a Driving Path Using Occluded Regions |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2024186814A2 true WO2024186814A2 (en) | 2024-09-12 |
| WO2024186814A3 WO2024186814A3 (en) | 2024-11-14 |
Family
ID=90719212
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/018516 Pending WO2024186814A2 (en) | 2023-03-06 | 2024-03-05 | Systems and methods for adjusting a driving path using occluded regions |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2024232545A1 (en) |
| WO (1) | WO2024186814A2 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119636750A (en) * | 2025-01-17 | 2025-03-18 | 知行汽车科技(苏州)股份有限公司 | Method, device, equipment and medium for solving visual occlusion |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US228A (en) | 1837-06-10 | Improvement | ||
| US18062A (en) | 1857-08-25 | Washing-machine |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11353577B2 (en) * | 2018-09-28 | 2022-06-07 | Zoox, Inc. | Radar spatial estimation |
| US11188082B2 (en) * | 2019-01-11 | 2021-11-30 | Zoox, Inc. | Occlusion prediction and trajectory evaluation |
- 2024-03-05 WO PCT/US2024/018516 patent/WO2024186814A2/en active Pending
- 2024-03-05 AU AU2024232545A patent/AU2024232545A1/en active Pending
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US228A (en) | 1837-06-10 | Improvement | ||
| US18062A (en) | 1857-08-25 | Washing-machine |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119636750A (en) * | 2025-01-17 | 2025-03-18 | 知行汽车科技(苏州)股份有限公司 | Method, device, equipment and medium for solving visual occlusion |
Also Published As
| Publication number | Publication date |
|---|---|
| AU2024232545A1 (en) | 2025-09-18 |
| WO2024186814A3 (en) | 2024-11-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12319313B2 (en) | Method and system for operating an autonomous agent with incomplete environmental information | |
| US12333389B2 (en) | Autonomous vehicle system for intelligent on-board selection of data for training a remote machine learning model | |
| US10852726B2 (en) | Systems and methods for transitioning a vehicle from an autonomous driving mode to a manual driving mode | |
| US11718290B2 (en) | Methods and systems for safe out-of-lane driving | |
| US12315197B2 (en) | Systems and methods for validating camera calibration in real-time | |
| US12319246B2 (en) | Systems and methods for vehicle camera obstruction detection | |
| JP7376682B2 (en) | Object localization for autonomous driving using visual tracking and image reprojection | |
| US11328602B2 (en) | System and method for navigation with external display | |
| US12358536B2 (en) | Systems and methods for estimating the origins of abnormal driving | |
| US20230386326A1 (en) | Systems and methods for detecting pedestrians with crosswalking or jaywalking intent | |
| US20240300486A1 (en) | Systems and Methods for Managing Tracks Within an Occluded Region | |
| US20240300533A1 (en) | Systems and Methods to Manage Tracking of Objects Through Occluded Regions | |
| WO2024186814A2 (en) | Systems and methods for adjusting a driving path using occluded regions | |
| US20240300487A1 (en) | Systems and Methods for Adjusting a Driving Path Using Occluded Regions | |
| US12043290B2 (en) | State identification for road actors with uncertain measurements based on compliant priors | |
| US12485917B2 (en) | Systems and methods for path planning of autonomous vehicles | |
| US12415513B2 (en) | Systems and methods for controlling a vehicle using high precision and high recall detection | |
| US12043289B2 (en) | Persisting predicted objects for robustness to perception issues in autonomous driving | |
| US20250308253A1 (en) | Method for determining free space in a surrounding of a vehicle, and an apparatus thereof | |
| US20250091569A1 (en) | Systems and methods for parking a following vehicle in a convoy | |
| US20240151817A1 (en) | Systems and methods for static detection based amodalization placement | |
| US20230294737A1 (en) | Control subsystem and method to define the response of an autonomous vehicle to an unknown object | |
| US20250139796A1 (en) | Systems and methods for improving distance predictions |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: AU2024232545; Country of ref document: AU |
| | ENP | Entry into the national phase | Ref document number: 2024232545; Country of ref document: AU; Date of ref document: 20240305; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024716992; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24716992; Country of ref document: EP; Kind code of ref document: A2 |
| | ENP | Entry into the national phase | Ref document number: 2024716992; Country of ref document: EP; Effective date: 20251006 |