US12254772B2 - Signal processing device, signal processing method, and mobile device - Google Patents
- Publication number: US12254772B2 (application US17/753,828)
- Authority
- US
- United States
- Prior art keywords
- grid
- occupancy
- attribute
- map
- sensor
- Prior art date
- Legal status (assumption, not a legal conclusion): Active, expires
Classifications
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
- B60W40/04—Estimation or calculation of non-directly measurable driving parameters related to ambient conditions; Traffic conditions
- G01S13/89—Radar or analogous systems specially adapted for mapping or imaging
- G01S13/931—Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
- G01S17/89—Lidar systems specially adapted for mapping or imaging
- G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S7/4802—Analysis of echo signal for target characterisation; Target signature; Target cross-section
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/70—Labelling scene content, e.g. deriving syntactic or semantic representations
- G08G1/137—Indicating the position of vehicles within the vehicle, the indicator being in the form of a map
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
- B60W2554/20—Input parameters relating to objects; Static objects
- B60W2554/40—Input parameters relating to objects; Dynamic objects, e.g. animals, windblown objects
Definitions
- in Patent Document 1, only a case has been studied in which a grid is included in a blind spot area when an obstacle is a stationary object. Therefore, for example, in a case where erroneous detection, detection omission, or the like of an obstacle occurs due to noise or the like, it is difficult to set an appropriate path on the basis of the occupancy grid map.
- the present technology has been made in view of such a situation, and aims to make it possible to obtain an occupancy grid map on which an appropriate path can be set.
- a signal processing device includes: a map creation unit configured to create an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of first sensor data from a first sensor used to detect the object in surroundings of a mobile device; an attribute setting unit configured to set an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and a correction unit configured to correct the occupancy grid map on the basis of an attribute of the grid.
- a signal processing method includes: creating an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings of a mobile device; setting an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and correcting the occupancy grid map on the basis of an attribute of the grid.
- a program causes a computer to execute processing of: creating an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings of a mobile device; setting an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and correcting the occupancy grid map on the basis of an attribute of the grid.
- a mobile device includes: a map creation unit configured to create an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings; an attribute setting unit configured to set an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; a correction unit configured to correct the occupancy grid map on the basis of an attribute of the grid; and an action planning unit configured to set a path on the basis of the corrected occupancy grid map.
- the occupancy grid map indicating a presence or absence of the object in a unit of a grid is created on the basis of sensor data from the sensor used to detect the object in surroundings of the mobile device, an attribute of the grid of the occupancy grid map is set on the basis of an attribute of the object, and the occupancy grid map is corrected on the basis of an attribute of the grid.
- the occupancy grid map indicating a presence or absence of the object in a unit of a grid is created on the basis of sensor data from the sensor used to detect the object in surroundings, an attribute of the grid of the occupancy grid map is set on the basis of an attribute of the object, the occupancy grid map is corrected on the basis of an attribute of the grid, and a path is set on the basis of the corrected occupancy grid map.
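The three-step pipeline summarized above (create the occupancy grid map, set grid attributes, correct the map) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the grid size, cell resolution, and the use of a 'noise' attribute as the correction criterion are all assumptions.

```python
# Hypothetical sketch of the claimed pipeline. Grid values: 1 = occupied, 0 = free.
SIZE, RESOLUTION = 10, 1.0  # assumed grid dimensions and cell size in meters

def to_cell(x, y):
    """Map a sensor-frame coordinate to a grid cell index."""
    return int(x / RESOLUTION), int(y / RESOLUTION)

def create_occupancy_grid(points):
    """Step 1: mark cells containing detected points as occupied."""
    grid = [[0] * SIZE for _ in range(SIZE)]
    for x, y in points:
        i, j = to_cell(x, y)
        if 0 <= i < SIZE and 0 <= j < SIZE:
            grid[i][j] = 1
    return grid

def set_attributes(points, labels):
    """Step 2: assign each occupied cell the attribute of the object detected there."""
    attrs = {}
    for (x, y), label in zip(points, labels):
        attrs[to_cell(x, y)] = label
    return attrs

def correct_map(grid, attrs):
    """Step 3: clear cells whose attribute marks them as likely noise."""
    corrected = [row[:] for row in grid]
    for (i, j), label in attrs.items():
        if label == 'noise':
            corrected[i][j] = 0
    return corrected

points = [(2.5, 3.5), (7.2, 1.1)]       # assumed detections
labels = ['static', 'noise']            # assumed object attributes
grid = create_occupancy_grid(points)
corrected = correct_map(grid, set_attributes(points, labels))
```

With these assumed inputs, both detections occupy a cell in the raw map, but only the cell attributed to a real (static) object survives the correction step.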
- FIG. 2 is a view illustrating an example of a sensing area.
- FIGS. 5A and 5B are views for explaining a problem of gradual decrease processing of a presence probability of the occupancy grid map.
- FIGS. 6A and 6B are views for explaining a problem of the gradual decrease processing of a presence probability of the occupancy grid map.
- FIG. 7 is a block diagram illustrating a configuration example of an information processing unit to which the present technology is applied.
- FIG. 8 is a flowchart for explaining map creation processing.
- FIG. 9 is a table illustrating an expansion amount for each type of a mobile object.
- FIG. 10 is a graph illustrating an example of transition of a presence probability in a case where reliability of LiDAR is high.
- FIG. 11 is a graph illustrating an example of transition of a presence probability in a case where reliability of LiDAR is low.
- FIG. 12 is a view schematically illustrating an example of the occupancy grid map.
- FIG. 13 is a view illustrating an example of an occupied area of a pedestrian.
- FIG. 14 is a view illustrating an example of an occupied area of a vehicle.
- FIG. 15 is a block diagram illustrating a modification of the information processing unit to which the present technology is applied.
- FIG. 16 is a view illustrating a modification of the occupied area of the vehicle.
- FIG. 17 is a block diagram illustrating a configuration example of a computer.
- FIG. 1 is a block diagram illustrating a configuration example of a vehicle control system 11 , which is an example of a mobile device control system to which the present technology is applied.
- the vehicle control system 11 is provided in a vehicle 1 and performs processing related to travel assistance and automatic driving of the vehicle 1 .
- the vehicle control system 11 includes a processor 21 , a communication unit 22 , a map information accumulation unit 23 , a global navigation satellite system (GNSS) reception unit 24 , an external recognition sensor 25 , an in-vehicle sensor 26 , a vehicle sensor 27 , a recording unit 28 , a travel assistance/automatic driving control unit 29 , a driver monitoring system (DMS) 30 , a human machine interface (HMI) 31 , and a vehicle control unit 32 .
- the processor 21 , the communication unit 22 , the map information accumulation unit 23 , the GNSS reception unit 24 , the external recognition sensor 25 , the in-vehicle sensor 26 , the vehicle sensor 27 , the recording unit 28 , the travel assistance/automatic driving control unit 29 , the driver monitoring system (DMS) 30 , the human machine interface (HMI) 31 , and the vehicle control unit 32 are connected to each other via a communication network 41 .
- the communication network 41 includes, for example, a bus, an in-vehicle communication network conforming to any standard such as a controller area network (CAN), a local interconnect network (LIN), a local area network (LAN), FlexRay (registered trademark), or Ethernet (registered trademark), and the like. Note that each unit of the vehicle control system 11 may also be directly connected by, for example, short-range wireless communication such as near field communication (NFC) or Bluetooth (registered trademark), without going through the communication network 41 .
- the processor 21 includes various processors such as, for example, a central processing unit (CPU), a micro processing unit (MPU), and an electronic control unit (ECU).
- the processor 21 controls the entire vehicle control system 11 .
- the communication unit 22 communicates with various types of equipment inside and outside the vehicle, other vehicles, servers, base stations, and the like, and transmits and receives various data.
- the communication unit 22 receives, from the outside, a program for updating software for controlling an operation of the vehicle control system 11 , map information, traffic information, information around the vehicle 1 , and the like.
- the communication unit 22 transmits information regarding the vehicle 1 (for example, data indicating a state of the vehicle 1 , a recognition result by a recognition unit 73 , and the like), information around the vehicle 1 , and the like to the outside.
- the communication unit 22 performs communication corresponding to a vehicle emergency call system such as an eCall.
- a communication method of the communication unit 22 is not particularly limited. Furthermore, a plurality of communication methods may be used.
- the communication unit 22 performs wireless communication with in-vehicle equipment by a communication method such as wireless LAN, Bluetooth, NFC, or wireless USB (WUSB).
- the communication unit 22 performs wired communication with in-vehicle equipment through a communication method such as a universal serial bus (USB), a high-definition multimedia interface (HDMI, registered trademark), or a mobile high-definition link (MHL), via a connection terminal (not illustrated) (and a cable if necessary).
- the in-vehicle equipment is, for example, equipment that is not connected to the communication network 41 in the vehicle.
- mobile equipment or wearable equipment carried by a passenger such as a driver, information equipment brought into the vehicle and temporarily installed, and the like are assumed.
- the communication unit 22 uses a wireless communication method such as a fourth generation mobile communication system (4G), a fifth generation mobile communication system (5G), long term evolution (LTE), or dedicated short range communications (DSRC), to communicate with a server or the like existing on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point.
- the communication unit 22 uses a peer to peer (P2P) technology to communicate with a terminal (for example, a terminal of a pedestrian or a store, or a machine type communication (MTC) terminal) existing near the own vehicle.
- the communication unit 22 performs V2X communication.
- the V2X communication is, for example, vehicle to vehicle communication with another vehicle, vehicle to infrastructure communication with a roadside device or the like, vehicle to home communication, vehicle to pedestrian communication with a terminal or the like possessed by a pedestrian, or the like.
- the communication unit 22 receives an electromagnetic wave transmitted by a road traffic information communication system (vehicle information and communication system (VICS), registered trademark), such as a radio wave beacon, an optical beacon, or FM multiplex broadcasting.
- the map information accumulation unit 23 accumulates a map acquired from the outside and a map created by the vehicle 1 .
- the map information accumulation unit 23 accumulates a three-dimensional high-precision map, a global map having lower accuracy than the high-precision map and covering a wide area, and the like.
- the high-precision map is, for example, a dynamic map, a point cloud map, a vector map (also referred to as an advanced driver assistance system (ADAS) map), or the like.
- the dynamic map is, for example, a map including four layers of dynamic information, semi-dynamic information, semi-static information, and static information, and is supplied from an external server or the like.
- the point cloud map is a map including a point cloud (point group data).
- the vector map is a map in which information such as a lane and a position of a traffic light is associated with the point cloud map.
- the point cloud map and the vector map may be supplied from, for example, an external server or the like, or may be created by the vehicle 1 as a map for performing matching with a local map to be described later on the basis of a sensing result by a radar 52 , a LiDAR 53 , or the like, and may be accumulated in the map information accumulation unit 23 . Furthermore, in a case where the high-precision map is supplied from an external server or the like, in order to reduce a communication capacity, for example, map data of several hundred meters square regarding a planned path on which the vehicle 1 will travel is acquired from a server or the like.
- the GNSS reception unit 24 receives a GNSS signal from a GNSS satellite and supplies it to the travel assistance/automatic driving control unit 29 .
- the external recognition sensor 25 includes various sensors used for recognizing a situation outside the vehicle 1 , and supplies sensor data from each sensor to each unit of the vehicle control system 11 . Any type and number of sensors included in the external recognition sensor 25 may be adopted.
- the external recognition sensor 25 includes, a camera 51 , the radar 52 , the light detection and ranging or laser imaging detection and ranging (LiDAR) 53 , and an ultrasonic sensor 54 .
- any number of the cameras 51 , the radars 52 , the LiDARs 53 , and the ultrasonic sensors 54 may be adopted, and an example of a sensing area of each sensor will be described later.
- as the camera 51 , for example, a camera of any image capturing system such as a time of flight (ToF) camera, a stereo camera, a monocular camera, or an infrared camera is used as necessary.
- the external recognition sensor 25 includes an environment sensor for detection of weather, a meteorological state, a brightness, and the like.
- the environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, an illuminance sensor, and the like.
- the external recognition sensor 25 includes a microphone to be used to detect sound around the vehicle 1 , a position of a sound source, and the like.
- the in-vehicle sensor 26 includes various sensors for detection of information inside the vehicle, and supplies sensor data from each sensor to each unit of the vehicle control system 11 . Any type and number of sensors included in the in-vehicle sensor 26 may be adopted.
- the in-vehicle sensor 26 includes a camera, a radar, a seating sensor, a steering wheel sensor, a microphone, a biological sensor, and the like.
- as the camera, for example, a camera of any image capturing system such as a ToF camera, a stereo camera, a monocular camera, or an infrared camera can be used.
- the biological sensor is provided, for example, in a seat, a steering wheel, or the like, and detects various kinds of biological information of a passenger such as a driver.
- the vehicle sensor 27 includes various sensors for detection of a state of the vehicle 1 , and supplies sensor data from each sensor to each unit of the vehicle control system 11 . Any type and number of sensors included in the vehicle sensor 27 may be adopted.
- the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU).
- the vehicle sensor 27 includes a steering angle sensor that detects a steering angle of a steering wheel, a yaw rate sensor, an accelerator sensor that detects an operation amount of an accelerator pedal, and a brake sensor that detects an operation amount of a brake pedal.
- the vehicle sensor 27 includes a rotation sensor that detects a number of revolutions of an engine or a motor, an air pressure sensor that detects an air pressure of a tire, a slip rate sensor that detects a slip rate of a tire, and a wheel speed sensor that detects a rotation speed of a wheel.
- the vehicle sensor 27 includes a battery sensor that detects a remaining amount and a temperature of a battery, and an impact sensor that detects an external impact.
- the recording unit 28 includes, for example, a read only memory (ROM), a random access memory (RAM), a magnetic storage device such as a hard disc drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, and the like.
- the recording unit 28 stores various programs, data, and the like used by each unit of the vehicle control system 11 .
- the recording unit 28 records a rosbag file including a message transmitted and received by a Robot Operating System (ROS) in which an application program related to automatic driving operates.
- the recording unit 28 includes an Event Data Recorder (EDR) and a Data Storage System for Automated Driving (DSSAD), and records information of the vehicle 1 before and after an event such as an accident.
- the travel assistance/automatic driving control unit 29 controls travel support and automatic driving of the vehicle 1 .
- the travel assistance/automatic driving control unit 29 includes an analysis unit 61 , an action planning unit 62 , and an operation control unit 63 .
- the analysis unit 61 performs analysis processing on a situation of the vehicle 1 and surroundings.
- the analysis unit 61 includes an own-position estimation unit 71 , a sensor fusion unit 72 , and the recognition unit 73 .
- the own-position estimation unit 71 estimates an own-position of the vehicle 1 on the basis of sensor data from the external recognition sensor 25 and a high-precision map accumulated in the map information accumulation unit 23 .
- the own-position estimation unit 71 generates a local map on the basis of sensor data from the external recognition sensor 25 , and estimates the own-position of the vehicle 1 by performing matching of the local map with the high-precision map.
- the position of the vehicle 1 is based on, for example, a center of a rear wheel pair axle.
- the local map is, for example, a three-dimensional high-precision map, an occupancy grid map, or the like created using a technique such as simultaneous localization and mapping (SLAM).
- the three-dimensional high-precision map is, for example, the above-described point cloud map or the like.
- the occupancy grid map is a map in which a three-dimensional or two-dimensional space around the vehicle 1 is divided into grids of a predetermined size, and an occupancy state of an object is indicated in a unit of a grid.
- the occupancy state of the object is indicated by, for example, a presence or absence or a presence probability of the object.
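As background, one common way to maintain a presence probability per grid cell is the standard log-odds formulation, in which each sensor observation adds or subtracts a fixed log-odds increment. The patent does not prescribe this method; the sensor-model values below are assumptions.

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-l))

# Assumed inverse sensor model: probability a cell is occupied given a
# detection (hit) or given no detection (miss) in that cell.
P_HIT, P_MISS = 0.7, 0.4

def update(l, hit):
    """Add the observation's log-odds increment to the cell's running value."""
    return l + (logit(P_HIT) if hit else logit(P_MISS))

l = 0.0  # log-odds 0 corresponds to the uninformed prior, probability 0.5
for observation in [True, True, True]:  # three consecutive detections
    l = update(l, observation)
```

After three consecutive hits the cell's presence probability rises well above the 0.5 prior; a run of misses would symmetrically drive it back down, which is the "gradual decrease" behavior the figures above discuss.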
- the local map is also used for detection processing and recognition processing of a situation outside the vehicle 1 by the recognition unit 73 , for example.
- the own-position estimation unit 71 may estimate the own-position of the vehicle 1 on the basis of a GNSS signal and sensor data from the vehicle sensor 27 .
- the sensor fusion unit 72 performs sensor fusion processing of combining a plurality of different types of sensor data (for example, image data supplied from the camera 51 and sensor data supplied from the radar 52 ) to obtain new information.
- Methods for combining different types of sensor data include integration, fusion, association, and the like.
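As a minimal illustration of one such combining method, two independent estimates of the same quantity (say, an object range derived from camera image data and one from radar) can be fused by inverse-variance weighting. This is a generic textbook scheme, not one the patent specifies; the measurement values and variances are assumptions.

```python
def fuse(x1, var1, x2, var2):
    """Fuse two independent estimates by inverse-variance weighting."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # fused estimate is tighter than either input
    return fused, fused_var

# Assumed example: camera-derived range 10.2 m, radar range 9.8 m,
# both with unit variance.
fused, fused_var = fuse(10.2, 1.0, 9.8, 1.0)
```

With equal variances the result is the mean of the two measurements, and the fused variance is halved, which is why combining sensors yields "new information" beyond either sensor alone.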
- the recognition unit 73 performs detection processing and recognition processing of a situation outside the vehicle 1 .
- the recognition unit 73 performs detection processing and recognition processing of a situation outside the vehicle 1 on the basis of information from the external recognition sensor 25 , information from the own-position estimation unit 71 , information from the sensor fusion unit 72 , and the like.
- the recognition unit 73 performs detection processing, recognition processing, and the like of an object around the vehicle 1 .
- the detection processing of the object is, for example, processing of detecting a presence or absence, a size, a shape, a position, a movement, and the like of the object.
- the recognition processing of the object is, for example, processing of recognizing an attribute such as a type of the object or identifying a specific object.
- the detection processing and the recognition processing are not necessarily clearly divided, and may overlap.
- the recognition unit 73 detects objects around the vehicle 1 by performing clustering that classifies a point cloud, based on sensor data from the LiDAR, the radar, or the like, into clusters of point groups. As a result, the presence or absence, size, shape, and position of objects around the vehicle 1 are detected.
- the recognition unit 73 detects the movement of objects around the vehicle 1 by performing tracking, that is, following the movement of the clusters of point groups classified by the clustering. As a result, the speed and traveling direction (movement vector) of objects around the vehicle 1 are detected.
- the recognition unit 73 recognizes the type of an object around the vehicle 1 by performing object recognition processing such as semantic segmentation on image data supplied from the camera 51 .
- as the object to be detected or recognized, for example, a vehicle, a person, a bicycle, an obstacle, a structure, a road, a traffic light, a traffic sign, a road sign, and the like are assumed.
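The clustering and tracking steps described above can be sketched as follows: nearby points are grouped by a distance threshold, and a movement vector is estimated from cluster centroids across two frames. This is a naive illustrative sketch (real systems use more robust clustering and data association); the threshold and point data are assumptions.

```python
def cluster(points, threshold=1.0):
    """Naively group 2-D points: a point joins the first cluster that
    contains a point within `threshold`, else starts a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if any((p[0]-q[0])**2 + (p[1]-q[1])**2 <= threshold**2 for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def centroid(c):
    """Mean position of a cluster of points."""
    n = len(c)
    return (sum(p[0] for p in c) / n, sum(p[1] for p in c) / n)

# Assumed point data: two frames, each with one nearby object (two returns)
# and one distant object (one return).
frame0 = [(0.0, 0.0), (0.5, 0.2), (10.0, 10.0)]
frame1 = [(1.0, 0.1), (1.5, 0.3), (10.0, 10.0)]
c0 = [centroid(c) for c in cluster(frame0)]
c1 = [centroid(c) for c in cluster(frame1)]
# Movement vector of the first cluster between frames (naive association
# by cluster order; real tracking associates clusters explicitly).
vx, vy = c1[0][0] - c0[0][0], c1[0][1] - c0[0][1]
```

The centroid displacement between frames gives the cluster's movement vector, from which a speed and traveling direction follow directly.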
- the recognition unit 73 performs recognition processing of traffic rules around the vehicle 1 on the basis of a map accumulated in the map information accumulation unit 23 , an estimation result of the own-position, and a recognition result of the object around the vehicle 1 .
- by this processing, for example, a position and a state of a traffic light, contents of a traffic sign and a road sign, contents of a traffic regulation, a travelable lane, and the like are recognized.
- the recognition unit 73 performs recognition processing of an environment around the vehicle 1 .
- as the surrounding environment to be recognized, for example, weather, a temperature, a humidity, a brightness, road surface conditions, and the like are assumed.
- the action planning unit 62 creates an action plan of the vehicle 1 .
- the action planning unit 62 creates an action plan by performing processing of path planning and path following.
- path planning is processing of planning a rough path from a start to a goal.
- such path planning is also referred to as track planning, and includes processing of track generation (local path planning) that enables safe and smooth traveling in the vicinity of the vehicle 1 , in consideration of the motion characteristics of the vehicle 1 on the path planned by the path planning.
- Path following is processing of planning an operation for safely and accurately traveling a path planned by the path planning within a planned time. For example, a target speed and a target angular velocity of the vehicle 1 are calculated.
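As an illustration of computing such targets, a pure-pursuit-style steering law derives a target angular velocity that turns the vehicle toward a look-ahead point on the planned path. The patent does not specify this method; the pose, look-ahead point, and speed below are assumptions.

```python
import math

def target_angular_velocity(pose, lookahead, v):
    """Pure-pursuit-style target angular velocity.

    pose      -- (x, y, heading) of the vehicle
    lookahead -- (x, y) point on the planned path to steer toward
    v         -- target speed
    """
    x, y, theta = pose
    # angle from the vehicle heading to the look-ahead point
    alpha = math.atan2(lookahead[1] - y, lookahead[0] - x) - theta
    # distance to the look-ahead point
    L = math.hypot(lookahead[0] - x, lookahead[1] - y)
    # curvature of the arc through the look-ahead point, scaled by speed
    return 2.0 * v * math.sin(alpha) / L

# Assumed example: vehicle at the origin heading along +x, path point 5 m ahead.
omega_straight = target_angular_velocity((0.0, 0.0, 0.0), (5.0, 0.0), 2.0)
omega_turn = target_angular_velocity((0.0, 0.0, 0.0), (0.0, 5.0), 2.0)
```

A look-ahead point dead ahead yields zero angular velocity, while a point off to the side yields a proportionally larger turn rate, which is the intuition behind tracking a planned path "safely and accurately."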
- the operation control unit 63 controls an operation of the vehicle 1 in order to realize the action plan created by the action planning unit 62 .
- the operation control unit 63 controls a steering control unit 81 , a brake control unit 82 , and a drive control unit 83 to perform acceleration/deceleration control and direction control such that the vehicle 1 travels on a track calculated by the track planning.
- the operation control unit 63 performs cooperative control for the purpose of implementing functions of the ADAS, such as collision avoidance or impact mitigation, follow-up traveling, vehicle speed maintaining traveling, collision warning of the own vehicle, lane deviation warning of the own vehicle, and the like.
- the operation control unit 63 performs cooperative control for the purpose of automatic driving or the like of autonomously traveling without depending on an operation of the driver.
- the DMS 30 performs driver authentication processing, recognition processing of a state of the driver, and the like on the basis of sensor data from the in-vehicle sensor 26 , input data inputted to the HMI 31 , and the like.
- As the state of the driver to be recognized, for example, a physical condition, an awakening level, a concentration level, a fatigue level, a line-of-sight direction, a drunkenness level, a driving operation, a posture, and the like are assumed.
- the DMS 30 may perform authentication processing of a passenger other than the driver and recognition processing of a state of the passenger. Furthermore, for example, the DMS 30 may perform recognition processing of a situation inside the vehicle on the basis of sensor data from the in-vehicle sensor 26 . As the situation inside the vehicle to be recognized, for example, a temperature, a humidity, a brightness, odor, and the like are assumed.
- the HMI 31 is used for inputting various data, instructions, and the like, generates an input signal on the basis of the inputted data, instructions, and the like, and supplies the input signal to each unit of the vehicle control system 11 .
- the HMI 31 includes: operation devices such as a touch panel, a button, a microphone, a switch, and a lever; an operation device that allows input by a method other than manual operation, such as voice or a gesture; and the like.
- the HMI 31 may be a remote control device using infrared ray or other radio waves, or external connection equipment such as mobile equipment or wearable equipment corresponding to an operation of the vehicle control system 11 .
- the HMI 31 performs output control to control generation and output of visual information, auditory information, and tactile information to the passenger or the outside of the vehicle, and to control output contents, output timings, an output method, and the like.
- the visual information is, for example, information indicated by an image or light such as an operation screen, a state display of the vehicle 1 , a warning display, or a monitor image indicating a situation around the vehicle 1 .
- the auditory information is, for example, information indicated by sound such as guidance, warning sound, or a warning message.
- the tactile information is, for example, information given to a tactile sense of the passenger by a force, a vibration, a movement, or the like.
- As a device that outputs visual information, for example, a display device, a projector, a navigation device, an instrument panel, a camera monitoring system (CMS), an electronic mirror, a lamp, and the like are assumed.
- the display device may be, for example, a device that displays visual information in a passenger's field of view, such as a head-up display, a transmissive display, or a wearable device having an augmented reality (AR) function, in addition to a device having a normal display.
- As a device that outputs auditory information, for example, an audio speaker, a headphone, an earphone, or the like is assumed.
- As a device that outputs tactile information, for example, a haptic element using haptic technology, or the like, is assumed.
- the haptic element is provided, for example, on a steering wheel, a seat, or the like.
- the vehicle control unit 32 controls each unit of the vehicle 1 .
- the vehicle control unit 32 includes the steering control unit 81 , the brake control unit 82 , the drive control unit 83 , a body system control unit 84 , a light control unit 85 , and a horn control unit 86 .
- the steering control unit 81 performs detection, control, and the like of a state of a steering system of the vehicle 1 .
- the steering system includes, for example, a steering mechanism including a steering wheel and the like, an electric power steering, and the like.
- the steering control unit 81 includes, for example, a controlling unit such as an ECU that controls the steering system, an actuator that drives the steering system, and the like.
- the brake control unit 82 performs detection, control, and the like of a state of a brake system of the vehicle 1 .
- the brake system includes, for example, a brake mechanism including a brake pedal, an antilock brake system (ABS), and the like.
- the brake control unit 82 includes, for example, a controlling unit such as an ECU that controls a brake system, an actuator that drives the brake system, and the like.
- the drive control unit 83 performs detection, control, and the like of a state of a drive system of the vehicle 1 .
- the drive system includes, for example, an accelerator pedal, a driving force generation device for generation of a driving force such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmission of the driving force to wheels, and the like.
- the drive control unit 83 includes, for example, a controlling unit such as an ECU that controls the drive system, an actuator that drives the drive system, and the like.
- the body system control unit 84 performs detection, control, and the like of a state of a body system of the vehicle 1 .
- the body system includes, for example, a keyless entry system, a smart key system, a power window device, a power seat, an air conditioner, an airbag, a seat belt, a shift lever, and the like.
- the body system control unit 84 includes, for example, a controlling unit such as an ECU that controls the body system, an actuator that drives the body system, and the like.
- the light control unit 85 performs detection, control, and the like of a state of various lights of the vehicle 1 .
- As the lights to be controlled, for example, a headlight, a backlight, a fog light, a turn signal, a brake light, a projection, a display of a bumper, and the like are assumed.
- the light control unit 85 includes a controlling unit such as an ECU that controls lights, an actuator that drives lights, and the like.
- the horn control unit 86 performs detection, control, and the like of a state of the car horn of the vehicle 1 .
- the horn control unit 86 includes, for example, a controlling unit such as an ECU that controls the car horn, an actuator that drives the car horn, and the like.
- FIG. 2 is a view illustrating an example of a sensing area by the camera 51 , the radar 52 , the LiDAR 53 , and the ultrasonic sensor 54 of the external recognition sensor 25 in FIG. 1 .
- Sensing areas 101 F and 101 B illustrate examples of sensing areas of the ultrasonic sensor 54 .
- the sensing area 101 F covers a periphery of a front end of the vehicle 1 .
- the sensing area 101 B covers a periphery of a rear end of the vehicle 1 .
- Sensing results in the sensing areas 101 F and 101 B are used, for example, for parking assistance and the like of the vehicle 1 .
- Sensing areas 102 F to 102 B illustrate examples of sensing areas of the radar 52 for a short distance or a middle distance.
- the sensing area 102 F covers a position farther than the sensing area 101 F in front of the vehicle 1 .
- the sensing area 102 B covers a position farther than the sensing area 101 B behind the vehicle 1 .
- the sensing area 102 L covers a rear periphery of a left side surface of the vehicle 1 .
- the sensing area 102 R covers a rear periphery of a right side surface of the vehicle 1 .
- a sensing result in the sensing area 102 F is used, for example, for detection of a vehicle, a pedestrian, or the like existing in front of the vehicle 1 , and the like.
- a sensing result in the sensing area 102 B is used, for example, for a collision prevention function or the like behind the vehicle 1 .
- Sensing results in the sensing areas 102 L and 102 R are used, for example, for detection of an object in a blind spot on a side of the vehicle 1 , and the like.
- Sensing areas 103 F to 103 B illustrate examples of sensing areas by the camera 51 .
- the sensing area 103 F covers a position farther than the sensing area 102 F in front of the vehicle 1 .
- the sensing area 103 B covers a position farther than the sensing area 102 B behind the vehicle 1 .
- the sensing area 103 L covers a periphery of a left side surface of the vehicle 1 .
- the sensing area 103 R covers a periphery of a right side surface of the vehicle 1 .
- a sensing result in the sensing area 103 F is used for, for example, recognition of a traffic light or a traffic sign, a lane departure prevention assist system, and the like.
- a sensing result in the sensing area 103 B is used for, for example, parking assistance, a surround view system, and the like.
- Sensing results in the sensing areas 103 L and 103 R are used, for example, in a surround view system or the like.
- a sensing area 104 shows an example of a sensing area of the LiDAR 53 .
- the sensing area 104 covers a position farther than the sensing area 103 F in front of the vehicle 1 . Whereas, the sensing area 104 has a narrower range in a left-right direction than the sensing area 103 F.
- a sensing result in the sensing area 104 is used for, for example, emergency braking, collision avoidance, pedestrian detection, and the like.
- a sensing area 105 illustrates an example of a sensing area of the radar 52 for a long distance.
- the sensing area 105 covers a position farther than the sensing area 104 in front of the vehicle 1 . Whereas, the sensing area 105 has a narrower range in a left-right direction than the sensing area 104 .
- a sensing result in the sensing area 105 is used for, for example, adaptive cruise control (ACC) and the like.
- each sensor may have various configurations other than those in FIG. 2 .
- the ultrasonic sensor 54 may also perform sensing on a side of the vehicle 1
- the LiDAR 53 may perform sensing behind the vehicle 1 .
- the occupancy grid map is a map representing an occupancy state of an object in a space around the vehicle 1 in units of grids.
- the occupancy state of each grid is represented by a presence probability of an object.
- the presence probability is expressed, for example, within a range from 0 (an object is not present) to 1 (an object is present), and approaches 1 as the probability that an object is present increases, and approaches 0 as the probability that an object is present decreases.
- the occupancy state of each grid is represented by a presence or absence of an object.
- the occupancy state of each grid is represented by three types of values: present (an object is present), absent (an object is not present), and unknown.
- the occupancy state is set to “present” in a case where the presence probability is equal to or greater than a predetermined threshold value
- the occupancy state is set to “absent” in a case where the presence probability is less than the predetermined threshold value.
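The thresholding described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the function name, the threshold value, and the `observed` flag (used to model the "unknown" case) are assumptions.

```python
# Hypothetical sketch: mapping a grid's presence probability to the
# three-valued occupancy state. Threshold and names are assumptions.

PRESENCE_THRESHOLD = 0.5  # assumed "predetermined threshold value"

def occupancy_state(presence_probability, observed=True):
    """Classify a grid as "present", "absent", or "unknown".

    A grid with no observation (e.g. occluded) stays "unknown";
    otherwise the presence probability is compared with the threshold.
    """
    if not observed:
        return "unknown"
    return "present" if presence_probability >= PRESENCE_THRESHOLD else "absent"
```

A probability at or above the threshold yields "present"; below it, "absent".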
- the action planning unit 62 of the vehicle 1 sets a nearest path so as to pass through an area where no object is present in the occupancy grid map.
- expansion processing is usually performed on the occupancy grid map.
- the expansion processing is processing of expanding, to an adjacent grid, a boundary of an area (hereinafter, referred to as an occupied area) where each object occupies (is present) on the occupancy grid map.
- an occupied area a boundary of an area where each object occupies (is present) on the occupancy grid map.
- the occupied area of each object becomes larger than an actual area of the object in the occupancy grid map, and a risk of collision or contact with the object is reduced even if the vehicle 1 sets a path near the boundary of the occupied area.
- a margin for each object increases, and a path can be set that enables safe avoidance even if a mobile object such as a vehicle or a person moves.
- On the other hand, a range in which the vehicle 1 can pass is narrowed, which may make it difficult to set a path or may cause the vehicle 1 to take a detour.
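As a rough illustration of the expansion processing, the sketch below dilates every occupied grid into its neighbors. The 4-neighbour dilation, the 0/1 grid encoding, and the function name are assumptions for illustration; the patent only requires that occupied-area boundaries be expanded to adjacent grids.

```python
# Minimal dilation sketch: each occupied cell (value 1) spreads to its
# four neighbours, repeated `amount` times. Encoding is an assumption.

def expand(grid, amount=1):
    """Dilate occupied cells of a 2-D occupancy grid by `amount` cells."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(amount):
        out = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1:
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < rows and 0 <= nc < cols:
                            out[nr][nc] = 1
        grid = out
    return grid
```

A single occupied cell in a 3x3 grid expands into a cross of five occupied cells after one pass.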
- FIGS. 3 A and 3 B illustrate an example in which the vehicle 1 is about to pass between a stationary object 201 and a stationary object 202 .
- FIG. 3 A illustrates an example in which the expansion amount is reduced.
- an occupied area 211 A for the stationary object 201 and an occupied area 212 A for the stationary object 202 decrease.
- a space that allows traveling is secured between the occupied area 211 A and the occupied area 212 A, and the vehicle 1 can travel in a direction of arrow A 1 .
- FIG. 3 B illustrates an example in which the expansion amount is increased.
- an occupied area 211 B for the stationary object 201 and an occupied area 212 B for the stationary object 202 are increased.
- a space that allows traveling is not secured between the occupied area 211 B and the occupied area 212 B, and the vehicle 1 cannot move in a direction of arrow A 2 and needs to detour.
- FIGS. 4 A and 4 B illustrate an example in which the vehicle 1 is about to pass by a person 221 .
- FIG. 4 A illustrates an example in which the expansion amount is reduced.
- an occupied area 231 A for the person 221 becomes small, and a margin decreases. Therefore, the vehicle 1 travels immediately near the person 221 , and a risk of collision or contact with the person 221 increases.
- FIG. 4 B illustrates an example in which the expansion amount is increased.
- an occupied area 231 B for the person 221 is increased, and a margin is increased. Therefore, the vehicle 1 travels at a position away from the person 221 , and the risk of collision or contact with the person 221 is reduced.
- gradual decrease processing and gradual increase processing of the presence probability are performed on the occupancy grid map.
- the gradual decrease processing of the presence probability is processing of gradually decreasing the presence probability of a certain grid to 0 over time, without immediately changing it to 0, even when an object is no longer detected in the grid.
- the gradual increase processing of the presence probability is processing of gradually increasing the presence probability of a certain grid to 1 over time, without immediately changing it to 1, even when an object is detected in the grid.
- a gradual decrease time and a gradual increase time are set to be constant.
- the gradual decrease time is a time required for the presence probability to change from 1 to 0.
- the gradual increase time is a time required for the presence probability to change from 0 to 1.
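The gradual increase/decrease processing can be sketched as a fixed per-cycle step toward 1 or 0. The step sizes below are illustrative assumptions; the text above only requires that the gradual increase time and gradual decrease time be constant.

```python
# Hedged sketch of gradual increase / gradual decrease processing.
# Step sizes are assumptions: 0.25 gives a 4-cycle gradual increase
# time, 0.125 gives an 8-cycle gradual decrease time.

def update_probability(p, detected, step_up=0.25, step_down=0.125):
    """Move presence probability p toward 1 if an object is detected,
    toward 0 otherwise, clamped to the range [0, 1]."""
    p = p + step_up if detected else p - step_down
    return min(1.0, max(0.0, p))
```

With these steps, four consecutive detections take a grid from 0 to 1, and eight consecutive non-detections take it back to 0.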
- the occupancy state of each grid is likely to change due to noise or the like, and there is a high risk that the occupancy state is erroneously set. For example, there is a high risk that an occupancy state of an area where a stationary object actually is present is set to “absent”.
- the occupancy state of each grid is less likely to change due to noise or the like.
- a movement or the like of the object is not quickly reflected in the occupancy grid map, and for example, an occupancy state of an area where an object has already moved and is no longer present is set to “present”, and there is a high possibility that the vehicle 1 cannot pass.
- FIGS. 5 A and 5 B illustrate an example in which the vehicle 1 is passing between a stationary object 241 and a stationary object 242 and aiming at a goal 243 on the right of the stationary object 242 .
- FIG. 5 A illustrates an example in which the gradual decrease time is shortened.
- occupied areas 251 to 253 are set for the stationary object 241
- occupied areas 254 to 256 are set for the stationary object 242 .
- an occupied area is not set between the occupied area 251 and the occupied area 252 and between the occupied area 252 and the occupied area 253 due to a low reflectance of the stationary object 241 or the like.
- an occupied area is not set between the occupied area 254 and the occupied area 255 and between the occupied area 255 and the occupied area 256 due to a low reflectance of the stationary object 242 or the like.
- the vehicle 1 determines that there is no object in a circled area B 21 and proceeds in a direction of arrow A 21 , and a risk of colliding with the stationary object 242 increases.
- FIG. 5 B illustrates an example in which the gradual decrease time is lengthened.
- an occupied area 261 and an occupied area 262 are also set between the occupied area 251 and the occupied area 252 and between the occupied area 252 and the occupied area 253 .
- an occupied area 263 and an occupied area 264 are also set between the occupied area 254 and the occupied area 255 and between the occupied area 255 and the occupied area 256 .
- the vehicle 1 travels in a direction of arrow A 22 while avoiding the stationary object 242 , and can arrive at the goal 243 without colliding with the stationary object 242 .
- FIGS. 6 A and 6 B illustrate an example in which the vehicle 1 is about to pass by a person 271 who is walking in a direction of arrow A 32 .
- FIG. 6 A illustrates an example in which the gradual decrease time is shortened.
- a time until an occupancy state of a grid after the person 271 walks changes to “absent” is shortened. Therefore, an occupied area 281 A for the person 271 is narrowed, and the vehicle 1 can quickly pass through the area after the person 271 disappears, in a direction of arrow A 31 .
- FIG. 6 B illustrates an example in which the gradual decrease time is lengthened.
- a time until an occupancy state of a grid after the person 271 walks changes to “absent” is lengthened. Therefore, an occupied area 281 B for the person 271 becomes long in a front-back direction of the person 271 , and the vehicle 1 cannot move in a direction of arrow A 33 even though the person 271 is not present.
- the present technology is intended to solve these problems.
- FIG. 7 illustrates a configuration example of an information processing unit 301 to which the present technology is applied.
- the information processing unit 301 is included in the recognition unit 73 of the vehicle control system 11 in FIG. 1 , for example, and creates an occupancy grid map.
- the information processing unit 301 includes an attribute detection unit 311 , a map creation unit 312 , an attribute setting unit 313 , and a correction unit 314 .
- the camera 51 captures an image of surroundings of the vehicle 1 and supplies image data including information indicating an attribute of an object around the vehicle 1 , to the information processing unit 301 .
- the LiDAR 53 performs sensing around the vehicle 1 to detect a position of an object around the vehicle 1 , and generates a three-dimensional point cloud indicating distribution of reflection points of each object, and supplies to the information processing unit 301 .
- the attribute detection unit 311 detects an attribute of an object around the vehicle 1 on the basis of image data and a point cloud.
- the attribute detection unit 311 includes a classification unit 321 , an object detection unit 322 , and a movement estimation unit 323 .
- the classification unit 321 recognizes a type of an object, which is one of attributes of the object around the vehicle 1 , on the basis of image data.
- the classification unit 321 supplies a recognition result of the type of the object to the object detection unit 322 .
- the object detection unit 322 detects an object in a three-dimensional space around the vehicle 1 on the basis of a point cloud. Furthermore, the object detection unit 322 associates the type of the object recognized by the classification unit 321 with the detected object. The object detection unit 322 supplies an object detection result to the movement estimation unit 323 and the attribute setting unit 313 .
- the movement estimation unit 323 performs tracking of an object detected by the object detection unit 322 to estimate a movement of the object, which is one of attributes of the object around the vehicle 1 .
- the movement estimation unit 323 supplies an estimation result of the movement of the object to the attribute setting unit 313 .
- the map creation unit 312 creates an occupancy grid map around the vehicle 1 on the basis of a point cloud.
- the map creation unit 312 supplies the occupancy grid map to the attribute setting unit 313 .
- the attribute setting unit 313 sets an attribute of each grid of the occupancy grid map on the basis of a detection result of the object and a result of the movement estimation.
- the attribute setting unit 313 supplies the occupancy grid map with the attribute to the correction unit 314 .
- the correction unit 314 corrects the occupancy grid map on the basis of an attribute of each grid.
- the correction unit 314 supplies the corrected occupancy grid map to a subsequent stage (for example, the action planning unit 62 or the like).
- map creation processing executed by the information processing unit 301 will be described.
- This processing is started when power of the vehicle 1 is turned on, and is ended when the power is turned off, for example.
- step S 1 the information processing unit 301 acquires sensing data.
- the camera 51 captures an image of surroundings of the vehicle 1 , and supplies obtained image data to the information processing unit 301 .
- the classification unit 321 acquires the image data supplied from the camera 51 .
- the image data may be either color or monochrome.
- the LiDAR 53 performs sensing on surroundings of the vehicle 1 , creates a three-dimensional point cloud, and supplies the point cloud to the information processing unit 301 .
- the map creation unit 312 and the object detection unit 322 acquire the point cloud supplied from the LiDAR 53 .
- step S 2 the object detection unit 322 detects an object. Specifically, the object detection unit 322 appropriately removes noise or the like from the point cloud, and then performs clustering to detect a position, a size, and a shape of the object in the point cloud.
- the classification unit 321 recognizes a type of the object. Specifically, the classification unit 321 uses instance segmentation to recognize a type of the object to which each pixel belongs, for each pixel of image data. The type of the object is classified into, for example, a vehicle, a pedestrian, a bicycle, a road surface, a guardrail, a building, and the like. The classification unit 321 supplies a recognition result of the type of the object to the object detection unit 322 .
- the object detection unit 322 gives an attribute to each point of the point cloud by associating each point with an image coordinate system.
- the object detection unit 322 supplies information indicating a position, a size, a shape, and a type of the object in the three-dimensional coordinate system, to the attribute setting unit 313 and the movement estimation unit 323 .
- step S 4 the map creation unit 312 creates an occupancy grid map. Specifically, the map creation unit 312 appropriately removes noise or the like from a point cloud, then divides the three-dimensional space into grids of a predetermined size, and sets an occupancy state indicating a presence or absence of an object for each grid.
- an occupancy state of a grid corresponding to an area where reflectance is equal to or greater than a predetermined threshold value is set to “present” indicating a presence of an object.
- An occupancy state of a grid corresponding to an area through which reflected light passes is set to “absent” indicating an absence of an object.
- For an area from which reflected light has not returned, an occupancy state of every grid along the radiation direction through which the laser would pass from its emission position is set to “unknown”, indicating that a presence or absence of an object is unknown.
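The three rules above can be illustrated along a single ray. The sketch below uses a simplified 1-D cell layout (an assumption to keep the example short): cells the beam traverses become "absent", the cell that produced the reflection becomes "present", and cells beyond the return stay "unknown".

```python
# Illustrative sketch of occupancy states along one LiDAR ray.
# The 1-D layout and function name are assumptions.

def states_along_ray(n_cells, hit_index):
    """Return occupancy states for n_cells along one ray.

    hit_index is the cell where reflected light returned, or None
    if no reflection came back at all (everything stays "unknown").
    """
    if hit_index is None:
        return ["unknown"] * n_cells
    states = ["absent"] * hit_index + ["present"]
    states += ["unknown"] * (n_cells - hit_index - 1)
    return states
```

For a ray of five cells with a return at the third cell, the first two cells are "absent", the third "present", and the remaining two "unknown".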
- the map creation unit 312 supplies the created occupancy grid map to the attribute setting unit 313 .
- steps S 2 and S 3 and the processing in step S 4 are performed in parallel, for example.
- step S 5 the movement estimation unit 323 performs tracking of the object. Specifically, the movement estimation unit 323 uses a Kalman filter, a particle filter, or the like to recognize a movement of each object (a cluster of point groups) detected in the point cloud. At this time, the movement estimation unit 323 applies the Kalman filter or the particle filter in accordance with characteristics (for example, a difference between a mobile object and a stationary object, an assumed speed, and the like) of each object, for example.
- characteristics for example, a difference between a mobile object and a stationary object, an assumed speed, and the like
- the movement estimation unit 323 estimates a movement vector of the object. Specifically, the movement estimation unit 323 estimates the movement vector of each object, that is, a speed and a traveling direction (a moving direction) of each object on the basis of a result of tracking each object. The movement estimation unit 323 supplies information indicating an estimation result of the movement vector of each object to the attribute setting unit 313 .
- the attribute setting unit 313 sets an attribute of each grid of the occupancy grid map on the basis of the detection result of the attribute of each object and the estimation result of the movement vector of each object. Specifically, for a grid whose occupancy state of the occupancy grid map is “present”, the attribute setting unit 313 sets, as attributes, whether an object in the grid is a stationary object or a mobile object, and a type, a speed, and a traveling direction of the object. Note that, in a case where a mobile object is standing still, an attribute of a grid including the mobile object is set to a mobile object. The attribute setting unit 313 supplies the occupancy grid map with the attribute to the correction unit 314 .
- step S 8 the correction unit 314 corrects the occupancy grid map on the basis of the attribute of each grid.
- the correction unit 314 executes the expansion processing on the basis of the attribute of each grid. For example, the correction unit 314 sets an expansion amount and an expansion direction on the basis of an attribute of a grid at a boundary (an end portion) of the occupied area of each object on the occupancy grid map.
- in a case where the attribute of the grid is a mobile object, the expansion amount is increased as compared with a case of a stationary object. That is, the occupied area of the mobile object is expanded more greatly than the occupied area of the stationary object.
- for a mobile object, the expansion amount is increased in a direction closer to the traveling direction. That is, the occupied area of the mobile object is expanded more greatly in the traveling direction.
- the expansion amount is increased as the speed increases. That is, an occupied area of a mobile object having a higher speed is more greatly expanded.
- for a mobile object that has difficulty changing its direction, the expansion amount is reduced in a direction farther from the traveling direction. That is, in the occupied area of such a mobile object, the expansion amount decreases as a distance from the traveling direction increases.
- FIG. 9 illustrates an example of an expansion amount and an expansion direction for each type of the mobile object.
- examples of cases are illustrated in which tracking of an object is performed and a traveling direction of the object has been detected and in which tracking of an object is not performed and the traveling direction of the object has not been detected.
- the expansion amount in the traveling direction is made larger than usual for all the objects.
- the expansion amount in a left-right direction perpendicular to the traveling direction is not changed but kept as usual for a pedestrian, an office robot, and a drone whose direction is easily changed.
- Whereas, for a mobile object whose direction is not easily changed, such as a vehicle, the expansion amount in the left-right direction perpendicular to the traveling direction is made smaller than usual.
- the normal expansion amount is, for example, an expansion amount at a time of standing still.
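The attribute-dependent rules above can be condensed into one function. This is a sketch under assumed numeric factors: the patent specifies only the relative orderings (mobile larger than stationary, the traveling direction larger and scaling with speed, the lateral direction reduced for objects that cannot easily change direction), not these particular values.

```python
# Hedged sketch of attribute-dependent expansion amounts.
# All numeric factors below are illustrative assumptions.

BASE_EXPANSION = 1  # "normal" expansion amount (cells), e.g. when standing still

def expansion_amount(is_mobile, along_travel, speed=0.0, turns_easily=True):
    """Expansion amount (in cells) for one direction of an occupied area."""
    if not is_mobile:
        return BASE_EXPANSION          # stationary objects expand normally
    if along_travel:
        # larger in the traveling direction, growing with speed
        return BASE_EXPANSION + 1 + int(speed)
    # perpendicular to travel: unchanged for objects that change
    # direction easily (pedestrian, robot, drone), reduced otherwise
    return BASE_EXPANSION if turns_easily else max(0, BASE_EXPANSION - 1)
```

For example, a vehicle moving at speed 2 would expand by 4 cells ahead but 0 cells sideways, while a pedestrian keeps the normal 1-cell lateral expansion.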
- the correction unit 314 expands the occupied area of each object on the occupancy grid map, on the basis of the set expansion amount and expansion direction. As a result, the boundary of the occupied area of each object is expanded to a neighboring grid, on the basis of the set expansion amount and expansion direction.
- the correction unit 314 sets an attribute of an expanded grid, that is, a grid whose occupancy state has been corrected to “present” by the expansion processing, to the same value as the attribute of the grid at the boundary of the occupied area before the expansion.
- in a case where an expanded occupied area overlaps an occupied area of a stationary object, the correction unit 314 adjusts an attribute of a grid of the overlapping portion to the attribute of the stationary object.
- the correction unit 314 sets a change coefficient of each grid of the occupancy grid map after the expansion processing.
- the change coefficient is a coefficient indicating magnitude and a direction in which the presence probability of each grid is changed.
- in a case where the change coefficient is positive, the presence probability gradually increases at an increase rate according to the change coefficient. Furthermore, as the change coefficient increases, the increase rate of the presence probability increases, a speed at which the presence probability increases (hereinafter, referred to as a gradual increase speed) increases, and the gradual increase time of the presence probability is shortened.
- in a case where the change coefficient is negative, the presence probability gradually decreases at a decrease rate according to the change coefficient. Furthermore, as the change coefficient decreases (as an absolute value of the change coefficient increases), the decrease rate of the presence probability increases, a speed at which the presence probability decreases (hereinafter, referred to as a gradual decrease speed) increases, and the gradual decrease time of the presence probability is shortened.
- in a case where the occupancy state of the grid is “absent”, the change coefficient is set to a negative value. That is, the change coefficient is set in a direction in which the presence probability is decreased.
- a value of the change coefficient is adjusted on the basis of reliability (of object recognition) of the LiDAR 53 , which is a sensing device used to create the occupancy grid map.
- the absolute value of the change coefficient is made larger as the reliability of the LiDAR 53 is higher, while the absolute value of the change coefficient is reduced as the reliability of the LiDAR 53 is lower. This configuration prevents detection omission of an object from being reflected in the occupancy grid map, for example, in a case where the reliability of the LiDAR 53 is low.
- in a case where the occupancy state of the grid is “present”, the change coefficient is set to a positive value. That is, the change coefficient is set in a direction in which the presence probability is increased. Furthermore, an absolute value of the change coefficient is made larger than in a case where the grid occupancy state is “absent”. As a result, in each grid, a change in the occupancy state is reflected more quickly when an object appears than when an object disappears.
- a value of the change coefficient is adjusted on the basis of an attribute of the grid. For example, when the attribute of the grid is a mobile object, a value of the change coefficient is increased as compared with a case of a stationary object. As a result, an occupancy state of a grid in an area where a mobile object is present as the mobile object moves is to be reflected more quickly.
- the value of the change coefficient is adjusted on the basis of the reliability of the LiDAR 53 .
- the value of the change coefficient is made larger as the reliability of the LiDAR 53 is higher, while the value of the change coefficient is reduced as the reliability of the LiDAR 53 is lower. This configuration prevents an erroneously detected object from being reflected in the occupancy grid map, for example, in a case where the reliability of the LiDAR 53 is low.
- in a case where the occupancy state of the grid is “unknown”, the change coefficient is set to a negative value. That is, the change coefficient is set in a direction in which the presence probability is decreased.
- a value of the change coefficient is adjusted on the basis of an occupancy state and an attribute of a grid one cycle before. For example, in a case where the occupancy state of the grid one cycle before is “present”, the absolute value of the change coefficient is made larger when the attribute of the grid one cycle before is a mobile object than when it is a stationary object. Furthermore, the absolute value of the change coefficient is made smaller than that in a case where the occupancy state of the current grid is “absent”. That is, in a case where the occupancy state is “unknown”, it is not clear whether or not an object is actually present, so a decrease in the presence probability is suppressed.
- a value of the change coefficient may be adjusted or may not be adjusted on the basis of the reliability of the LiDAR 53 .
- the correction unit 314 calculates a presence probability of each grid on the basis of the set change coefficient.
- the presence probability increases in a case where the change coefficient is positive, and the increase amount of the presence probability becomes larger as the change coefficient increases. However, in a case where the presence probability exceeds 1, the presence probability is corrected to 1. Conversely, the presence probability decreases in a case where the change coefficient is negative, and the decrease amount of the presence probability becomes larger as the absolute value of the change coefficient increases. However, in a case where the presence probability is less than 0, the presence probability is corrected to 0.
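- the clamped update described above can be sketched as follows (the additive form is an assumption; the description only fixes the sign behavior and the clamping to the [0, 1] range):

```python
def update_presence_probability(prob, coeff):
    """Apply one update cycle to a grid's presence probability.

    A positive coefficient raises the probability, a negative one
    lowers it; the result is clamped to [0, 1].
    """
    return min(1.0, max(0.0, prob + coeff))
```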
- FIGS. 10 and 11 are graphs illustrating an example of transition of a presence probability of a certain grid on the occupancy grid map.
- a horizontal axis of the graph represents time, and a vertical axis represents a presence probability.
- FIG. 10 illustrates an example in a case where the reliability of the LiDAR 53 is high.
- time from time t 3 to time t 4 is longer than the time from time t 1 to time t 2 . That is, in a grid of the occupancy grid map, disappearance of the object is reflected later than appearance of the object.
- an increase rate of the presence probability is substantially equal to an increase rate in a period from time t 1 to time t 2 . Then, at time t 6 , the presence probability has reached 1.
- the presence probability gradually decreases as the occupancy state changes from “present” to “unknown”.
- a decrease rate of the presence probability is made smaller than a decrease rate in a period from time t 3 to time t 4 . That is, since it is unclear whether or not an object is present, reflection of disappearance of the object in the grid of the occupancy grid map is delayed.
- FIG. 11 illustrates an example in a case where the reliability of the LiDAR 53 is low.
- the occupancy state changes from "present" to "absent" at time t 12 . Therefore, in the period from time t 11 to time t 12 , depending on the threshold value, the occupancy state may ultimately remain "absent" without ever being determined to be "present".
- the presence probability gradually decreases and reaches 0 at time t 13 .
- a decrease rate of the presence probability is smaller than that in the period from time t 3 to time t 4 in FIG. 10 , and a period from time t 12 to time t 13 is longer than the period from time t 3 to time t 4 .
- an increase rate of the presence probability is substantially equal to an increase rate in a period from time t 11 to time t 12 . Then, at time t 15 , the presence probability has reached 1.
- the presence probability gradually decreases as the occupancy state changes from “present” to “unknown”.
- a decrease rate of the presence probability is substantially equal to that in a period after time t 7 in FIG. 10 .
- the correction unit 314 corrects the occupancy state of each grid on the basis of the presence probability of each grid. For example, among grids whose occupancy state is “present”, the correction unit 314 corrects, to “absent”, an occupancy state of a grid whose presence probability is less than a predetermined threshold value. For example, among grids whose occupancy state is “absent”, the correction unit 314 corrects, to “present”, an occupancy state of a grid whose presence probability is equal to or greater than a predetermined threshold value. The occupancy state of the grid whose occupancy state is “unknown” remains unchanged.
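- the threshold-based correction described above can be sketched as follows; the threshold values are illustrative assumptions, since the description only calls them predetermined:

```python
def correct_occupancy(state, prob, present_to_absent_th=0.3, absent_to_present_th=0.7):
    """Correct a grid's occupancy state from its presence probability.

    "present" grids whose probability falls below a threshold become
    "absent"; "absent" grids whose probability reaches a threshold
    become "present"; "unknown" grids are left unchanged.
    """
    if state == "present" and prob < present_to_absent_th:
        return "absent"
    if state == "absent" and prob >= absent_to_present_th:
        return "present"
    return state
```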
- FIG. 12 schematically illustrates an example of the occupancy grid map.
- the vehicle 1 is traveling behind a person 401 and a vehicle 402 , between a wall 403 and a wall 404 .
- the person 401 is traveling in a left diagonal front direction as indicated by arrow A 101 .
- the vehicle 402 is traveling forward as indicated by arrow A 102 .
- an occupied area 411 for the person 401 expands in the traveling direction of the person 401 . Furthermore, the occupied area 411 gradually disappears without expanding, in a direction opposite to the traveling direction of the person 401 . Moreover, the occupied area 411 is narrower in a left-right direction perpendicular to the traveling direction than in a front-back direction.
- the vehicle 1 is prevented from traveling into an area where the person 401 travels, and safety of the person 401 is secured. Furthermore, the vehicle 1 can travel into an area after the person 401 passes, and the number of options of the path of the vehicle 1 increases.
- an occupied area 412 for the vehicle 402 expands in the traveling direction of the vehicle 402 . Furthermore, the occupied area 412 gradually disappears without expanding, in a direction opposite to the traveling direction of the vehicle 402 . Moreover, the occupied area 412 is narrower in a left-right direction perpendicular to the traveling direction than in a front-back direction.
- the vehicle 1 is prevented from traveling into an area where the vehicle 402 travels, and safety of the vehicle 402 is secured. Furthermore, the vehicle 1 can travel into an area after the vehicle 402 passes, and the number of options of the path of the vehicle 1 increases.
- the wall 403 is surrounded by occupied areas 413 to 417 , and the vehicle 1 is prevented from colliding with the wall 403 .
- the wall 404 is surrounded by occupied areas 418 to 422 , and the vehicle 1 is prevented from colliding with the wall 404 .
- the occupied areas 413 to 417 are not greatly expanded and have substantially the same size as the wall 403 . Therefore, the occupied areas 413 to 417 are prevented from obstructing a path of the vehicle 1 .
- since the wall 404 is a stationary object, the occupied areas 418 to 422 are not greatly expanded and have substantially the same size as the wall 404 . Therefore, the occupied areas 418 to 422 are prevented from obstructing a path of the vehicle 1 .
- since the occupied areas 413 to 417 are occupied areas for the wall 403 , which is a stationary object, the absolute value of the change coefficient of each grid is set to be small. Therefore, for example, even if an area having a small reflectance exists in the wall 403 or noise occurs in a point cloud, the occupied areas 413 to 417 are prevented from disappearing. As a result, the vehicle 1 is prevented from colliding with the wall 403 .
- since the occupied areas 418 to 422 are occupied areas for the wall 404 , which is a stationary object, the absolute value of the change coefficient of each grid is set to be small. Therefore, for example, even if an area having a small reflectance exists in the wall 404 or noise occurs in a point cloud, the occupied areas 418 to 422 are prevented from disappearing. As a result, the vehicle 1 is prevented from colliding with the wall 404 .
- the correction unit 314 supplies the corrected occupancy grid map to, for example, the action planning unit 62 or the like.
- the action planning unit 62 uses a path planning method such as A* to plan a path that passes through areas where no object is present in the occupancy grid map.
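- as one concrete illustration of such path planning, a minimal A* search over a binary occupancy grid can be written as follows; this is a generic textbook formulation with 4-connected moves and a Manhattan heuristic, not the implementation of the action planning unit 62 :

```python
import heapq

def astar(grid, start, goal):
    """A* over a binary occupancy grid (0 = free, 1 = occupied).

    Returns a list of (row, col) cells from start to goal, or None
    if no path through free cells exists.
    """
    rows, cols = len(grid), len(grid[0])
    h0 = abs(start[0] - goal[0]) + abs(start[1] - goal[1])
    open_set = [(h0, 0, start, None)]      # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:              # already finalized
            continue
        came_from[cell] = parent
        if cell == goal:                   # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    h = abs(nr - goal[0]) + abs(nc - goal[1])
                    heapq.heappush(open_set, (ng + h, ng, (nr, nc), cell))
    return None
```

In practice, the binary grid would be derived from the corrected occupancy states (for example, treating "present" cells as occupied).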
- the operation control unit 63 controls the vehicle 1 so as to pass through the path planned by the action planning unit 62 .
- since the expansion processing and the setting processing of the presence probability are performed on the basis of the attribute of each grid of the occupancy grid map, the occupied area is appropriately set for each object.
- as a result, a path of the vehicle 1 is appropriately set. That is, a shorter path can be set that avoids unnecessary detours while avoiding collision or contact with obstacles and securing safety.
- FIG. 15 illustrates a modification of the information processing unit. Note that portions corresponding to those of the information processing unit 301 in FIG. 7 are denoted by the same reference numerals, and a description thereof will be omitted as appropriate.
- An information processing unit 501 in FIG. 15 is identical to the information processing unit 301 in that a map creation unit 312 is included. Furthermore, the information processing unit 501 is different from the information processing unit 301 in that an attribute detection unit 511 , an attribute setting unit 512 , and a correction unit 513 are included instead of the attribute detection unit 311 , the attribute setting unit 313 , and the correction unit 314 .
- the attribute detection unit 511 includes only a classification unit 321 .
- the classification unit 321 recognizes a type of an object for each pixel of image data, and supplies a recognition result of the type of the object to the attribute setting unit 512 .
- the attribute setting unit 512 converts points of a point cloud or depth data from three-dimensional coordinates into image coordinates, and sets attributes. Then, for each grid whose occupancy state in the occupancy grid map is "present", the attribute setting unit 512 sets, as attributes of the grid, whether the object to which the grid belongs is a stationary object or a mobile object and the type of the object. The attribute setting unit 512 supplies the occupancy grid map with the attributes to the correction unit 513 .
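- the conversion from three-dimensional coordinates into image coordinates can be sketched with a standard pinhole camera model; the model and the parameter names are assumptions, as the description does not specify the projection:

```python
def project_to_image(point, fx, fy, cx, cy):
    """Project a 3D point in the camera frame to pixel coordinates.

    Pinhole model: (fx, fy) are focal lengths in pixels, (cx, cy) the
    principal point. point = (x, y, z) with z > 0 along the optical
    axis. The resulting pixel can then be matched against per-pixel
    type labels from the classification.
    """
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)
```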
- the correction unit 513 performs the expansion processing of the occupancy grid map and the setting processing of the presence probability, on the basis of an attribute of each grid. That is, the correction unit 513 performs the expansion processing of the occupancy grid map and the setting processing of the presence probability without using a movement vector (a traveling direction and a speed) of the object.
- an occupied area of each object is to be set more appropriately as compared with a case where the expansion processing and the setting processing of the presence probability are similarly performed on all objects.
- the expansion amount is set uniformly for the entire periphery for a pedestrian and a drone, whose direction is easily changed. Furthermore, for a bicycle, a motorcycle, an automobile, and a factory conveyance robot, whose direction is difficult to change, the expansion amount toward the front, where the object is likely to travel, is increased in a case where the direction of the object can be detected. Conversely, in a case where the direction of the object cannot be detected, the expansion amount is set uniformly for the entire periphery.
- the expansion amount is set similarly to that for a bicycle, a motorcycle, an automobile, and a factory conveyance robot, whose direction is difficult to change.
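- the per-type expansion rules described above can be sketched as follows; the type sets, the (front, side, rear) split, and the amounts are illustrative assumptions:

```python
# Types whose direction changes easily: expand uniformly.
AGILE_TYPES = {"pedestrian", "drone"}
# Types whose direction is hard to change: expand mainly forward.
DIRECTED_TYPES = {"bicycle", "motorcycle", "automobile", "conveyance_robot"}

def expansion_amounts(obj_type, direction_known, base=1):
    """Return (front, side, rear) expansion amounts in grid cells."""
    if obj_type in AGILE_TYPES or not direction_known:
        # Uniform expansion around the whole periphery.
        return (base, base, base)
    if obj_type in DIRECTED_TYPES:
        # Direction known and hard to change: favor the front.
        return (3 * base, base, base)
    # Unrecognized types: treated like the directed types.
    return (3 * base, base, base)
```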
- the type of the object may be classified into only a mobile object or a stationary object, and only whether the object is a mobile object or a stationary object may be set as the attribute of each grid. Also in this case, an occupied area of each object is set more appropriately as compared with a case where the expansion processing and the setting processing of the presence probability are performed similarly on all objects.
- an expansion direction and an expansion amount may be set on the basis of only a traveling direction without using a speed of the mobile object.
- the expansion processing and the setting processing of the presence probability may be performed on a mobile object that is standing still, by using an expansion amount and a change coefficient similar to those of a stationary object or an expansion amount and a change coefficient close to those of a stationary object.
- a threshold value of a speed to be used for determining whether or not the object is standing still may be set to a different value for each type of object. For example, whether or not a vehicle and a pedestrian are standing still may be determined on the basis of threshold values of different speeds.
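- a per-type standstill check of this kind can be sketched as follows; the threshold values are hypothetical, as the description gives no concrete numbers:

```python
# Hypothetical per-type speed thresholds (m/s) for the standstill check.
STANDSTILL_THRESHOLDS = {"vehicle": 0.5, "pedestrian": 0.1}

def is_standing_still(obj_type, speed, default_th=0.2):
    """Decide whether an object is standing still using a threshold
    chosen per object type, with a fallback for unlisted types."""
    return speed < STANDSTILL_THRESHOLDS.get(obj_type, default_th)
```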
- determination may be made as to whether or not the object is standing still on the basis of a state of the object.
- a bicycle on which no person rides may be determined as a stationary object.
- a bicycle that is standing still but has a person riding is likely to move, and thus may be determined as a mobile object or an object close to a stationary object (hereinafter, a quasi-stationary object).
- a vehicle on which a driver is not riding may be determined as a stationary object.
- a vehicle on which a driver is riding may be determined as a mobile object or a quasi-stationary object.
- a door may be opened. Therefore, for example, in a case where a speed of the vehicle is less than a predetermined threshold value, an expansion amount of an occupied area in a left-right direction with respect to the vehicle may be increased.
- an expansion amount of an occupied area 552 in a left-right direction may be increased.
- the vehicle 1 sets a path that can avoid collision or contact with the vehicle 551 even if a door of the vehicle 551 is opened.
- for a grid in which a mobile object whose safety is secured even when a collision occurs is present, the gradual increase speed and the gradual decrease speed of the presence probability may be set to the maximum.
- appearance and disappearance of the mobile object in each grid are to be quickly reflected.
- a change coefficient for a mobile object may be adjusted on the basis of a speed of the vehicle 1 .
- for example, in a case where the vehicle 1 is traveling at a high speed equal to or higher than a predetermined speed on a highway or the like, a gradual increase speed and a gradual decrease speed of a presence probability of a grid in which a mobile object is present may be set to the maximum. As a result, appearance and disappearance of a mobile object in each grid are quickly reflected.
- a presence probability of a grid whose occupancy state has changed from “present” to “absent” may be immediately set to 0, and only a presence probability of a grid whose occupancy state has changed from “present” to “unknown” may be gradually decreased.
- a method other than the above-described instance segmentation may be used for recognition processing of a type of an object.
- for example, a method such as semantic segmentation, you only look once (YOLO), or single shot multibox detector (SSD) may be used.
- a mobile object can be detected using an optical flow, but in this case, a mobile object that is standing still cannot be detected.
- a point cloud may be created using a sensor other than the LiDAR.
- a point cloud may be created by a radar, a depth camera (for example, a stereo camera or a ToF camera), or the like.
- a point cloud may be created on the basis of a plurality of pieces of image data captured by a monocular camera in different orientations.
- the occupancy grid map may be created on the basis of a three-dimensional model other than the point cloud.
- movement estimation of an object may be performed on the basis of image data.
- the present technology can also be applied to a two-dimensional occupancy grid map.
- a recognition result of an object of image data is projection-converted into a two-dimensional bird's-eye view, and is associated with each object.
- an upper limit value of a moving speed of an object may be provided on the basis of a type of the object. As a result, erroneous detection of the moving speed can be prevented.
- a sensor such as the camera 51 may perform the recognition processing of a type of an object and supply a recognition result of the type of the object to the information processing unit 301 or the information processing unit 501 .
- the present technology can also be applied to a case where an occupancy grid map is created in a mobile device other than a vehicle, for example, a drone, a robot, or the like.
- the series of processes described above can be executed by hardware or by software.
- a program that configures the software is installed in a computer.
- examples of the computer include a computer built in dedicated hardware, a general-purpose personal computer that can perform various functions when installed with various programs, and the like.
- FIG. 17 is a block diagram illustrating a configuration example of hardware of a computer that executes the series of processes described above in accordance with a program.
- in the computer 1000 , a central processing unit (CPU) 1001 , a read only memory (ROM) 1002 , and a random access memory (RAM) 1003 are mutually connected by a bus 1004 .
- the bus 1004 is further connected with an input/output interface 1005 .
- to the input/output interface 1005 , an input unit 1006 , an output unit 1007 , a recording unit 1008 , a communication unit 1009 , and a drive 1010 are connected.
- the input unit 1006 includes an input switch, a button, a microphone, an image sensor, and the like.
- the output unit 1007 includes a display, a speaker, and the like.
- the recording unit 1008 includes a hard disk, a non-volatile memory, and the like.
- the communication unit 1009 includes a network interface or the like.
- the drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- the series of processes described above are performed, for example, by the CPU 1001 loading a program recorded in the recording unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 , and executing the program.
- the program executed by the computer 1000 can be provided by being recorded on, for example, the removable medium 1011 as a package medium or the like. Furthermore, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the recording unit 1008 via the input/output interface 1005 . Furthermore, the program can be received by the communication unit 1009 via a wired or wireless transmission medium, and installed in the recording unit 1008 . Besides, the program can be installed in advance in the ROM 1002 and the recording unit 1008 .
- the program executed by the computer may be a program that performs processing in a time series according to an order described in this specification, or may be a program that performs processing in parallel or at necessary timing such as when a call is made.
- the system means a set of a plurality of components (a device, a module (a part), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device with a plurality of modules housed in one housing are both systems.
- the present technology can have a cloud computing configuration in which one function is shared and processed in cooperation by a plurality of devices via a network.
- each step described in the above-described flowchart can be executed by one device, and also shared and executed by a plurality of devices.
- in a case where one step includes a plurality of processes, the plurality of processes included in the one step can be executed by one device, and also shared and executed by a plurality of devices.
- the present technology can also have the following configurations.
- a signal processing device including:
- the signal processing device according to any one of (1) to (13) described above, further including:
- a signal processing method including:
- a mobile device including:
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computational Linguistics (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- Patent Document 1: Japanese Patent Application Laid-Open No. 2012-238151
-
- 1. Configuration example of vehicle control system
- 2. Problem of occupancy grid map
- 3. Embodiment
- 4. Modification
- 5. Other
-
- a map creation unit configured to create an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of first sensor data from a first sensor used to detect the object in surroundings of a mobile device;
- an attribute setting unit configured to set an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and
- a correction unit configured to correct the occupancy grid map on the basis of an attribute of the grid.
(2)
-
- the correction unit sets an expansion amount of an occupied area of the object on the occupancy grid map on the basis of an attribute of the grid, and performs expansion processing of the occupied area on the basis of the expansion amount.
(3)
-
- an attribute of the grid includes a type of the object in the grid, and
- the correction unit sets the expansion amount on the basis of a type of the object.
(4)
-
- a type of the object includes a mobile object and a stationary object, and
- the correction unit sets an expansion amount for a mobile object to be larger than an expansion amount for a stationary object.
(5)
-
- an attribute of the grid includes a traveling direction of the object in the grid, and
- the correction unit sets the expansion amount and an expansion direction of the occupied area on the basis of a traveling direction of the object.
(6)
-
- an attribute of the grid further includes a speed of the object in the grid, and
- the correction unit sets the expansion amount, further on the basis of a speed of the object.
(7)
-
- an attribute of the grid includes a direction of the object in the grid, and
- the correction unit sets the expansion amount and an expansion direction of the occupied area on the basis of a direction of the object.
(8)
-
- the correction unit sets a presence probability of the object in the grid on the basis of an occupancy state indicating a presence or absence of the object in the grid and on the basis of an attribute of the grid.
(9)
-
- an attribute of the grid includes a type of the object in the grid, and
- the correction unit sets a speed at which the presence probability of the grid is increased or decreased on the basis of a type of the object in the grid.
(10)
-
- a type of the object includes a mobile object and a stationary object, and
- in a case where the occupancy state of the grid has changed from present to absent, when the object in the grid is a mobile object, the correction unit increases a speed at which the presence probability of the grid is decreased as compared with a case where the object in the grid is a stationary object.
(11)
-
- in a case where the occupancy state of the grid has changed from present to unknown, the correction unit reduces a speed at which the presence probability of the grid is decreased as compared with a case where the occupancy state has changed from present to absent.
(12)
-
- the correction unit sets a speed at which the presence probability of the grid is increased or decreased on the basis of reliability of the first sensor.
(13)
-
- the correction unit corrects the occupancy state on the basis of the presence probability of the grid.
(14)
-
- an attribute detection unit configured to detect an attribute of the object on the basis of second sensor data from a second sensor.
(15)
-
- the attribute detection unit detects a type of the object as an attribute of the object on the basis of the second sensor data.
(16)
-
- the attribute detection unit detects a traveling direction and a speed of the object as attributes of the object on the basis of the first sensor data and the second sensor data.
(17)
-
- the second sensor is a sensor used to acquire information indicating an attribute of the object.
(18)
-
- creating an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings of a mobile device;
- setting an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and
- correcting the occupancy grid map on the basis of an attribute of the grid.
(19)
-
- creating an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings of a mobile device;
- setting an attribute of the grid of the occupancy grid map on the basis of an attribute of the object; and
- correcting the occupancy grid map on the basis of an attribute of the grid.
(20)
-
- a map creation unit configured to create an occupancy grid map indicating a presence or absence of an object in a unit of a grid on the basis of sensor data from a sensor used to detect the object in surroundings;
- an attribute setting unit configured to set an attribute of the grid of the occupancy grid map on the basis of an attribute of the object;
- a correction unit configured to correct the occupancy grid map on the basis of an attribute of the grid; and
- an action planning unit configured to set a path on the basis of the corrected occupancy grid map.
-
- 1 Vehicle
- 11 Vehicle control system
- 51 Camera
- 53 LiDAR
- 73 Recognition unit
- 62 Action planning unit
- 301 Information processing unit
- 311 Attribute detection unit
- 312 Map creation unit
- 313 Attribute setting unit
- 314 Correction unit
- 321 Classification unit
- 322 Object detection unit
- 323 Movement estimation unit
- 501 Information processing unit
- 511 Attribute detection unit
- 512 Attribute setting unit
- 513 Correction unit
Claims (18)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-174077 | 2019-09-25 | ||
| JP2019174077 | 2019-09-25 | ||
| PCT/JP2020/034437 WO2021060018A1 (en) | 2019-09-25 | 2020-09-11 | Signal processing device, signal processing method, program, and moving device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220383749A1 US20220383749A1 (en) | 2022-12-01 |
| US12254772B2 true US12254772B2 (en) | 2025-03-18 |
Family
ID=75166657
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/753,828 Active 2040-09-20 US12254772B2 (en) | 2019-09-25 | 2020-09-11 | Signal processing device, signal processing method, and mobile device |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US12254772B2 (en) |
| JP (1) | JP7663839B2 (en) |
| CN (1) | CN114424265B (en) |
| DE (1) | DE112020004545T5 (en) |
| WO (1) | WO2021060018A1 (en) |
Families Citing this family (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230095384A1 (en) * | 2020-03-25 | 2023-03-30 | Intel Corporation | Dynamic contextual road occupancy map perception for vulnerable road user safety in intelligent transportation systems |
| JP6982669B1 (en) * | 2020-09-07 | 2021-12-17 | 日立建機株式会社 | Work machine |
| DE102020006719A1 (en) * | 2020-11-02 | 2022-05-05 | Daimler Ag | Procedure for providing information about road users |
| JP7739142B2 (en) * | 2020-12-24 | 2025-09-16 | 現代自動車株式会社 | Parking assistance system and method with improved avoidance steering control |
| WO2022169988A1 (en) * | 2021-02-03 | 2022-08-11 | Autonomous Solutions, Inc. | Localization system for autonomous vehicles using sparse radar data |
| DE102021203809B4 (en) * | 2021-03-16 | 2023-05-04 | Continental Autonomous Mobility Germany GmbH | Driving course estimation in an environment model |
| JP2023008172A (en) * | 2021-07-05 | 2023-01-19 | Kddi株式会社 | Robot device, robot system, control method for robot device, and computer program |
| EP4116873A1 (en) * | 2021-07-09 | 2023-01-11 | Aptiv Technologies Limited | Occupancy grid calibration |
| EP4123338B1 (en) * | 2021-07-21 | 2024-12-04 | Hyundai Mobis Co., Ltd. | Apparatus and method for monitoring surrounding environment of vehicle |
| JP2023072146A (en) * | 2021-11-12 | 2023-05-24 | トヨタ自動車株式会社 | Robot control system, robot control method, and program |
| KR102524105B1 (en) * | 2022-11-30 | 2023-04-21 | (주)토탈소프트뱅크 | Apparatus for recognizing occupied space by objects |
| US20240300486A1 (en) * | 2023-03-06 | 2024-09-12 | Kodiak Robotics, Inc. | Systems and Methods for Managing Tracks Within an Occluded Region |
| US20240300533A1 (en) * | 2023-03-06 | 2024-09-12 | Kodiak Robotics, Inc. | Systems and Methods to Manage Tracking of Objects Through Occluded Regions |
| JP7738616B2 (en) * | 2023-09-29 | 2025-09-12 | 本田技研工業株式会社 | Control device, control method, and program |
| JP7630583B1 (en) | 2023-09-29 | 2025-02-17 | 本田技研工業株式会社 | Control device, control method, and program |
Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2333578A2 (en) | 2009-12-02 | 2011-06-15 | Robert Bosch GmbH | Method and control device for determining a movement direction of an object moving towards a vehicle |
| JP2012238151A (en) | 2011-05-11 | 2012-12-06 | Toyota Motor Corp | Periphery monitoring device and periphery monitoring method, and driving support device |
| CN105511485A (en) | 2014-09-25 | 2016-04-20 | 科沃斯机器人有限公司 | Grid map creating method for self-moving robot |
| CN105868687A (en) | 2015-02-09 | 2016-08-17 | 丰田自动车株式会社 | Traveling road surface detection apparatus and traveling road surface detection method |
| CN107480638A (en) * | 2017-08-16 | 2017-12-15 | 北京京东尚科信息技术有限公司 | Vehicle obstacle-avoidance method, controller, device and vehicle |
| WO2018091386A1 (en) * | 2016-11-21 | 2018-05-24 | Valeo Schalter Und Sensoren Gmbh | Method for detecting and classifying an object by means of at least one sensor apparatus on the basis of an occupancy map, driver assistance system and motor vehicle |
| US20180203445A1 (en) * | 2017-01-13 | 2018-07-19 | Ford Global Technologies, Llc | Generating Simulated Sensor Data For Training And Validation Of Detection Models |
| CN108775902A (en) | 2018-07-25 | 2018-11-09 | 齐鲁工业大学 | The adjoint robot path planning method and system virtually expanded based on barrier |
| JP2019016308A (en) | 2017-07-10 | 2019-01-31 | 株式会社Zmp | Object detection apparatus and method |
| JP2019046147A (en) * | 2017-09-01 | 2019-03-22 | 株式会社デンソー | Driving environment recognition device, driving environment recognition method, program |
| US20200225673A1 (en) * | 2016-02-29 | 2020-07-16 | AI Incorporated | Obstacle recognition method for autonomous robots |
| US20210403015A1 (en) * | 2017-08-03 | 2021-12-30 | Koito Manufacturing Co., Ltd | Vehicle lighting system, vehicle system, and vehicle |
| US20220057232A1 (en) * | 2018-12-12 | 2022-02-24 | Intel Corporation | Time-aware occupancy grid mapping for robots in dynamic environments |
2020
- 2020-09-11 CN CN202080065561.3A patent/CN114424265B/en active Active
- 2020-09-11 US US17/753,828 patent/US12254772B2/en active Active
- 2020-09-11 DE DE112020004545.5T patent/DE112020004545T5/en active Pending
- 2020-09-11 JP JP2021548799A patent/JP7663839B2/en active Active
- 2020-09-11 WO PCT/JP2020/034437 patent/WO2021060018A1/en not_active Ceased
Non-Patent Citations (4)
| Title |
|---|
| International Search Report and Written Opinion of PCT Application No. PCT/JP2020/034437, issued on Nov. 17, 2020, 09 pages of ISRWO. |
| Machine Translation CN-107480638 (Year: 2017). * |
| Machine Translation JP-2019046147 (Year: 2019). * |
| Machine Translation WO-2018091386 (Year: 2018). * |
Also Published As
| Publication number | Publication date |
|---|---|
| DE112020004545T5 (en) | 2022-08-18 |
| JPWO2021060018A1 (en) | 2021-04-01 |
| US20220383749A1 (en) | 2022-12-01 |
| CN114424265A (en) | 2022-04-29 |
| JP7663839B2 (en) | 2025-04-17 |
| CN114424265B (en) | 2024-09-24 |
| WO2021060018A1 (en) | 2021-04-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12254772B2 (en) | | Signal processing device, signal processing method, and mobile device |
| JP7615141B2 (en) | | Information processing device, information processing method, and program |
| JP7257737B2 (en) | | Information processing device, self-position estimation method, and program |
| US20240054793A1 (en) | | Information processing device, information processing method, and program |
| WO2020116206A1 (en) | | Information processing device, information processing method, and program |
| WO2022158185A1 (en) | | Information processing device, information processing method, program, and moving device |
| JP7676407B2 (en) | | Information processing device, information processing method, and program |
| US20230206596A1 (en) | | Information processing device, information processing method, and program |
| US20240426997A1 (en) | | Information processing apparatus, information processing method, and information processing system |
| US12399700B2 (en) | | Information processing apparatus, information processing method, and information processing system to enable update of a software |
| US20240386724A1 (en) | | Recognition processing device, recognition processing method, and recognition processing system |
| US20250128740A1 (en) | | Information processing device, information processing method, and vehicle control system |
| US20250172950A1 (en) | | Information processing apparatus, information processing method, information processing program, and mobile apparatus |
| US20240375613A1 (en) | | Information processing device, information processing method, recording medium, and in-vehicle system |
| WO2023162497A1 (en) | | Image-processing device, image-processing method, and image-processing program |
| US20250067875A1 (en) | | Information processing apparatus, information processing method, and information processing program |
| US12437639B2 (en) | | Information processing device, information processing method, and information processing program |
| US12505744B2 (en) | | Information processing device, information processing method, and program |
| EP4593369A1 (en) | | Information processing device and information processing method |
| US20240290204A1 (en) | | Information processing device, information processing method, and program |
| US20250128732A1 (en) | | Information processing apparatus, information processing method, and moving apparatus |
| US20230377108A1 (en) | | Information processing apparatus, information processing method, and program |
| WO2024024471A1 (en) | | Information processing device, information processing method, and information processing system |
| US20240019539A1 (en) | | Information processing device, information processing method, and information processing system |
| WO2023149089A1 (en) | | Learning device, learning method, and learning program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SONY GROUP CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ISHIKAWA, TATSUYA;WANG, CHAO;REEL/FRAME:059277/0601. Effective date: 20220209 |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |