WO2024232245A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number: WO2024232245A1 (PCT/JP2024/015586)
- Authority: WIPO (PCT)
- Legal status
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Description
- This technology relates to an information processing device, an information processing method, and a program, and in particular to an information processing device, an information processing method, and a program that can more appropriately present information for avoiding danger to users of moving objects such as vehicles.
- even when hearing such a warning sound, the user cannot tell what type of object is present around the vehicle.
- in addition, a continuously output warning sound may distract the user.
- This technology was developed in light of these circumstances, and makes it possible to more effectively present information to users of moving objects such as vehicles to help them avoid danger.
- An information processing device includes a prediction unit that predicts motion information indicating the future motion of an object around a moving body, a calculation unit that calculates a degree of danger of the object based on the motion information, and a presentation control unit that controls, based on the degree of danger, a notification sound that is presented to a user of the moving body according to the type of the object.
- In an information processing method, an information processing device predicts motion information indicating the future motion of an object around a moving body, calculates the degree of danger of the object based on the motion information, and controls, based on the degree of danger, a notification sound that is presented to the user of the moving body according to the type of the object.
- A program causes a computer to execute a process of predicting motion information indicating the future motion of an object around a moving body, calculating the degree of danger of the object based on the motion information, and controlling, based on the degree of danger, a notification sound that is presented to the user of the moving body according to the type of the object.
- In one aspect of the present technology, motion information indicating the future movement of an object around a moving body is predicted, the degree of danger of the object is calculated based on the motion information, and a notification sound according to the type of the object, presented to the user of the moving body, is controlled based on the degree of danger.
- FIG. 1 is a block diagram showing an example of the configuration of a vehicle control system.
- FIG. 2 is a diagram illustrating an example of sensing regions.
- FIG. 3 is a block diagram showing a configuration example of a vehicle control system to which the present technology is applied.
- FIG. 4 is a flowchart illustrating a process performed by the vehicle control system.
- FIG. 5 is a diagram showing an example of speaker arrangement.
- FIG. 6 is a diagram showing another example of speaker arrangement.
- FIG. 7 is a diagram illustrating an example of an output of a notification sound.
- FIG. 8 is a diagram showing an example of visual information.
- A further diagram shows another example of visual information.
- A block diagram shows an example of the hardware configuration of a computer.
- FIG. 1 is a block diagram showing an example of the configuration of a vehicle control system 11, which is an example of a mobility device control system to which the present technology is applied.
- the vehicle control system 11 is provided in the vehicle 1 and performs processing related to the automated driving of the vehicle 1.
- This automated driving includes driving automation of levels 1 to 5, as well as remote driving and remote assistance of the vehicle 1 by a remote driver.
- the vehicle control system 11 includes a vehicle control ECU (Electronic Control Unit) 21, a communication unit 22, a map information storage unit 23, a location information acquisition unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a memory unit 28, a driving automation control unit 29, a DMS (Driver Monitoring System) 30, an HMI (Human Machine Interface) 31, and a vehicle control unit 32.
- the vehicle control ECU 21, communication unit 22, map information storage unit 23, position information acquisition unit 24, external recognition sensor 25, in-vehicle sensor 26, vehicle sensor 27, memory unit 28, driving automation control unit 29, DMS 30, HMI 31, and vehicle control unit 32 are connected to each other so as to be able to communicate with each other via a communication network 41.
- the communication network 41 is composed of an in-vehicle communication network or bus that complies with digital two-way communication standards such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), and Ethernet (registered trademark).
- the communication network 41 may be used differently depending on the type of data being transmitted.
- CAN may be applied to data related to vehicle control
- Ethernet may be applied to large-volume data.
- each part of the vehicle control system 11 may be directly connected without going through the communication network 41, using wireless communication intended for communication over relatively short distances, such as near field communication (NFC) or Bluetooth (registered trademark).
- the vehicle control ECU 21 is composed of various processors, such as a CPU (Central Processing Unit) and an MPU (Micro Processing Unit).
- the vehicle control ECU 21 controls all or part of the functions of the vehicle control system 11.
- the communication unit 22 communicates with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmits and receives various types of data. At this time, the communication unit 22 can communicate using multiple communication methods.
- the communication unit 22 communicates with servers (hereinafter referred to as external servers) on an external network via base stations or access points using wireless communication methods such as 5G (fifth generation mobile communication system), LTE (Long Term Evolution), and DSRC (Dedicated Short Range Communications).
- the external network with which the communication unit 22 communicates is, for example, the Internet, a cloud network, or an operator-specific network.
- the communication method that the communication unit 22 uses with the external network is not particularly limited as long as it is a wireless communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed.
- the communication unit 22 can communicate with a terminal present in the vicinity of the vehicle using P2P (Peer To Peer) technology.
- the terminal present in the vicinity of the vehicle can be, for example, a terminal attached to a mobile object moving at a relatively slow speed, such as a pedestrian or a bicycle, a terminal installed at a fixed position in a store, or an MTC (Machine Type Communication) terminal.
- the communication unit 22 can also perform V2X communication.
- V2X communication refers to communication between the vehicle and others, such as vehicle-to-vehicle communication with other vehicles, vehicle-to-infrastructure communication with roadside devices, vehicle-to-home communication with a home, and vehicle-to-pedestrian communication with a terminal carried by a pedestrian, etc.
- the communication unit 22 can, for example, receive from the outside a program for updating software that controls the operation of the vehicle control system 11 (Over the Air).
- the communication unit 22 can further receive map information, traffic information, information about the surroundings of the vehicle 1, etc. from the outside.
- the communication unit 22 can also transmit information about the vehicle 1 and information about the surroundings of the vehicle 1 to the outside.
- Information about the vehicle 1 that the communication unit 22 transmits to the outside includes, for example, data indicating the state of the vehicle 1, the recognition results by the recognition unit 73, etc.
- the communication unit 22 performs communication corresponding to a vehicle emergency notification system such as e-Call.
- the communication unit 22 receives electromagnetic waves transmitted by a road traffic information and communication system (VICS (Vehicle Information and Communication System) (registered trademark)) such as a radio beacon, optical beacon, or FM multiplex broadcasting.
- the communication unit 22 can communicate with each device in the vehicle using, for example, wireless communication.
- the communication unit 22 can perform wireless communication with each device in the vehicle using a communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed via wireless communication, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB).
- the communication unit 22 can also communicate with each device in the vehicle using wired communication.
- the communication unit 22 can communicate with each device in the vehicle using wired communication via a cable connected to a connection terminal (not shown).
- the communication unit 22 can communicate with each device in the vehicle using a communication method that allows digital two-way communication at a communication speed equal to or higher than a predetermined speed via wired communication, such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface) (registered trademark), or MHL (Mobile High-definition Link).
- the in-vehicle device that communicates with the communication unit 22 refers to, for example, a device that is not connected to the communication network 41 inside the vehicle.
- Examples of in-vehicle devices include mobile devices and wearable devices carried by users inside the vehicle, such as the driver, and information devices that are brought into the vehicle and temporarily installed.
- the map information storage unit 23 stores one or both of a map acquired from an external source and a map created by the vehicle 1.
- the map information storage unit 23 stores a three-dimensional high-precision map, a global map that is less accurate than a high-precision map and covers a wide area, etc.
- High-precision maps include, for example, dynamic maps, point cloud maps, and vector maps.
- a dynamic map is, for example, a map consisting of four layers of dynamic information, semi-dynamic information, semi-static information, and static information, and is provided to the vehicle 1 from an external server or the like.
- a point cloud map is a map made up of a point cloud (point cloud data).
- a vector map is, for example, a map that is adapted for driving automation by associating traffic information such as the positions of lanes and traffic lights with a point cloud map.
- the point cloud map and vector map may be provided, for example, from an external server, or may be created by the vehicle 1 based on sensing results from the camera 51, radar 52, LiDAR 53, etc. as a map for matching with a local map described below, and stored in the map information storage unit 23.
- in order to reduce communication traffic, map data of, for example, an area several hundred meters square along the planned route on which the vehicle 1 will travel is acquired from the external server or the like.
- the location information acquisition unit 24 receives GNSS signals from Global Navigation Satellite System (GNSS) satellites and acquires location information of the vehicle 1.
- the acquired location information is supplied to the driving automation control unit 29.
- the location information acquisition unit 24 is not limited to a method using GNSS signals, and may acquire location information using a beacon, for example.
- the external recognition sensor 25 includes various sensors used to recognize the situation outside the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11.
- the type and number of sensors included in the external recognition sensor 25 are arbitrary.
- the external recognition sensor 25 includes a camera 51, a radar 52, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 53, and an ultrasonic sensor 54.
- the external recognition sensor 25 may be configured to include one or more types of sensors among the camera 51, the radar 52, the LiDAR 53, and the ultrasonic sensor 54.
- the number of cameras 51, radars 52, LiDAR 53, and ultrasonic sensors 54 is not particularly limited as long as it is a number that can be realistically installed on the vehicle 1.
- the types of sensors included in the external recognition sensor 25 are not limited to this example, and the external recognition sensor 25 may include other types of sensors. Examples of the sensing areas of each sensor included in the external recognition sensor 25 will be described later.
- the imaging method of camera 51 is not particularly limited.
- cameras of various imaging methods such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, and an infrared camera, which are imaging methods capable of distance measurement, can be applied to camera 51 as necessary.
- the present invention is not limited to this, and camera 51 may simply be used for acquiring photographic images, without regard to distance measurement.
- the external recognition sensor 25 can be equipped with an environmental sensor for detecting the environment relative to the vehicle 1.
- the environmental sensor is a sensor for detecting the environment such as the weather, climate, brightness, etc., and can include various sensors such as a raindrop sensor, a fog sensor, a sunlight sensor, a snow sensor, an illuminance sensor, etc.
- the external recognition sensor 25 includes a microphone that is used to detect sounds around the vehicle 1 and the location of sound sources.
- the in-vehicle sensor 26 includes various sensors for detecting information inside the vehicle, and supplies sensor data from each sensor to each part of the vehicle control system 11. There are no particular limitations on the types and number of the various sensors included in the in-vehicle sensor 26, so long as they are of the types and number that can be realistically installed in the vehicle 1.
- the in-vehicle sensor 26 may be equipped with one or more types of sensors including a camera, radar, a seating sensor, a steering wheel sensor, a microphone, and a biometric sensor.
- the camera equipped in the in-vehicle sensor 26 may be a camera using various imaging methods capable of measuring distances, such as a ToF camera, a stereo camera, a monocular camera, or an infrared camera. Without being limited to this, the camera equipped in the in-vehicle sensor 26 may be a camera simply for acquiring captured images, regardless of distance measurement.
- the biometric sensor equipped in the in-vehicle sensor 26 is provided, for example, on a seat, steering wheel, etc., and detects various types of biometric information of the user.
- the vehicle sensor 27 includes various sensors for detecting the state of the vehicle 1, and supplies sensor data from each sensor to each part of the vehicle control system 11. There are no particular limitations on the types and number of the various sensors included in the vehicle sensor 27, so long as they are of the types and number that can be realistically installed on the vehicle 1.
- the vehicle sensor 27 includes a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU) that integrates these.
- the vehicle sensor 27 includes a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the amount of accelerator pedal operation, and a brake sensor that detects the amount of brake pedal operation.
- the vehicle sensor 27 includes a rotation sensor that detects the number of rotations of the engine or motor, an air pressure sensor that detects the air pressure of the tires, a slip ratio sensor that detects the slip ratio of the tires, and a wheel speed sensor that detects the rotation speed of the wheels.
- the vehicle sensor 27 includes a battery sensor that detects the remaining charge and temperature of the battery, and an impact sensor that detects external impacts.
- the memory unit 28 includes at least one of a non-volatile storage medium and a volatile storage medium, and stores data and programs.
- the memory unit 28 may employ, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory), and as the storage medium, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device may be applied.
- the memory unit 28 stores various programs and data used by each part of the vehicle control system 11.
- the memory unit 28 includes an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and stores information about the vehicle 1 before and after an event such as an accident, and information acquired by the in-vehicle sensor 26.
- the driving automation control unit 29 controls the driving automation function of the vehicle 1.
- the driving automation control unit 29 includes an analysis unit 61, an action planning unit 62, and an operation control unit 63.
- the analysis unit 61 performs analysis processing of the vehicle 1 and the surrounding conditions.
- the analysis unit 61 includes a self-position estimation unit 71, a sensor fusion unit 72, and a recognition unit 73.
- the self-position estimation unit 71 estimates the self-position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map stored in the map information storage unit 23. For example, the self-position estimation unit 71 generates a local map based on the sensor data from the external recognition sensor 25, and estimates the self-position of the vehicle 1 by matching the local map with the high-precision map.
- the position of the vehicle 1 is based on, for example, the center of the axle of the rear wheel pair.
- the local map is, for example, a three-dimensional high-precision map or an occupancy grid map created using technology such as SLAM (Simultaneous Localization and Mapping).
- the three-dimensional high-precision map is, for example, the point cloud map described above.
- the occupancy grid map is a map in which the three-dimensional or two-dimensional space around the vehicle 1 is divided into grids of a predetermined size, and the occupancy state of objects is shown on a grid-by-grid basis.
- the occupancy state of objects is indicated, for example, by the presence or absence of an object and the probability of its existence.
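- As a rough illustration of the occupancy grid map described above, the following Python sketch divides the space around the vehicle into fixed-size cells and marks a crude occupancy probability per cell. The grid size, cell resolution, and the hit-count-based probability are assumptions for illustration, not values taken from this disclosure.

```python
import numpy as np

def build_occupancy_grid(points_xy, grid_size_m=40.0, cell_m=0.5):
    """Divide the area around the vehicle into fixed-size cells and mark,
    per cell, the probability that an object occupies it.
    points_xy: (N, 2) array of obstacle points in vehicle coordinates (meters)."""
    n_cells = int(grid_size_m / cell_m)
    hits = np.zeros((n_cells, n_cells), dtype=np.int32)

    # Shift coordinates so the vehicle sits at the grid center.
    idx = np.floor((points_xy + grid_size_m / 2.0) / cell_m).astype(int)
    valid = np.all((idx >= 0) & (idx < n_cells), axis=1)
    for ix, iy in idx[valid]:
        hits[iy, ix] += 1

    # Crude occupancy probability: saturate after a few hits per cell.
    occupancy = np.clip(hits / 3.0, 0.0, 1.0)
    return occupancy

# Example: three LiDAR-like points ahead of and to the right of the vehicle.
grid = build_occupancy_grid(np.array([[5.0, 0.2], [5.1, 0.3], [12.0, -4.0]]))
print(grid.shape)  # (80, 80)
```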
- the local map is also used, for example, in detection processing and recognition processing of the situation outside the vehicle 1 by the recognition unit 73.
- the self-position estimation unit 71 may estimate the self-position of the vehicle 1 based on the position information acquired by the position information acquisition unit 24 and the sensor data from the vehicle sensor 27.
- the sensor fusion unit 72 performs sensor fusion processing to obtain information by combining multiple different types of sensor data (e.g., image data supplied from the camera 51 and sensor data supplied from the radar 52). Methods for combining different types of sensor data include compounding, integration, fusion, and association.
- the recognition unit 73 executes a detection process to detect the situation outside the vehicle 1, and a recognition process to recognize the situation outside the vehicle 1.
- the recognition unit 73 performs detection and recognition processing of the situation outside the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, etc.
- the recognition unit 73 performs detection processing and recognition processing of objects around the vehicle 1.
- Object detection processing is, for example, processing to detect the presence or absence, size, shape, position, and movement (for example, the content of the action, direction of movement, and speed of movement) of an object.
- Object recognition processing is, for example, processing to recognize attributes such as the type of object, and to identify a specific object.
- detection processing and recognition processing are not necessarily clearly separated, and there may be overlap.
- the recognition unit 73 detects objects around the vehicle 1 by performing clustering to classify a point cloud based on sensor data from the radar 52, the LiDAR 53, or the like into clusters of points. This allows the presence or absence, size, shape, and position of objects around the vehicle 1 to be detected.
- the recognition unit 73 detects the movement of objects around the vehicle 1 by performing tracking to follow the movement of clusters of point clouds classified by clustering. This allows the speed and direction of travel (movement vector) of objects around the vehicle 1 to be detected.
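- The following sketch illustrates one way such clustering and tracking could be implemented, using DBSCAN from scikit-learn for clustering and nearest-centroid matching between frames to derive a movement vector. The library choice, parameters, and matching rule are assumptions; the disclosure does not specify a particular algorithm.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_centroids(points_xy, eps=0.8, min_samples=5):
    """Group a point cloud into clusters and return one centroid per object."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
    return [points_xy[labels == k].mean(axis=0) for k in set(labels) if k != -1]

def track(prev_centroids, curr_centroids, dt, max_jump=2.0):
    """Match each current cluster to the nearest previous one and derive a
    movement vector (m/s). Unmatched clusters get zero velocity."""
    tracks = []
    for c in curr_centroids:
        if prev_centroids:
            dists = [np.linalg.norm(c - p) for p in prev_centroids]
            j = int(np.argmin(dists))
            if dists[j] < max_jump:
                tracks.append((c, (c - prev_centroids[j]) / dt))
                continue
        tracks.append((c, np.zeros(2)))
    return tracks
```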
- the recognition unit 73 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, etc. based on image data supplied from the camera 51.
- the recognition unit 73 may also recognize the types of objects around the vehicle 1 by performing recognition processing such as semantic segmentation.
- the recognition unit 73 can perform recognition processing of traffic rules around the vehicle 1 based on the map stored in the map information storage unit 23, the result of self-location estimation by the self-location estimation unit 71, and the result of recognition of objects around the vehicle 1 by the recognition unit 73. Through this processing, the recognition unit 73 can recognize the positions and states of traffic lights, the contents of traffic signs and road markings, the contents of traffic regulations, and lanes on which travel is possible, etc.
- the recognition unit 73 can perform recognition processing of the environment around the vehicle 1.
- the surrounding environment that the recognition unit 73 recognizes may include weather, temperature, humidity, brightness, and road surface conditions.
- the action planning unit 62 creates an action plan for the vehicle 1. For example, the action planning unit 62 creates the action plan by performing route planning and route following processing.
- Route planning includes global path planning and local path planning.
- Global path planning involves planning a rough route from the start to the goal.
- Local path planning is also called trajectory planning, and involves generating a trajectory that allows safe and smooth progress in the vicinity of vehicle 1 on the planned route, taking into account the motion characteristics of vehicle 1.
- Path following is a process of planning operations for safely and accurately traveling along a route planned by a route plan within a planned time.
- the action planning unit 62 can, for example, calculate the target speed and target angular velocity of the vehicle 1 based on the results of this path following process.
- the operation control unit 63 controls the operation of the vehicle 1 to realize the action plan created by the action planning unit 62.
- the operation control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 included in the vehicle control unit 32 described below, and performs lateral vehicle motion control and longitudinal vehicle motion control so that the vehicle 1 proceeds along the trajectory calculated by the trajectory plan.
- the operation control unit 63 performs control aimed at driving automation, for example, driver assistance functions such as collision avoidance or impact mitigation, following driving, vehicle speed maintenance, collision warning for the own vehicle, and lane departure warning for the own vehicle, as well as driving without the operation of the driver or a remote driver.
- the DMS 30 performs processes such as authenticating the driver and recognizing the driver's state based on the sensor data from the in-vehicle sensors 26 and the input data input to the HMI 31 (described later).
- Examples of the driver's state to be recognized include physical condition, alertness, concentration, fatigue, line of sight, level of intoxication, driving operation, posture, etc.
- the DMS 30 may also perform authentication processing for users other than the driver and recognition processing for the status of the users.
- the DMS 30 may also perform recognition processing for the status inside the vehicle based on sensor data from the in-vehicle sensor 26. Examples of the status inside the vehicle that may be recognized include temperature, humidity, brightness, odor, etc.
- the HMI 31 receives input of various data, instructions, and the like, and presents various data to the user.
- the HMI 31 is equipped with an input device that allows a person to input data.
- the HMI 31 generates input signals based on data and instructions input via the input device, and supplies the signals to each part of the vehicle control system 11.
- the HMI 31 is equipped with input devices such as a touch panel, buttons, switches, and levers. Without being limited to these, the HMI 31 may further be equipped with an input device that allows information to be input by a method other than manual operation, such as voice or gestures.
- the HMI 31 may use, as an input device, an externally connected device such as a remote control device that uses infrared or radio waves, or a mobile device or wearable device that supports the operation of the vehicle control system 11.
- the HMI 31 generates visual information, auditory information, and tactile information for the user or the outside of the vehicle.
- the HMI 31 also performs output control to control the output, output content, output timing, output method, etc. of each piece of generated information.
- the HMI 31 generates and outputs, as visual information, information indicated by images or light, such as an operation screen, vehicle 1 status display, warning display, and monitor image showing the situation around the vehicle 1.
- the HMI 31 also generates and outputs, as auditory information, information indicated by sounds, such as voice guidance, warning sounds, and warning messages.
- the HMI 31 also generates and outputs, as tactile information, information that is imparted to the user's sense of touch by force, vibration, movement, etc.
- the output device to which the HMI 31 outputs visual information may be, for example, a display device that displays an image to present visual information, or a projector device that projects an image to present visual information.
- the display device may be, in addition to a display device having a normal display, a head-up display, a transparent display, or a display with an AR (Augmented Reality) function.
- the display device may be a device that displays visual information within the user's field of vision, such as a wearable device with an AR function. Examples of wearable devices include glasses-type displays, smartphones, and smart watches.
- the HMI 31 may also use display devices such as a navigation device, an instrument panel, a CMS (Camera Monitoring System), an electronic mirror, and lamps installed in the vehicle 1 as output devices to output visual information.
- the HMI 31 may also use, as output devices to output visual information, display devices added to the vehicle 1 afterwards, such as an on-dash monitor, a tablet terminal, a drive recorder, and a display for a drive recorder.
- Output devices that output visual information are installed on the front of the dashboard in front of the driver's seat and passenger seat, on the console between the driver's seat and passenger seat, on the windshield, the steering wheel, on the back of the driver's seat and passenger seat, etc.
- the output device through which the HMI 31 outputs auditory information can be, for example, an audio speaker, headphones, or earphones.
- Haptic elements using haptic technology can be used as an output device for the HMI 31 to output haptic information.
- Haptic elements are provided on parts that the user touches, such as the steering wheel and the seat.
- haptic information may also be output by a smartphone, smartwatch, etc., carried by the user.
- the vehicle control unit 32 controls each part of the vehicle 1.
- the vehicle control unit 32 includes a steering control unit 81, a brake control unit 82, a drive control unit 83, a body system control unit 84, a light control unit 85, and a horn control unit 86.
- the steering control unit 81 detects and controls the state of the steering system of the vehicle 1.
- the steering system includes, for example, a steering mechanism including a steering wheel, an electric power steering, etc.
- the steering control unit 81 includes, for example, a steering ECU that controls the steering system, an actuator that drives the steering system, etc.
- the brake control unit 82 detects and controls the state of the brake system of the vehicle 1.
- the brake system includes, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, etc.
- the brake control unit 82 includes, for example, a brake ECU that controls the brake system, and an actuator that drives the brake system.
- the drive control unit 83 detects and controls the state of the drive system of the vehicle 1.
- the drive system includes, for example, an accelerator pedal, a drive force generating device for generating drive force such as an internal combustion engine or a drive motor, and a drive force transmission mechanism for transmitting the drive force to the wheels.
- the drive control unit 83 includes, for example, a drive ECU for controlling the drive system, and an actuator for driving the drive system.
- the body system control unit 84 detects and controls the state of the body system of the vehicle 1.
- the body system includes, for example, a keyless entry system, a smart key system, a power window device, a power seat, an air conditioning system, an airbag, a seat belt, a shift lever, etc.
- the body system control unit 84 includes, for example, a body system ECU that controls the body system, an actuator that drives the body system, etc.
- the light control unit 85 detects and controls the state of various lights of the vehicle 1. Examples of lights to be controlled include headlights, backlights, fog lights, turn signals, brake lights, projection, and bumper displays.
- the light control unit 85 includes a light ECU that controls the lights, an actuator that drives the lights, and the like.
- the horn control unit 86 detects and controls the state of the car horn of the vehicle 1.
- the horn control unit 86 includes, for example, a horn ECU that controls the car horn, an actuator that drives the car horn, etc.
- FIG. 2 is a diagram showing an example of a sensing area by the camera 51, radar 52, LiDAR 53, ultrasonic sensor 54, etc. of the external recognition sensor 25 in FIG. 1. Note that FIG. 2 shows a schematic view of the vehicle 1 as seen from above, with the left end side being the front end of the vehicle 1 and the right end side being the rear end of the vehicle 1.
- Sensing area 101F and sensing area 101B show examples of sensing areas of ultrasonic sensors 54. Sensing area 101F covers the periphery of the front end of vehicle 1 with multiple ultrasonic sensors 54. Sensing area 101B covers the periphery of the rear end of vehicle 1 with multiple ultrasonic sensors 54.
- sensing results in sensing area 101F and sensing area 101B are used, for example, for parking assistance for vehicle 1.
- Sensing area 102F to sensing area 102B show examples of sensing areas of a short-range or medium-range radar 52. Sensing area 102F covers a position farther in front of the vehicle 1 than sensing area 101F. Sensing area 102B covers a position farther in the rear of the vehicle 1 than sensing area 101B. Sensing area 102L covers the rear periphery of the left side of the vehicle 1. Sensing area 102R covers the rear periphery of the right side of the vehicle 1.
- the sensing results in sensing area 102F are used, for example, to detect vehicles, pedestrians, etc., that are in front of vehicle 1.
- the sensing results in sensing area 102B are used, for example, for collision prevention functions behind vehicle 1.
- the sensing results in sensing area 102L and sensing area 102R are used, for example, to detect objects in blind spots to the sides of vehicle 1.
- Sensing area 103F to sensing area 103B show examples of sensing areas by camera 51. Sensing area 103F covers a position farther in front of vehicle 1 than sensing area 102F. Sensing area 103B covers a position farther in the rear of vehicle 1 than sensing area 102B. Sensing area 103L covers the periphery of the left side of vehicle 1. Sensing area 103R covers the periphery of the right side of vehicle 1.
- the sensing results in sensing area 103F can be used, for example, for recognizing traffic lights and traffic signs, lane departure prevention support systems, and automatic headlight control systems.
- the sensing results in sensing area 103B can be used, for example, for parking assistance and surround view systems.
- the sensing results in sensing area 103L and sensing area 103R can be used, for example, for surround view systems.
- Sensing area 104 shows an example of the sensing area of LiDAR 53. Sensing area 104 covers a position farther in front of vehicle 1 than sensing area 103F. On the other hand, sensing area 104 has a narrower range in the left-right direction than sensing area 103F.
- the sensing results in the sensing area 104 are used, for example, to detect objects such as surrounding vehicles.
- Sensing area 105 shows an example of the sensing area of long-range radar 52. Sensing area 105 covers a position farther in front of vehicle 1 than sensing area 104. On the other hand, sensing area 105 has a narrower range in the left-right direction than sensing area 104.
- the sensing results in the sensing area 105 are used, for example, for ACC (Adaptive Cruise Control), emergency braking, collision avoidance, etc.
- the sensing areas of the cameras 51, radar 52, LiDAR 53, and ultrasonic sensors 54 included in the external recognition sensor 25 may have various configurations other than those shown in FIG. 2. Specifically, the ultrasonic sensor 54 may also sense the sides of the vehicle 1, and the LiDAR 53 may sense the rear of the vehicle 1.
- the installation positions of the sensors are not limited to the examples described above. The number of sensors may be one or more.
- FIG. 3 is a block diagram showing an example configuration of a vehicle control system 11 to which this technology is applied.
- the vehicle control system 11 in FIG. 3 includes the above-mentioned configuration (external recognition sensor 25 and recognition unit 73), as well as an information processing unit 201 that calculates the danger level of objects around the vehicle 1 and generates audio data for a notification sound indicating the danger level of the object, a sound material database 202, and a sound presentation unit 203 that presents the notification sound.
- FIG. 3 shows the configuration of the portion of the vehicle control system 11 that is involved in presenting the notification sound.
- the external recognition sensor 25 includes a number of microphones arranged around the vehicle 1, for example at the outer edge of the vehicle 1, similar to the camera 51, radar 52, LiDAR 53, ultrasonic sensor 54, etc.
- the external recognition sensor 25 may include a microphone array consisting of a number of microphones. Using the multiple microphones, the external recognition sensor 25 can collect sounds around the vehicle 1 by separating them according to the direction from which the sounds are coming.
- the recognition unit 73 acquires sensor data from the external recognition sensor 25, and performs recognition processing of objects around the vehicle 1 based on the sensor data using AI (Artificial Intelligence) trained using deep learning or other machine learning, neural networks, etc. Through the recognition processing, the recognition unit 73 recognizes, for example, the type, attributes, state, position, orientation, speed, and size of objects around the vehicle 1.
- the type of object indicates whether the object is a pedestrian, a vehicle, a ball, a guardrail, a road sign, an obstacle, etc.
- the recognition unit 73 recognizes not only moving objects such as pedestrians, vehicles, and balls, but also stationary objects such as guardrails, signs, and obstacles.
- if the type of object is a pedestrian, the object attributes indicate, for example, whether the pedestrian is a child or an elderly person. If the type of object is a vehicle, the object attributes indicate, for example, the model and color of the vehicle.
- the object state indicates, for example, that a pedestrian as an object is walking while looking at a smartphone, that the brake lights of a vehicle as an object are on, etc.
- the recognition unit 73 supplies the recognition results of the objects around the vehicle 1 to the information processing unit 201. Note that the recognition unit 73 may obtain information about the objects around the vehicle 1 from equipment other than the vehicle 1 via the communication unit 22, and perform recognition processing based on that information.
- the information processing unit 201 is part of the functions of the HMI 31.
- the information processing unit 201 is composed of a prediction unit 211, a danger level calculation unit 212, and a sound generation unit 213.
- the prediction unit 211 predicts the future movement of objects around the vehicle 1 and the movement of the objects in the event of an accident, based on the object recognition results by the recognition unit 73.
- the prediction unit 211 predicts the range that an object will reach in a few seconds under normal circumstances as the object's future movement.
- the range that an object will reach in a few seconds under normal circumstances is basically predicted based on the object's position, orientation, and speed.
- the range that an object will reach in a few seconds under normal circumstances may be predicted based on a time series of the object's movement. For example, the range that an object will reach in a few seconds under normal circumstances may be predicted based on the amount of movement of the object in a recent specified period of time. In addition, the range that an object will reach in a few seconds under normal circumstances may be predicted taking into account the state of the object, such as a pedestrian walking while looking at their smartphone, or a vehicle's brake lights being on.
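- A minimal sketch of this kind of reach-range prediction is shown below: the object's position, orientation, and speed are extrapolated over a short horizon, and a radius expresses how far the object may deviate from that point. The circular range model, the horizon, and the uncertainty factor are illustrative assumptions, not values given in this disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class ObjectState:
    x: float        # position in vehicle coordinates (m)
    y: float
    heading: float  # orientation (rad)
    speed: float    # m/s

def predict_reach(obj: ObjectState, horizon_s: float = 3.0, uncertainty: float = 0.5):
    """Predict where the object will be after `horizon_s` seconds under normal
    circumstances, plus a radius expressing how far it may deviate."""
    cx = obj.x + math.cos(obj.heading) * obj.speed * horizon_s
    cy = obj.y + math.sin(obj.heading) * obj.speed * horizon_s
    # The reachable range grows with speed and horizon (simple heuristic).
    radius = uncertainty * obj.speed * horizon_s + 1.0
    return (cx, cy), radius

# A pedestrian 10 m ahead, walking left across the road at 1.5 m/s.
print(predict_reach(ObjectState(x=10.0, y=0.0, heading=math.pi / 2, speed=1.5)))
```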
- the prediction unit 211 predicts the possibility (certainty) of an accident occurring and the movement of an object if an accident occurs (the range the object will reach in a few seconds) based on the attributes of the object, such as, for example, "children are likely to suddenly run out into the road” or "elderly people are likely to fall while crossing the road.”
- the prediction unit 211 can also predict the possibility of an accident occurring and the movement of an object if an accident occurs, taking into account the position of a stationary object, such as, for example, "because there is a guardrail, it is unlikely that a child will suddenly run out into the road.”
- the prediction unit 211 supplies the danger level calculation unit 212 and the sound generation unit 213 with movement information indicating the future movement of the object and accident information indicating the movement of the object if an accident occurs.
- the accident information includes information indicating the possibility of an accident occurring.
- the accident information may also include prediction difficulty, which is the degree of difficulty in predicting the future movement of the object.
- the danger level calculation unit 212 calculates the danger level of an object based on the object recognition result by the recognition unit 73 and on the movement information and accident information supplied from the prediction unit 211. For example, the danger level of a collision between the object and the vehicle 1 is estimated based on the range the object is predicted to reach in a few seconds and on the moving direction and speed of the vehicle 1.
- the danger level calculation unit 212 supplies the danger level of the object to the sound generation unit 213.
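- One possible way to turn the predicted reach range and the vehicle's own motion into a danger level is sketched below: the ego vehicle is assumed to continue straight ahead, the closest approach to the object's reach circle within the horizon is computed, and the clearance margin is mapped to a score in [0, 1]. The straight-line ego model and the 5 m fade-out are assumptions, not the formula used in the patent.

```python
import math

def danger_level(obj_center, obj_radius, ego_speed, horizon_s=3.0, step_s=0.1):
    """Estimate collision risk in [0, 1] by checking how close the ego vehicle
    (assumed to drive straight along +x) comes to the predicted reach circle of
    the object within the horizon."""
    cx, cy = obj_center
    min_margin = float("inf")
    t = 0.0
    while t <= horizon_s:
        ego_x = ego_speed * t
        margin = math.hypot(cx - ego_x, cy) - obj_radius
        min_margin = min(min_margin, margin)
        t += step_s
    # Margin <= 0 means the paths overlap; fade the risk out over 5 m of clearance.
    return max(0.0, min(1.0, 1.0 - min_margin / 5.0))
```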
- the sound generation unit 213 generates audio data for a notification sound based on the movement information and accident information supplied from the prediction unit 211, and on the degree of danger of the object supplied from the danger level calculation unit 212.
- the sound generation unit 213 basically acquires the sound coming from the position of the object (the direction in which the object is located as viewed from the vehicle 1) from the external recognition sensor 25. If the object is a car, for example, the engine sound is acquired; if the object is a pedestrian, for example, the footsteps are acquired; if the object is a bicycle, for example, the sound of the chain is acquired.
- the sound generation unit 213 causes the sound coming from the position of the object (the sound actually emitted from the object) to be output from the sound presentation unit 203 as a notification sound.
- if the object is an electric vehicle or a pedestrian, the sound generation unit 213 may not be able to capture the engine noise or footsteps emitted by the electric vehicle or pedestrian because the volume of these sounds is low. Also, if the object is a ball, the ball may not actually emit much sound, so, for example, the sound of the ball rolling may not be captured. If sounds emitted from objects around the vehicle 1 cannot be captured, the sound generation unit 213 generates audio data of a sound that evokes the type of object, based on audio data of previously prepared sounds obtained from the sound material database 202.
- for example, audio data is generated for a "beep beep" sound reminiscent of a car horn, an engine-like sound reminiscent of a motorcycle, a "ring ring" sound reminiscent of a bicycle bell, a footstep-like sound reminiscent of a pedestrian, and a rolling sound reminiscent of a ball.
- the sound reminiscent of a car may be a pre-recorded sound actually made by a car, a pre-recorded sound made by another object, or an electronic sound.
- the sound generation unit 213 causes the sound presentation unit 203 to output a sound that evokes the type of object as a notification sound. In this way, the sound generation unit 213 functions as a presentation control unit that controls the sound presentation unit 203 to present the notification sound to the user of the vehicle 1.
- the sound generation unit 213 controls the volume of the notification sound for notifying the user of an object based on the danger level of the object. For example, the sound generation unit 213 sets the volume of the notification sound for notifying the user of an object whose danger level is higher than the first threshold to a high volume, and sets the volume of the notification sound for notifying the user of an object whose danger level is lower than the first threshold to a low volume.
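- The threshold-based volume control described above could look roughly like the following; the threshold value and the two gain levels are assumed for illustration only.

```python
def notification_volume(danger_level, first_threshold=0.5):
    """Set the notification sound loud for objects whose danger level is above
    the first threshold and quiet for objects below it (threshold and gain
    values are assumptions)."""
    return 1.0 if danger_level > first_threshold else 0.3
```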
- a warning sound such as a beep may be synthesized with a sound actually emitted by the object or a sound reminiscent of the type of object, and this synthesized sound may be output as a notification sound.
- the sound generation unit 213 controls the position of the sound image (virtual sound source) of the notification sound based on the position of the object.
- the sound material database 202 contains pre-recorded audio data of various sounds that evoke different types of objects.
- the sound presentation unit 203 is composed of, for example, multiple speakers installed inside the vehicle.
- the sound presentation unit 203 presents the notification sound to the user of the vehicle 1 by outputting the notification sound according to the control of the sound generation unit 213.
- All or part of the functions of the recognition unit 73 and the information processing unit 201 may be provided in a device other than the vehicle 1, such as a cloud.
- the processing performed by the vehicle control system 11 having the above configuration will be described with reference to the flowchart in FIG. 4.
- the processing in FIG. 4 is started, for example, when an operation is performed to start the vehicle 1 and start driving, for example, when the ignition switch, power switch, start switch, etc. of the vehicle 1 is turned on.
- the processing in FIG. 4 is ended, for example, when an operation is performed to end driving of the vehicle 1, for example, when the ignition switch, power switch, start switch, etc. of the vehicle 1 is turned off.
- In step S1, the external recognition sensor 25 senses the situation around the vehicle 1.
- In step S2, the recognition unit 73 recognizes objects around the vehicle 1 based on the sensor data from the external recognition sensor 25.
- In step S3, the information processing unit 201 determines whether or not an object has been recognized by the recognition unit 73.
- If it is determined in step S3 that an object has not been recognized, the process returns to step S1, and the situation around the vehicle 1 is sensed until an object is recognized.
- If it is determined in step S3 that an object has been recognized, the process proceeds to step S4.
- In step S4, the prediction unit 211 predicts the future movement of the object recognized by the recognition unit 73.
- In step S5, the danger level calculation unit 212 calculates the danger level of the object recognized by the recognition unit 73 based on the recognition result by the recognition unit 73 and the prediction result by the prediction unit 211.
- In step S6, the sound generation unit 213 generates audio data for a notification sound based on the degree of danger.
- In step S7, the sound presentation unit 203 presents a notification sound to the user of the vehicle 1 in accordance with the control of the sound generation unit 213. After the notification sound is presented in step S7, the process returns to step S1, and the subsequent processes are repeated.
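- The steps S1 to S7 could be wired together roughly as in the following skeleton. The unit interfaces (sense, recognize, predict, calculate, generate, present) are hypothetical placeholders for the components described above, not an API defined by this disclosure.

```python
def notification_loop(external_sensor, recognizer, predictor, danger_calc,
                      sound_generator, sound_presenter, driving_active):
    """Rough skeleton of steps S1-S7: sense, recognize, predict, score,
    generate a notification sound, and present it, repeated while driving."""
    while driving_active():
        sensor_data = external_sensor.sense()                  # S1
        objects = recognizer.recognize(sensor_data)            # S2
        if not objects:                                        # S3: nothing recognized
            continue
        for obj in objects:                                    # S4-S7 per object
            motion = predictor.predict(obj)
            danger = danger_calc.calculate(obj, motion)
            audio = sound_generator.generate(obj, motion, danger)
            sound_presenter.present(audio)
```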
- Figure 5 shows an example of speaker placement.
- the sound presentation unit 203 includes eight speakers 203F to 203B. Note that in FIG. 5, the positions of the speakers 203F to 203B are shown diagrammatically by dotted circles.
- Speakers 203F to 203B are arranged in the interior of vehicle 1 so as to surround driver D1 who is sitting in the driver's seat.
- speaker 203FL is disposed to the left front of driver D1, near the front edge of the passenger door of vehicle 1
- speaker 203FR is disposed to the right front of driver D1, near the front edge of the driver door of vehicle 1.
- Speaker 203F is disposed midway between speaker 203FL and speaker 203FR, near the center of the dashboard of vehicle 1.
- Speaker 203BL is located to the left rear of driver D1 and slightly forward of the center of the left rear door of vehicle 1
- speaker 203BR is located to the right rear of driver D1 and slightly forward of the center of the right rear door of vehicle 1.
- Speaker 203L is located midway between speaker 203FL and speaker 203BL, near the rear end of the passenger door of vehicle 1.
- Speaker 203R is located midway between speaker 203FR and speaker 203BR, near the rear end of the driver's door of vehicle 1.
- Speaker 203B is located midway between speakers 203BL and 203BR, near the center of the rear seat. Speakers 203F and 203B, and speakers 203L and 203R, are positioned to face each other. Speakers 203F through 203B are all installed to face the interior of vehicle 1.
- Speakers 203F to 203B are used, for example, to output sound to the entire vehicle interior (all passengers in the vehicle). Speakers 203F to 203B realize 360 Reality Audio (registered trademark, hereafter referred to as 360RA) and stereophonic sound.
- 360RA is a technology that places the sound image (virtual sound source) of each sound individually at any position within a spherical space, and controls the sound output so that the listener feels that each sound is coming from the direction of each sound image.
- presenting a notification sound using 360RA it is possible to make the passenger aware of, for example, the position of an object present around the vehicle 1 (the direction of the object as seen from the vehicle 1).
- the number and positions of speakers arranged in vehicle 1 are not limited to the number and positions described with reference to FIG. 5.
- stereo speakers or nine or less surround speakers may be used to realize three-dimensional sound. It is also possible to apply this technology to a system that can localize a sound image only at any position within a predetermined range, rather than in all directions.
- Three-dimensional sound may be realized by a single speaker, or the technology may be applied to a vehicle equipped with a single speaker that does not support three-dimensional sound.
- Figure 6 shows another example of speaker placement.
- FIG. 6 is a perspective view showing the interior of vehicle 1 as viewed from the left. In the figure, parts corresponding to those in FIG. 5 are given the same reference numerals.
- speaker 203FLU (not shown), speaker 203FRU, speaker 203FLD (not shown), speaker 203FRD, speaker 203BLD (not shown), and speaker 203BRD have been added compared to the example of FIG. 5.
- Speaker 203FRU is located higher than speaker 203FR and is positioned near the upper right edge of the windshield of vehicle 1, and speaker 203FRD is located below speaker 203FR and near the bottom edge of the driver's door. Speaker 203BRD is located below speaker 203BR and near the bottom edge of the right rear seat door.
- speaker 203FLU, speaker 203FLD, and speaker 203BLD are positioned on the left side of the room, approximately opposite speaker 203FRU, speaker 203FRD, and speaker 203BRD.
- Speaker 203FLU, speaker 203FRU, speaker 203FLD, speaker 203FRD, speaker 203BLD, and speaker 203BRD are all installed facing the interior of vehicle 1.
- Figure 7 shows an example of a notification sound output.
- for example, a notification sound is output so that a sound emitted by another vehicle can be heard coming from the direction of that vehicle (to the right front).
- a notification sound is output so that, for example, a sound emitted by the pedestrian P1 can be heard coming from the direction of the pedestrian P1 (left front).
- the sound generation unit 213 generates audio data to be supplied to each speaker of the sound presentation unit 203 so that the sound image of the notification sound for notifying the driver of objects present around the vehicle 1 is localized at a position corresponding to the direction of each object relative to the vehicle 1.
- the sound emitted by the object or a sound reminiscent of the type of object is output so that it can be heard coming from the direction of the object, allowing the user of vehicle 1 to intuitively grasp from which direction and what type of object is approaching vehicle 1.
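- A rough amplitude-panning sketch of this direction-dependent presentation is shown below: each cabin speaker receives a gain that falls off with the angular distance between the speaker and the object's direction, pulling the sound image toward the object. The speaker azimuths and the cosine gain law are assumptions; an actual system would use 360RA or another proper 3D panning method.

```python
import math

# Assumed azimuths (degrees, clockwise from straight ahead) for the eight cabin
# speakers named in the description; actual positions depend on the vehicle.
SPEAKER_AZIMUTH = {
    "203F": 0, "203FR": 45, "203R": 90, "203BR": 135,
    "203B": 180, "203BL": 225, "203L": 270, "203FL": 315,
}

def speaker_gains(object_azimuth_deg, spread_deg=60.0):
    """Give each speaker a gain that falls off with angular distance from the
    object's direction, so the sound image is pulled toward the object."""
    gains = {}
    for name, az in SPEAKER_AZIMUTH.items():
        diff = abs((object_azimuth_deg - az + 180) % 360 - 180)
        gains[name] = math.cos(math.radians(diff)) if diff < spread_deg else 0.0
    # Normalize so the total acoustic power stays roughly constant.
    norm = math.sqrt(sum(g * g for g in gains.values())) or 1.0
    return {name: g / norm for name, g in gains.items()}

# A pedestrian to the left front of the vehicle (about 315 degrees).
print(speaker_gains(315))
```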
- the characteristics of the notification sound may be controlled based on the distance between the vehicle 1 and the object. For example, if an object is present at a distance from the vehicle 1, a notification sound with high-pitched components cut may be output, or the pitch of the notification sound may be raised.
- the height (vertical) position of the sound image of the notification sound may be controlled based on the distance between vehicle 1 and the object. For example, if an object is present at a position far from vehicle 1, the sound image of the notification sound is localized above a reference position, and if the distance between vehicle 1 and the object is medium, the sound image of the notification sound is localized at the reference position. If an object is present at a position close to vehicle 1, the sound image of the notification sound is localized below the reference position.
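- The distance-dependent control described in the two items above could be sketched as follows: distant objects get their notification sound low-pass filtered and the sound image raised above the reference height, nearby objects get a brighter sound and a lowered image. The distance bands, cutoff frequencies, and elevation offsets are illustrative assumptions.

```python
import numpy as np

def shape_notification(sound, sample_rate, distance_m):
    """Cut high-frequency components for distant objects with a simple one-pole
    low-pass filter, and choose a sound-image elevation: above a reference
    height when far, at it when medium, below it when near.
    sound: 1-D numpy array of samples."""
    if distance_m > 30.0:
        cutoff_hz, elevation = 1000.0, +0.3   # far: muffled, image raised
    elif distance_m > 10.0:
        cutoff_hz, elevation = 4000.0, 0.0    # medium: reference height
    else:
        cutoff_hz, elevation = 8000.0, -0.3   # near: bright, image lowered

    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sample_rate)
    filtered = np.empty_like(sound, dtype=float)
    acc = 0.0
    for i, x in enumerate(sound):
        acc += alpha * (x - acc)              # one-pole low-pass filter
        filtered[i] = acc
    return filtered, elevation
```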
- the height position of the sound image of the notification sound may be controlled based on the danger level of the object.
- the position of the sound image of the notification sound may be controlled based on the predicted future movement of the object (movement information). For example, a "whoosh" sound may be output as the notification sound so that the position of the sound image moves along the predicted trajectory of the object's movement.
- the features of the notification sound may be controlled based on the fluctuation in the danger level. For example, if the fluctuation in the danger level is large (if the accuracy of the predicted object movement is low), reverb processing or echo processing that reproduces the reverberation of a cave or the like may be applied to the notification sound.
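- A simple way to reflect prediction uncertainty in the sound, as described above, is to overlay decaying echoes when the recent danger levels fluctuate strongly. The fluctuation measure (standard deviation), delay time, and feedback amount below are assumptions.

```python
import numpy as np

def add_echo_if_uncertain(sound, sample_rate, danger_history,
                          fluctuation_threshold=0.2,
                          delay_s=0.12, feedback=0.45, repeats=4):
    """If recent danger levels fluctuate strongly (prediction is uncertain),
    overlay decaying echoes reminiscent of a cave reverberation.
    sound: 1-D numpy array of samples."""
    if np.std(danger_history) < fluctuation_threshold:
        return sound
    delay = int(delay_s * sample_rate)
    out = np.concatenate([sound.astype(float), np.zeros(delay * repeats)])
    for k in range(1, repeats + 1):
        out[k * delay:k * delay + len(sound)] += (feedback ** k) * sound
    return out
```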
- motion information indicating the future movement of objects around the vehicle 1 is predicted, the degree of danger of the object is calculated based on the motion information, and the features of the notification sound according to the type of object that is presented to the user of the vehicle 1 are controlled based on the degree of danger.
- the features of the notification sound include volume, frequency components, pitch, effects (e.g., reverb and echo), etc.
- The vehicle control system 11 does not simply warn that a dangerous object is present around the vehicle 1, but presents a notification sound so that the user of the vehicle 1 can understand the level of danger of the object, the type of the object, and the direction of the object. Therefore, the vehicle control system 11 can more appropriately present information for avoiding danger to the user of the vehicle 1. This allows the user of the vehicle 1 to intuitively recognize potential dangers, and the vehicle control system 11 can achieve safe driving of the vehicle 1.
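- As a rough sketch of how the calculated danger level could select these features for presentation (the threshold values, the standard volume, and the sound-selection scheme here are assumptions, not values from this publication):

```python
def notification_parameters(danger_level: float,
                            object_type: str,
                            first_threshold: float = 0.4,
                            second_threshold: float = 0.8,
                            standard_volume: float = 0.6) -> dict:
    """Choose a sound matching the object type and scale its volume with the
    danger level; overlay an explicit warning sound only for high danger."""
    params = {
        # e.g. engine noise for a car, footsteps for a pedestrian, a bell for a bicycle
        "sound": f"{object_type}_notification",
        "volume": standard_volume,
        "add_warning_sound": False,
    }
    if danger_level < first_threshold:
        params["volume"] = standard_volume * 0.5            # quieter than standard
    elif danger_level > first_threshold:
        params["volume"] = min(1.0, standard_volume * 1.5)  # louder than standard
    if danger_level > second_threshold:
        params["add_warning_sound"] = True                  # explicit warning on top
    return params
```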
- The sound generation unit 213 can also use an audio signal processing technique to generate audio data so that the notification sound can be heard only by the driver of the vehicle 1. In this case, the notification sound is not heard by users who are not engaged in driving, such as those sitting in the passenger seat or the rear seats, so it is possible to realize a comfortable vehicle interior space for them.
- A notification sound may also be presented to notify the user of an object at a position that is not actually visible from the vehicle 1. For example, at an intersection with poor visibility, when it is predicted that another vehicle will jump out in front of the vehicle 1, a sound emitted by that vehicle or a sound reminiscent of that vehicle is output so as to be heard coming from the direction of that vehicle.
- Example of Noise Canceling
- Depending on the driving environment of the vehicle 1, external sounds may be heard by the user inside the vehicle. For example, when the vehicle 1 is driving near a construction site, the user inside the vehicle may hear the sounds of the construction. In this case, the notification sound presented by the vehicle control system 11 is mixed with the external sounds, making it difficult for the user of the vehicle 1 to hear the notification sound.
- the notification sound may be presented at a louder volume than the external sounds.
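- A minimal sketch of such a volume adjustment, assuming the external sound is captured as normalized digital samples (the 6 dB headroom figure is an assumption):

```python
import numpy as np

def gain_over_external_noise(external_noise: np.ndarray,
                             headroom_db: float = 6.0) -> float:
    """Estimate the level of the external sound and return a playback gain
    that keeps the notification sound audibly louder than that noise."""
    rms = float(np.sqrt(np.mean(np.square(external_noise.astype(np.float64)))))
    noise_db = 20.0 * np.log10(max(rms, 1e-9))      # level of the external sound (dBFS)
    target_db = min(0.0, noise_db + headroom_db)    # stay within full scale
    return 10.0 ** (target_db / 20.0)               # linear gain for playback
```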
- The vehicle control system 11 can also present a notification sound in conjunction with visual information indicating the degree of danger of objects around the vehicle 1.
- Figure 8 shows an example of visual information.
- danger prediction information A21 to A24 indicating the degree of danger of objects around vehicle 1 is superimposed on the captured image showing the surroundings of vehicle 1.
- the danger prediction information A21 indicates, with a color, the danger level of a pedestrian P11 moving backwards as seen by the user of the vehicle 1, and indicates, with a shape, the range into which the pedestrian P11 is predicted to move.
- the danger prediction information A22 indicates, with a color, the danger level of a pedestrian P12 crossing the crosswalk, and indicates, with a shape, the range into which the pedestrian P12 is predicted to move.
- the danger prediction information A23 indicates, with a color, the danger level of a pedestrian P13 about to cross the crosswalk, and indicates, with a shape, the range into which the pedestrian P13 is predicted to move.
- the danger prediction information A24 indicates, with a color, the danger level of a vehicle C11 stopped on the road.
- Points Po1 to Po3 are displayed superimposed on the heads of pedestrians P11 to P13, respectively.
- The captured image with the danger prediction information A21 to A24 and the points Po1 to Po3 superimposed thereon is displayed, for example, on a display mounted inside the vehicle 1.
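- The publication does not give the exact rendering rules for this overlay, but a simple sketch of how a danger level and a predicted movement range might be turned into the color and shape described above could look like this (the color bands and alpha values are assumptions):

```python
def danger_overlay(danger_level: float, predicted_positions) -> dict:
    """Choose an overlay color (RGBA) from the danger level and collect the
    predicted movement of the object as a polygon to draw over the image."""
    if danger_level > 0.7:
        color = (255, 0, 0, 128)       # red: high danger
    elif danger_level > 0.4:
        color = (255, 165, 0, 128)     # orange: medium danger
    else:
        color = (0, 255, 0, 96)        # green: low danger
    # predicted_positions: [(x, y), ...] image coordinates along the predicted path
    return {"color": color, "range_polygon": list(predicted_positions)}
```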
- By presenting a notification sound in conjunction with the captured image and the danger prediction information displayed on the display, it becomes easier for the user of the vehicle 1 to understand the source of the notification sound.
- Figure 9 shows another example of visual information.
- a CG (Computer Graphics) image is displayed that reproduces the vehicle 1 itself and the objects and environment around the vehicle 1.
- the CG image includes a waveform W1 that indicates the direction from which the notification sound will be heard and the volume (level of danger) of the notification sound.
- The waveform W1 is a circular line that surrounds the vehicle 1, and the portion of the line corresponding to the direction from which the notification sound is heard is displayed as wavy. The louder the notification sound, the more strongly the portion of the waveform W1 corresponding to that direction undulates.
- For example, the notification sound notifying the user of the person P21 (on a bicycle) is presented at a high volume, so the portion of the waveform W1 corresponding to the direction of the person P21 relative to the vehicle 1 undulates significantly.
- In the waveform W1, the portion corresponding to the direction of each object relative to the vehicle 1 is displayed in orange, for example, and the other portions are displayed in cyan, for example.
- the color of the portion of the waveform W1 corresponding to the direction of each object relative to the vehicle 1 may be determined based on the degree of danger of each object.
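- A sketch of how the waveform W1 could be computed from the objects' directions, notification-sound volumes, and danger levels is shown below; the ring resolution, bulge size, angular spread, and color choices are assumptions for illustration only.

```python
import math

def waveform_ring(objects, num_points: int = 360, base_radius: float = 1.0,
                  max_bulge: float = 0.3, spread_deg: float = 30.0):
    """Build the ring-shaped waveform: points near an object's direction
    undulate with an amplitude proportional to the notification-sound volume
    and are colored according to the object's danger level.

    `objects` is a list of dicts with 'azimuth_deg', 'volume' (0..1) and
    'danger' (0..1)."""
    ring = []
    for i in range(num_points):
        theta = i * 360.0 / num_points
        radius, color = base_radius, "cyan"          # calm color away from objects
        for obj in objects:
            diff = abs((theta - obj["azimuth_deg"] + 180.0) % 360.0 - 180.0)
            if diff < spread_deg:
                weight = 1.0 - diff / spread_deg     # fade toward the edges
                radius += (max_bulge * obj["volume"] * weight
                           * math.sin(math.radians(theta * 12.0)))
                color = "red" if obj["danger"] > 0.7 else "orange"
        ring.append((theta, radius, color))
    return ring
```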
- In this way, the notification sound is presented in conjunction with a CG image including the waveform W1, which indicates the direction from which the notification sound is heard and the volume (level of danger) of the notification sound, making it easier for the user of the vehicle 1 to understand the source of the notification sound.
- the present technology can be applied to various products.
- the present technology may be realized as a device mounted on any type of moving object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- the above-described series of processes can be executed by hardware or software.
- When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware, or into a general-purpose personal computer, etc.
- FIG. 10 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes by a program.
- the information processing unit 201 is, for example, a PC having a configuration similar to that shown in FIG. 10.
- In the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
- An input/output interface 505 is further connected to the bus 504. Connected to the input/output interface 505 are an input unit 506 consisting of a keyboard, mouse, etc., and an output unit 507 consisting of a display, speakers, etc. Also connected to the input/output interface 505 are a storage unit 508 consisting of a hard disk or non-volatile memory, a communication unit 509 consisting of a network interface, etc., and a drive 510 that drives removable media 511.
- the CPU 501 loads a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504, and executes the program, thereby performing the above-mentioned series of processes.
- the programs executed by the CPU 501 are provided, for example, by being recorded on removable media 511, or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 508.
- the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
- a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
- this technology can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
- each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
- When a single step includes multiple processes, the processes included in that single step can be executed by a single device, or can be shared and executed by multiple devices.
- Example of Combination of Configurations
- The present technology can also have the following configurations.
- (1) An information processing device including: a prediction unit that predicts motion information indicating future motion of objects around a moving body; a calculation unit that calculates a degree of danger of the object based on the motion information; and a presentation control unit that controls, based on the degree of danger, a notification sound that is presented to a user of the moving body and corresponds to the type of the object.
- (2) The notification sound includes a sound emitted from the object.
- (3) The notification sound includes a sound that evokes the type of the object.
- (4) The presentation control unit generates the sound that evokes the type of the object based on a previously prepared sound.
- (5) The information processing device controls a volume of the notification sound based on the degree of danger.
- (6) The presentation control unit presents the notification sound corresponding to the type of an object whose danger level is lower than a first threshold at a volume lower than a standard volume, and presents the notification sound corresponding to the type of an object whose danger level is higher than the first threshold at a volume higher than the standard volume.
- (7) The presentation control unit presents the notification sound including a warning sound when the degree of danger is higher than a second threshold.
- (8) The information processing device controls a position of a sound image of the notification sound based on a position of the object.
- (9) The presentation control unit localizes the sound image at a position corresponding to a direction of the object relative to the moving body.
- (10) The presentation control unit further presents visual information indicating the degree of danger.
- (11) The presentation control unit presents, as the visual information, a CG image that shows the objects around the moving body and includes a waveform indicating the position of the sound image of the notification sound and the volume of the notification sound.
- (12) The information processing device according to any one of (8) to (11), wherein the presentation control unit moves a position of the sound image along a trajectory along which the object is predicted to move.
- (13) The information processing device according to any one of (8) to (12), wherein the presentation control unit controls a position of the sound image in a height direction based on a distance between the moving body and the object.
- (14) The information processing device according to any one of (1) to (13), wherein the presentation control unit presents the notification sound only to a driver of the moving body.
- (15) The information processing device according to any one of (1) to (14), wherein the presentation control unit performs noise canceling for an external sound.
- (16) The prediction unit further predicts accident information indicating a movement of the object when an accident occurs;
Abstract
The present technology relates to an information processing device, an information processing method, and a program that enable information for avoiding danger to be presented more appropriately to a user of a moving body such as a vehicle. The information processing device according to the present technology includes: a prediction unit that predicts motion information indicating future motion of an object around the moving body; a calculation unit that calculates a degree of danger of the object based on the motion information; and a presentation control unit that controls, based on the degree of danger, a notification sound that is presented to a user of the moving body and corresponds to the type of the object. The present technology is applicable to moving bodies such as vehicles, for example.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480029882.6A CN121039722A (zh) | 2023-05-10 | 2024-04-19 | 信息处理装置、信息处理方法和程序 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-077720 | 2023-05-10 | ||
| JP2023077720 | 2023-05-10 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024232245A1 true WO2024232245A1 (fr) | 2024-11-14 |
Family
ID=93429967
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/015586 Pending WO2024232245A1 (fr) | 2023-05-10 | 2024-04-19 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN121039722A (fr) |
| WO (1) | WO2024232245A1 (fr) |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006090988A (ja) * | 2004-09-27 | 2006-04-06 | Fujitsu Ten Ltd | ナビゲーション装置 |
| JP2016172469A (ja) * | 2015-03-16 | 2016-09-29 | 株式会社デンソー | 画像生成装置 |
| WO2017056382A1 (fr) * | 2015-09-29 | 2017-04-06 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
| WO2017060978A1 (fr) * | 2015-10-06 | 2017-04-13 | 株式会社日立製作所 | Dispositif de commande de conduite automatique et procédé de commande de conduite automatique |
| WO2018180523A1 (fr) * | 2017-03-29 | 2018-10-04 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations, et corps mobile |
| JP2020154795A (ja) * | 2019-03-20 | 2020-09-24 | パナソニックIpマネジメント株式会社 | 出力制御装置、出力制御方法、および出力制御プログラム |
| JP2021096539A (ja) * | 2019-12-13 | 2021-06-24 | ヴィオニア スウェーデン エービー | ドライバの注意喚起装置、注意喚起方法及び、これを用いた安全システム |
| JP2022020160A (ja) * | 2020-07-20 | 2022-02-01 | 日立Astemo株式会社 | 車両用運転支援装置 |
| US20220116709A1 (en) * | 2020-10-08 | 2022-04-14 | Valeo North America, Inc. | Method, apparatus, and computer-readable storage medium for providing three-dimensional stereo sound |
| WO2022145286A1 (fr) * | 2021-01-04 | 2022-07-07 | ソニーグループ株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations, programme, dispositif mobile et système de traitement d'informations |
Also Published As
| Publication number | Publication date |
|---|---|
| CN121039722A (zh) | 2025-11-28 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24803359; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2025519366; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2025519366; Country of ref document: JP |