WO2024185680A1 - Driving assistance display device and driving assistance display method - Google Patents
Driving assistance display device and driving assistance display method
- Publication number
- WO2024185680A1 (PCT/JP2024/007743, JP2024007743W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display mode
- driving assistance
- display
- vehicle
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present invention relates to a driving assistance display device and a driving assistance display method.
- known technologies for assisting vehicle driving include a technology that combines actual images from an onboard camera with iconized information about the traffic environment around the vehicle, such as other vehicles and obstacles, and displays it on an onboard monitor.
- HUD: Head-Up Display
- Patent Document 1 discloses a HUD that can improve safety at intersections.
- the HUD in Patent Document 1 projects an image showing an overhead view of the intersection onto the image projection unit at a first timing when the vehicle approaches the intersection to a predetermined distance, and projects an image showing other vehicles superimposed on the actual scene onto the image projection unit at a second timing when the vehicle approaches the intersection even closer than the first timing.
- the HUD in Patent Document 1 is said to allow the driver to recognize other vehicles earlier and to further improve safety at intersections.
- the display formats of driving assistance display devices generally include real images of the traffic environment around the vehicle and icon images that show the traffic environment around the vehicle in icon form.
- Real images allow the driver to grasp the situation around the vehicle accurately, because they reflect reality as it is.
- real images, however, also contain information that is not necessarily required for driving, forcing the driver to choose which information to take in and which to ignore. This places a burden on the driver, so icon images that schematically show mainly the information necessary for driving are widely used.
- constantly displaying information about the traffic environment around the vehicle as icon images may not be ideal for the driver. For example, when approaching an intersection with multiple other vehicles on the approaching roads, displaying those vehicles as icon images produces multiple moving icons, which are difficult for the driver to interpret and are irritating. Also, in poor visibility environments such as at night or in thick fog, it is difficult to match icon images to the actual traffic environment, which may cause the driver stress.
- the present invention has been made in consideration of the above-mentioned problems with the conventional technology, and aims to provide a driving assistance display device and a driving assistance display method that can display road information according to changes in the traffic environment at intersections with other roads in a form that is easily visible to the driver.
- the driving assistance display device of the present invention is characterized in that it has a display unit having two display modes: a first display mode in which an object is displayed as a simplified schematic image, and a second display mode in which the object is displayed as a detailed image using an actual image, and that when a predetermined condition is satisfied, the first display mode and the second display mode are switched.
- the driving assistance display device of the present invention is provided with a display unit having two display modes: a first display mode in which an object is displayed as a simplified schematic image, and a second display mode in which an object is displayed as a detailed image using an actual image.
- when the predetermined condition is satisfied, the display mode switches between the first and second display modes. This makes it possible to display road information according to changes in the traffic environment in the intersection area with other roads in a form that is easy for the driver to see.
- the simplified image is an icon image of the object.
- the display device further includes a control unit that switches between the first display mode and the second display mode, and the predetermined condition is the number of objects, and the control unit switches to the first display mode when the number of objects is equal to or less than the predetermined number, and switches to the second display mode when the number of objects exceeds the predetermined number.
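The count-based switching rule described above can be sketched as follows. The function and constant names are assumptions of this sketch; the claim leaves the "predetermined number" open, so the threshold of 1 simply follows the embodiment described later, which uses an icon for a single object and actual images for multiple objects.

```python
# Hypothetical sketch of the object-count switching rule; names and the
# threshold value are assumptions, not taken verbatim from the patent.
MAX_ICON_OBJECTS = 1

def select_display_mode(num_objects: int) -> str:
    """Return 'A' (simplified icon display) or 'B' (detailed actual image)."""
    if num_objects <= MAX_ICON_OBJECTS:
        return "A"  # first display mode: simplified schematic (icon) image
    return "B"      # second display mode: detailed image using the actual image
```

Under these assumptions, `select_display_mode(1)` selects the icon display and `select_display_mode(3)` selects the actual-image display.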
- the driving assistance display device further includes at least one of an illuminance acquisition unit that acquires illuminance information about the surroundings of the vehicle on which the driving assistance display device is mounted and a weather acquisition unit that acquires weather information about the surroundings, and the predetermined condition is the illuminance acquired by the illuminance acquisition unit or the weather information acquired by the weather acquisition unit, and the control unit switches to the first display mode when the illuminance according to the illuminance information is equal to or greater than a predetermined value or when the weather information indicates that the weather is not bad, and switches to the second display mode when the illuminance according to the illuminance information is less than the predetermined value or when the weather information indicates that the weather is bad.
- the vehicle further includes a position information acquisition unit that acquires position information of a vehicle equipped with the driving assistance display device, a pair of detection units that are arranged on the left and right front sides of the vehicle and detect objects present in a predetermined area, a pair of imaging units that are arranged on the left and right front sides of the vehicle and capture images of the predetermined area to acquire actual images, and a storage unit that stores map information, and the control unit identifies the position of the vehicle using the position information and the map information, and performs a switching operation between the first display mode and the second display mode when the vehicle changes course on the map information.
- the driving assistance display method of the present invention is a driving assistance display method using a driving assistance display device equipped with a display unit having two display modes: a first display mode in which an object is displayed as a simplified schematic image, and a second display mode in which the object is displayed as a detailed image using an actual image, and is characterized in that the first display mode and the second display mode are switched when a predetermined condition is satisfied.
- the present invention provides a driving assistance display device and a driving assistance display method that can display road information according to changes in the traffic environment at intersections with other roads in a format that is easy for the driver to see.
- FIG. 1 is a top view showing an example of a vehicle equipped with a driving assistance display device according to an embodiment.
- FIG. 2 is a block diagram showing an example of a configuration of the driving assistance display device according to the embodiment.
- FIG. 3 is a schematic diagram showing the operating areas of a camera, a first sensor, and a second sensor of the driving assistance display device according to the embodiment.
- FIG. 4 is a flowchart showing an example of the processing flow of the driving assistance display device according to the embodiment.
- FIG. 5(a) is a diagram showing an example of a traffic environment when the driving assistance display device according to the embodiment displays in display mode A, and FIG. 5(b) is a diagram showing an example of the HUD display.
- FIG. 6(a) is a diagram showing an example of a traffic environment when the driving assistance display device according to the embodiment displays in display mode A, and FIG. 6(b) is a diagram showing an example of the HUD display.
- FIG. 7(a) is a diagram showing an example of a traffic environment when the driving assistance display device according to the embodiment displays in display mode B, and FIG. 7(b) is a diagram showing an example of the HUD display.
- FIG. 8(a) is a diagram showing an example of a traffic environment when the driving assistance display device according to the embodiment displays in display mode B, and FIG. 8(b) is a diagram showing an example of the HUD display.
- a driving assistance display device according to this embodiment will be described by exemplifying a form in which it is mounted on a moving body such as a vehicle.
- the display device of the driving assistance display device according to this embodiment is not particularly limited and may be a HUD, a monitor, a meter panel, or the like, but the following embodiment will be described by exemplifying a case in which information is displayed on a HUD.
- the driving support display device presents the driver with information on the traffic environment at the approaching point in advance when the vehicle changes direction, such as when turning right or left at an intersection, making it possible to avoid dangers associated with changing direction as much as possible.
- the driving support display device has two display modes: display mode A (simplified display) which displays icon images on the HUD, and display mode B (detailed display) which displays actual images.
- "Display mode A" and "display mode B" are examples of the "first display mode" and "second display mode" according to the present invention, respectively.
- the driving support display device basically displays in display mode A, and switches to display mode B when a predetermined condition is met.
- the driving support display device detects objects around the vehicle using one or more sensors to determine the traffic environment, but in the following description, moving objects such as other vehicles, motorcycles, and people are referred to as "attention-calling objects," and structures such as buildings, fences, and pillars are referred to as "obstacles" to distinguish them.
- Fig. 1 is a top view showing an example of a vehicle equipped with a driving assistance display device according to the present embodiment.
- the vehicle 30 according to the present embodiment is equipped with a left first sensor 14L, a right first sensor 14R (hereinafter collectively referred to as "first sensor 14"), a left second sensor 15L, a right second sensor 15R (hereinafter collectively referred to as "second sensor 15"), a left camera 17L, a right camera 17R (hereinafter collectively referred to as "camera 17"), and a third sensor 16, which are part of the driving assistance display device 10 (not shown).
- the first sensor 14L and the camera 17L are disposed inside the left headlight 31L, and the first sensor 14R and the camera 17R are disposed inside the right headlight 31R.
- the first sensor 14, the second sensor 15, the third sensor 16, and the camera 17 collect information to support the visibility of the driver 40.
- the camera 17 is an example of an imaging unit according to the present invention.
- FIG. 2 is a block diagram showing an example of the configuration of the driving assistance display device 10.
- in addition to the first sensor 14, second sensor 15, third sensor 16, and camera 17 described above, the driving assistance display device 10 also includes a control unit 11, a position information acquisition unit 18, a memory unit 19, a display unit 20, an IP connection unit 23, and a route information acquisition unit 24.
- the position information acquisition unit 18 includes a receiver (not shown) and can receive GPS information and the like obtained from GPS (Global Positioning System) satellites.
- the GPS information can be used to determine the current location (latitude, longitude, etc.) of the vehicle 30.
- the IP connection unit 23 is a part that connects to the Internet via an IP (Internet protocol) network.
- the IP connection unit 23 is configured, for example, by a router or the like.
- the device connects to the Internet via the IP connection unit 23 and mainly obtains weather information.
- the device may also connect to the Internet via the IP connection unit 23 and obtain map information such as HD maps from a map information providing device (for example, a server) located on the IP network.
- the route information acquisition unit 24 is a component that acquires information in advance when the vehicle 30 changes route. More specifically, the route information acquisition unit 24 is connected to the ECU (Electronic Control Unit: not shown) of the vehicle 30, and acquires information related to the direction of the route change, such as turn signal indication information and steering information, from the ECU (hereinafter sometimes referred to as a "route change instruction").
- the memory unit 19 stores map data including detailed road information. By combining the position information acquired by the position information acquisition unit 18 with the map data, the position of the vehicle 30 on the map can be ascertained.
- the memory unit 19 may be, for example, an HDD (hard disk drive), a ROM (read-only memory), or the like.
- the memory unit 19 may also be a navigation system installed in the vehicle 30, etc.
- the first sensor 14 is a sensor for detecting an object that requires attention on the front side of the vehicle. As described above, the first sensor 14 is disposed inside the headlight 31. In this embodiment, a millimeter-wave radar sensor is used as an example of the first sensor 14, but it is not limited to this; LiDAR (Light Detection and Ranging), a camera, an ultrasonic sensor, or the like may also be used. The detection distance is set to about 200 m as an example. The detection range of the first sensor 14 will be explained with reference to FIG. 3.
- FIG. 3 is a diagram showing the right front of the body 39 of the vehicle 30, and illustrates the first sensor 14R, the second sensor 15R, and the camera 17R together with the headlight 31R.
- the left front of the body 39 is simply the mirror image of the arrangement in FIG. 3, so its explanation is omitted.
- the detection range of the first sensor 14R is in the range of 10° to 120° when expressed in clockwise angles with the front being 0°. In FIG. 3, this range is shown diagrammatically as area A1. However, this is just one example, and may be set appropriately taking into account the relationship with other sensors, etc.
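A sector-membership check corresponding to this detection range might look like the following sketch; the function name and its default parameters are assumptions, with the 10° to 120° clockwise range taken from the example above.

```python
def in_detection_range(angle_deg: float, start_deg: float = 10.0,
                       end_deg: float = 120.0) -> bool:
    """True when a clockwise angle measured from straight ahead (0 deg)
    falls inside the sensor's sector (area A1 in FIG. 3)."""
    return start_deg <= angle_deg % 360.0 <= end_deg
```

Under these assumptions, an object bearing 45° clockwise from straight ahead lies in area A1, while one dead ahead (0°) does not.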
- the "first sensor 14" and the "attention-calling object" are examples of the "detection unit" and "object" according to the present invention, respectively.
- the second sensor 15 is a sensor for detecting obstacles on the side of the vehicle.
- the second sensor 15 is disposed in front of the vehicle 30.
- a millimeter-wave radar sensor is used as an example of the second sensor 15, but it is not limited to this; LiDAR, a camera, an ultrasonic sensor, or the like may also be used.
- the detection distance is approximately 200 m as an example.
- the detection range of the second sensor 15R will be described with reference to Figure 3.
- the detection range of the second sensor 15R is in the range of 45° to 135° when expressed in clockwise angles with the front being 0°. In Figure 3, this range is shown diagrammatically as area A2. However, this is only an example, and may be set appropriately taking into account the relationship with other sensors, etc.
- if the structure detected by the second sensor 15 is within a predetermined distance (for example, within 5 m), the structure is determined to be an obstacle.
- in this embodiment, two radar sensors, the first sensor 14 and the second sensor 15, are arranged, but this is not limiting; one radar sensor may serve both purposes, taking the detection ranges, etc. into consideration.
- the third sensor 16 shown in FIG. 1 is an illuminance sensor that detects the brightness around the vehicle 30 for the purpose of detecting a poor visibility environment.
- the third sensor 16 is arranged on the dashboard directly below the windshield of the vehicle 30 as an example.
- a poor visibility environment refers to a case where the brightness is insufficient in a traffic environment such as night or darkness (hereinafter, sometimes referred to as "night, etc."), or a case where the weather is bad such as thick fog, snow, or rain.
- the driving support display device 10 according to this embodiment is equipped with at least one of a sensor that detects night, etc., and a means for obtaining information on bad weather.
- the third sensor 16 is a sensor that detects night, etc., among poor visibility environments.
- if the illuminance is less than a predetermined value, it is determined that the surroundings of the vehicle 30 are night, etc.
- the predetermined value is, as an example, 100 lx (lux). If it is only necessary to determine whether it is night, the determination may instead be made from the time of day.
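As a minimal sketch of this determination, assuming the 100 lx example value (the function and constant names are not from the patent):

```python
ILLUMINANCE_THRESHOLD_LX = 100.0  # example value given in the description

def is_night(illuminance_lx: float) -> bool:
    """True when ambient illuminance indicates night or darkness
    around the vehicle (illuminance below the example threshold)."""
    return illuminance_lx < ILLUMINANCE_THRESHOLD_LX
```

Note the strict comparison: an illuminance exactly at the threshold is not treated as night, matching "less than a predetermined value" in the text.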
- the "third sensor 16" is an example of an "illuminance acquisition unit" according to the present invention.
- weather information can be obtained from a site that handles weather information by connecting to the Internet via the IP connection unit 23. If the obtained weather information indicates, for example, that a dense fog warning has been issued, that the amount of snowfall exceeds 30 mm/hour, or that the amount of rainfall exceeds 30 mm/hour, it is determined that the environment in which the vehicle 30 is located is a poor visibility environment.
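The example weather thresholds in this paragraph amount to a simple predicate. The function name and parameters below are assumptions of this sketch; only the three conditions (dense fog warning, snowfall over 30 mm/h, rainfall over 30 mm/h) come from the text.

```python
SNOWFALL_LIMIT_MM_PER_H = 30.0  # example threshold from the description
RAINFALL_LIMIT_MM_PER_H = 30.0  # example threshold from the description

def is_poor_visibility_weather(dense_fog_warning: bool,
                               snowfall_mm_per_h: float,
                               rainfall_mm_per_h: float) -> bool:
    """True when the obtained weather information indicates that the
    vehicle is in a poor visibility environment."""
    return (dense_fog_warning
            or snowfall_mm_per_h > SNOWFALL_LIMIT_MM_PER_H
            or rainfall_mm_per_h > RAINFALL_LIMIT_MM_PER_H)
```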
- the "IP connection unit 23" is an example of the "weather acquisition unit" according to the present invention.
- the camera 17 is an imaging means for acquiring an image (actual image) of the front side of the vehicle 30.
- the camera 17 is not particularly limited and may be a normal camera, but in this embodiment, a night vision camera is used.
- the night vision camera here is an infrared monitoring camera, which can ensure clear visibility not only at night but also in rain and thick fog.
- the imaging range of the camera 17R is in the range of 10° to 120° when expressed as a clockwise angle with the front as 0°. In this embodiment, the imaging range of the camera 17 and the detection range of the first sensor 14 are approximately equal. In FIG. 3, this imaging range is diagrammatically shown as area A3. However, this is only an example, and may be set appropriately taking into account the relationship with other sensors, etc.
- a high-sensitivity camera or an object identification system using a gating camera may be used as the camera 17.
- An object identification system using a gating camera is advantageous in bad weather because it identifies objects based on slice images in the depth direction.
- the display unit 20 is a part that displays the assistance information of the driving assistance display device 10.
- various display means such as a HUD, a monitor, or a meter panel can be used as the display unit 20, but a HUD is used in this embodiment.
- the control unit 11 includes a processing unit 12 and a control signal generating unit 13.
- the first sensor 14, the second sensor 15, the third sensor 16, the camera 17, the position information acquiring unit 18, the memory unit 19, the display unit 20, the IP connection unit 23, and the course information acquiring unit 24 are each connected to the control unit 11.
- the control unit 11 mainly acquires information from the first sensor 14, the second sensor 15, the third sensor 16, the camera 17, the position information acquiring unit 18, the memory unit 19, the IP connection unit 23, and the course information acquiring unit 24, and mainly controls the display unit 20, the IP connection unit 23, the memory unit 19, the processing unit 12, and the control signal generating unit 13.
- the control unit 11 may be, for example, a microcomputer including a CPU, ROM, RAM, etc. (not shown).
- the processing unit 12 processes information acquired from the first sensor 14, the second sensor 15, the third sensor 16, the camera 17, the location information acquisition unit 18, the memory unit 19, the IP connection unit 23, and the route information acquisition unit 24.
- the control signal generating unit 13 generates data for displaying an image on the display unit 20 (HUD in this embodiment) based on the data processed by the processing unit 12.
- the processing unit 12 and the control signal generating unit 13 are each realized by software, but this is not limited thereto, and at least a part of them may be realized by hardware such as an ASIC.
- Fig. 4 is a flowchart showing the process flow of a display program describing this display process.
- This display program is stored in a storage means such as a ROM (not shown) as an example, and is read by the CPU, deployed in a RAM, etc., and executed.
- the instruction to start execution can be, for example, when the control unit 11 receives a notification that the engine of the vehicle 30 has started. Alternatively, it can be when the driver enables the display of the HUD.
- the processing unit 12 of the control unit 11 continuously or intermittently obtains the position information of the vehicle 30, which is the vehicle itself, from the position information acquisition unit 18.
- the processing unit 12 continuously or intermittently obtains weather information from the Internet via the IP connection unit 23.
- in step S10, the system waits until an intersection CS is detected.
- Detection of the intersection CS is performed in the following procedure. That is, first, the position of the vehicle 30 (latitude, longitude, etc.) is acquired from GPS information from the position information acquisition unit 18. Next, a map of the surrounding area of the position is acquired from map information stored in the memory unit 19. Then, it is determined whether the vehicle 30 is near the intersection CS. The determination that the vehicle 30 is near the intersection CS may be made, for example, based on whether the position of the vehicle 30 is within 30 m of the center of the intersection CS. If the intersection CS is detected, the system proceeds to step S11.
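The proximity test in this procedure could be sketched as below. The patent does not specify how the distance is computed, so the equirectangular approximation and all names here are assumptions; only the 30 m example threshold comes from the text.

```python
import math

EARTH_RADIUS_M = 6_371_000.0
NEAR_INTERSECTION_M = 30.0  # example threshold from the description

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Approximate ground distance between two nearby points
    (equirectangular approximation, adequate at the ~30 m scale)."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    x = (l2 - l1) * math.cos((p1 + p2) / 2.0)
    y = p2 - p1
    return EARTH_RADIUS_M * math.hypot(x, y)

def near_intersection(vehicle: tuple, center: tuple) -> bool:
    """True when the vehicle's (lat, lon) is within the example
    30 m radius of the intersection center."""
    return distance_m(*vehicle, *center) <= NEAR_INTERSECTION_M
```

For example, a vehicle about 22 m from the intersection center is judged "near," while one more than 100 m away is not.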
- in step S11, the control unit 11 determines whether a route change instruction has been issued.
- the presence or absence of a route change instruction is determined, for example, by acquiring turn signal information from the vehicle's ECU (not shown) via the route information acquisition unit 24. If the determination is positive, the process proceeds to step S12, and if the determination is negative, the process returns to step S10 and intersection detection continues.
- in step S12, the control unit 11 determines whether the first sensor 14 has detected an object requiring attention (another vehicle, a motorcycle, a person, etc.) on the front side of the vehicle 30. If the determination is positive, the process proceeds to step S13, and if the determination is negative, the process returns to step S10 and intersection detection continues.
- in step S13, the control unit 11 controls the control signal generation unit 13 to cause the display unit 20 to display in display mode A.
- FIG. 5(a) shows an example of a traffic environment when this step is applied, and FIG. 5(b) shows an example of the display in the HUD display area 35.
- the HUD image is displayed in the HUD display area 35 of the windshield 34.
- vehicle 30, which is the vehicle itself, is approaching intersection CS and is about to turn right.
- another vehicle, vehicle 32, is stopped on the road that vehicle 30 is about to enter.
- the first sensor 14 detects vehicle 32, an attention-calling object, in area A1 on the front lateral side of vehicle 30.
- FIG. 5(b) shows an example of the display in the HUD display area 35 in this case.
- the control unit 11 symbolizes the vehicle 32 with an icon, and displays the icon image 21 in the HUD display area 35.
- the icon image 21 moves in accordance with the movement of the vehicle 32.
- the icon image 21 is shown as an example of three arcs arranged in a row, but this is not limiting and an appropriate image may be used taking into consideration the visibility of the driver 40, etc.
- in step S14, the control unit 11 determines whether the second sensor 15 has detected an obstacle (a building, a fence, a pillar, etc.) on the side of the vehicle 30. If the determination is positive, the process proceeds to step S15, and if the determination is negative, the process returns to step S10 and intersection detection continues.
- FIG. 6(a) shows an example of a traffic environment when this step is applied, and FIG. 6(b) shows an example of the display in the HUD display area 35.
- the second sensor 15 scans area A2 for obstacles to the side of the vehicle 30.
- the second sensor 15 detects a building 50.
- merely detecting an obstacle does not change the display in the HUD display area 35.
- not only attention-calling objects but also obstacles are detected, because obstacles can block visibility or obstruct driving.
- in step S15, the control unit 11 determines whether multiple attention-calling objects have been detected. If the determination is positive, the process proceeds to step S17, and if the determination is negative, the process proceeds to step S16.
- steps S15 to S17 are the process for switching from display mode A to display mode B when a predetermined environmental condition is met.
- the following two conditions are applied as examples of the predetermined environmental conditions.
- Environmental condition 1: multiple attention-calling objects are detected.
- Environmental condition 2: the vehicle is in an environment with poor visibility.
- that is, step S15 is a judgment of environmental condition 1. Note that, in this embodiment, an example is described in which display mode A is used when one attention-calling object is detected and display mode B is used when multiple attention-calling objects are detected, but the present invention is not limited to this.
- the number of attention-calling objects at which the two modes are switched may be any appropriate number, taking into consideration the specifications of the driving assistance display device 10, etc.
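Combining the two environmental conditions, the judgment made across steps S15 and S16 might look like the following sketch; the function name and the threshold of one object are assumptions reflecting the embodiment's example.

```python
def should_use_detailed_mode(num_attention_objects: int,
                             poor_visibility: bool) -> bool:
    """True when the display should switch to display mode B
    (detailed, actual-image display)."""
    if num_attention_objects > 1:   # environmental condition 1 (step S15)
        return True
    if poor_visibility:             # environmental condition 2 (step S16)
        return True
    return False                    # stay in display mode A (icon display)
```

A single attention-calling object in good visibility keeps the icon display; either multiple objects or poor visibility triggers the actual-image display.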
- in step S17, the control unit 11 controls the display unit 20 to display in display mode B.
- FIG. 7(a) shows an example of a traffic environment when this step is applied, and FIG. 7(b) shows an example of the display in the HUD display area 35.
- the first sensor 14 scans the area A1 for attention-demanding objects in front of the vehicle 30.
- in FIG. 7(a), two other vehicles, vehicles 32 and 33, are detected.
- the control unit 11 switches the display image in the HUD display area 35 from an icon image to the actual image 22 captured by the camera 17.
- the display mode is switched from A to B.
- the image in the HUD display area 35 in this case is as shown in FIG. 7(b).
- the actual images of the vehicles 32 and 33 are displayed in the HUD display area 35.
- a building 50 detected by the second sensor 15 is also shown.
- in step S16, the control unit 11 determines whether a poor visibility environment has been detected as the current traffic environment; that is, it judges environmental condition 2 described above.
- a poor visibility environment is assumed to be nighttime or bad weather.
- if the control unit 11 detects a poor visibility environment, the determination is positive and the process proceeds to step S17, where the display of the HUD display area 35 is switched from display mode A to display mode B. On the other hand, if the determination is negative, the process returns to step S10, and detection of the intersection CS continues.
- FIG. 8(a) shows an example of a traffic environment when this step is applied, and FIG. 8(b) shows an example of the display in the HUD display area 35.
- FIG. 8(a) illustrates an example of a case where the weather is dense fog, and the fine dots shown in FIG. 8(a) represent the fog.
- the first sensor 14 is scanning the area A1 for attention-calling objects on the front lateral side of the vehicle 30, and in FIG. 8(a), another vehicle, vehicle 32, is detected.
- in FIG. 8(b), actual images of the vehicle 32 and building 50 are displayed in the HUD display area 35.
- a night vision camera is used as the camera 17, so the image of the vehicle 32 and the like is clear despite the dense fog.
- in step S18, it is determined whether an end command has been issued. If the determination is positive, the display program is terminated, and if the determination is negative, the process returns to step S10 and detection of the intersection CS continues.
- the command to end the display program may be determined, for example, when the control unit 11 receives information indicating that the engine of the vehicle 30 has been stopped by the driver. Alternatively, it may be determined when the HUD display is disabled by the driver.
- the environment around the vehicle 30 usually changes from moment to moment. In other words, it is expected that switching from display mode A to display mode B and from display mode B to display mode A will occur frequently. Even in such a case, this display process always returns to step S10 and loops until an end instruction is given after the process has started, so that changes in the traffic environment can be reliably tracked. Note that the flowchart shown in FIG. 4 is only an example, and the steps may be rearranged or swapped as long as no inconsistencies arise in the process flow.
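The looping behavior described here, together with steps S10 to S18, can be summarized in a hypothetical sketch. The `dev` helper object and its method names are assumptions of this sketch, standing in for the sensors, acquisition units, and display unit; only the step structure comes from the flowchart description.

```python
def display_loop(dev) -> None:
    """Hypothetical control loop mirroring FIG. 4 (steps S10-S18).
    `dev` is an assumed object exposing the sensors and display as
    simple boolean callables; it is not part of the patent text."""
    while not dev.end_requested():                     # S18: end command?
        if not dev.intersection_detected():            # S10: near intersection CS?
            continue
        if not dev.route_change_instructed():          # S11: turn signal / steering
            continue
        if not dev.attention_object_detected():        # S12: first sensor 14
            continue
        dev.show_mode_a()                              # S13: icon image (mode A)
        if not dev.obstacle_detected():                # S14: second sensor 15
            continue
        if dev.multiple_attention_objects() or dev.poor_visibility():  # S15/S16
            dev.show_mode_b()                          # S17: actual image (mode B)
```

Because every branch loops back to the top, the sketch re-evaluates the environment on each pass, matching the text's point that mode A/B switching can recur until an end instruction arrives.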
- As described above, the present invention can provide a driving assistance display device and a driving assistance display method that display road information, reflecting changes in the traffic environment at intersections with other roads, in a form that is easy for the driver to see.
- The driving assistance display device and driving assistance display method according to the present invention have been described using a lane change at an intersection as an example, but the invention is not limited to this example.
- the same can be applied when entering or exiting a parking lot adjacent to a road.
- traffic environment information may be acquired using at least one type of sensor.
Abstract
The invention concerns a driving assistance display device that can display lane information corresponding to a change in the traffic environment in a region of intersection with another lane, in a form that the driver can easily recognize visually, as well as a driving assistance display method. The driving assistance display device comprises a display unit (20) having two display modes: a first display mode in which objects are displayed as a simplified image in which the objects are schematized, and a second display mode in which the objects are displayed as a detailed image using actual images; the device is characterized by switching between the first display mode and the second display mode when a predetermined condition is satisfied.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023034143A JP2024125967A (ja) | 2023-03-06 | 2023-03-06 | 運転支援表示装置、および運転支援表示方法 |
| JP2023-034143 | 2023-03-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024185680A1 true WO2024185680A1 (fr) | 2024-09-12 |
Family
ID=92675095
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/007743 Pending WO2024185680A1 (fr) | 2023-03-06 | 2024-03-01 | Dispositif d'affichage d'aide à la conduite et procédé d'affichage d'aide à la conduite |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2024125967A (fr) |
| WO (1) | WO2024185680A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011108198A1 (fr) * | 2010-03-03 | 2011-09-09 | 本田技研工業株式会社 | Dispositif de surveillance de zone environnante pour véhicule |
| WO2012172842A1 (fr) * | 2011-06-13 | 2012-12-20 | 本田技研工業株式会社 | Dispositif d'aide à la conduite |
| JP2018074286A (ja) * | 2016-10-26 | 2018-05-10 | 三菱自動車工業株式会社 | 運転支援装置 |
| JP2020078084A (ja) * | 2020-02-03 | 2020-05-21 | 株式会社Jvcケンウッド | 車両用表示制御装置、車両用表示システム、車両用表示制御方法およびプログラム |
- 2023-03-06: JP application filed as JP2023034143A; published as JP2024125967A (status: Pending)
- 2024-03-01: PCT application filed as PCT/JP2024/007743; published as WO2024185680A1 (status: Pending)
Also Published As
| Publication number | Publication date |
|---|---|
| JP2024125967A (ja) | 2024-09-19 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24767048; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |