WO2020105685A1 - Display control device, method, and computer program - Google Patents
Display control device, method, and computer program
- Publication number
- WO2020105685A1 (PCT/JP2019/045494; JP2019045494W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- information
- information image
- displayed
- visibility
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/149—Instrument input by detecting viewing direction not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/177—Augmented reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/191—Highlight information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/347—Optical elements for superposition of display information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/40—Hardware adaptations for dashboards or instruments
- B60K2360/48—Sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/301—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
- B60R2300/308—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene by overlaying the real scene, e.g. through a head-up display on the windscreen
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/70—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by an event-triggered choice to display a specific image among a selection of captured images
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/045—Zooming at least part of an image, i.e. enlarging it or shrinking it
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
Definitions
- The present disclosure relates to a display control device, method, and computer program that are used in a vehicle and that superimpose an image on the foreground of the vehicle so that the image is visually recognized together with the real scene.
- Patent Document 1 discloses a vehicle image display system that relatively enhances the perceptibility of an image not being perceived by the driver by reducing the conspicuousness of the image that coincides with the line of sight of the driver of the vehicle.
- The outline of the present disclosure relates to improving the visibility of an image displayed so as to overlap the real scene in the driver's forward field of view. More specifically, the present disclosure relates to facilitating the transmission of information to the driver while suppressing the visual stimulus of the image displayed over the real scene.
- The display control device described in the present specification displays a first information image whose visibility is reduced when the driver's line of sight is directed at it, and a second information image whose visibility is not reduced, or is reduced to a lesser degree than that of the first information image, when the line of sight is directed at it.
- The degree of change in the visibility of an information image when the line of sight is directed at it may be determined according to the magnitude of the risk potential of the information indicated by that information image.
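As a hedged, illustrative sketch only (not part of the disclosure), the relationship above could be modeled as follows: the attenuation applied to an information image while the line of sight is directed at it shrinks as the risk potential of the underlying information grows. The function name, the linear mapping, and the `max_reduction` value are assumptions for illustration.

```python
def visibility_when_gazed(base_visibility: float, risk_potential: float,
                          max_reduction: float = 0.6) -> float:
    """Return the visibility of an information image while the driver's
    line of sight is directed at it.

    base_visibility : visibility (0..1) before any gaze-based reduction.
    risk_potential  : normalized risk (0..1) of the information shown;
                      higher risk -> smaller reduction, so safety-critical
                      images stay conspicuous even when looked at.
    max_reduction   : largest fraction of visibility removable for
                      zero-risk information (assumed value).
    """
    # Degree of reduction shrinks linearly with risk potential (assumption).
    reduction = max_reduction * (1.0 - risk_potential)
    return base_visibility * (1.0 - reduction)

# A low-risk guidance image dims noticeably when looked at ...
low_risk = visibility_when_gazed(1.0, risk_potential=0.1)
# ... while a high-risk warning image barely changes.
high_risk = visibility_when_gazed(1.0, risk_potential=0.9)
```

Under this sketch, an image carrying maximum risk potential keeps its visibility unchanged even while gazed at, matching the idea that the degree of visibility change depends on the risk of the information shown.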
- FIG. 3 is a block diagram of a vehicular display system according to some embodiments.
- FIG. 6 is a flow diagram of a process for reducing image visibility according to some embodiments.
- FIG. 6 is a flow diagram of a process for increasing image visibility according to some embodiments. The accompanying figures are diagrams showing examples of images displayed by the vehicle display system according to some embodiments.
- FIG. 6 is a flow diagram of a process for reducing image visibility according to some embodiments. The accompanying figures are diagrams showing examples of images displayed by the vehicle display system according to some embodiments.
- the image display unit 11 in the vehicle display system 10 is a head-up display (HUD: Head-Up Display) device provided in the dashboard 5 of the vehicle 1.
- The HUD device emits the display light 11a toward the front windshield 2 (which is an example of a projection target member) and displays the image 200 in the virtual display area 100, so that the image 200 is visually recognized superimposed on the foreground 300, the real space seen through the front windshield 2.
- the image display unit 11 may be a head mounted display (hereinafter, HMD) device.
- The driver 4 wears the HMD device on his or her head and sits in the seat of the host vehicle 1, visually recognizing the displayed image 200 superimposed on the foreground 300 through the front windshield 2 of the host vehicle 1.
- The display area 100 in which the vehicle display system 10 displays the predetermined image 200 is fixed at a specific position referenced to the coordinate system of the host vehicle 1; when the driver 4 turns toward that specific position, the image 200 displayed in the fixed display area 100 can be visually recognized.
- Under the control of the display control device 13, the image display unit 11 can also form visual augmented reality (AR: Augmented Reality) by displaying the image 200 in the vicinity of a real object 310 existing in the foreground 300 (the real space, or real scene, visually recognized through the front windshield 2 of the vehicle 1), such as an obstacle (a pedestrian, bicycle, motorcycle, another vehicle, etc.), a road surface, a road sign, or a feature (a building, bridge, etc.); at a position overlapping the real object 310; or at a position set with the real object 310 as a reference (each of which is an example of the positional relationship between the image and the real object).
- the image display unit 11 displays a first information image 210 (AR image) and a second information image 220 (AR image) that differ according to the type of information to be provided (described in detail later).
- FIG. 2 is a block diagram of a vehicle display system 10 according to some embodiments.
- the vehicle display system 10 includes an image display unit 11 and a display control device 13 that controls the image display unit 11.
- The display control device 13 includes one or more I/O interfaces 14, one or more processors 16, one or more storage units 18, and one or more image processing circuits 20.
- the various functional blocks depicted in FIG. 2 may be implemented in hardware, software, or a combination of both.
- FIG. 2 is only one embodiment of an implementation, and the illustrated components may be combined into fewer components or there may be additional components.
- The processor 16 and the image processing circuit 20 are operably connected to the storage unit 18. More specifically, the processor 16 and the image processing circuit 20 can perform operations of the vehicle display system 10, such as generating and/or transmitting image data, by executing the program stored in the storage unit 18.
- The processor 16 and/or the image processing circuit 20 may include at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
- The storage unit 18 includes any type of magnetic medium such as a hard disk, any type of optical medium such as a CD or DVD, and any type of semiconductor memory, both volatile and non-volatile. Volatile memory may include DRAM and SRAM, and non-volatile memory may include ROM and NVROM.
- the processor 16 is operably connected to the I / O interface 14.
- The I/O interface 14 can include a wireless communication interface for connecting the vehicle display system 10 to a personal area network (PAN) such as a Bluetooth (registered trademark) network, to a local area network (LAN) such as an 802.11x Wi-Fi (registered trademark) network, and/or to a wide area network (WAN) such as a 4G or LTE cellular network.
- The I/O interface 14 can also include a wired communication interface such as, for example, a USB port, a serial port, a parallel port, an OBDII port, and/or any other suitable wired communication port.
- The processor 16 is interoperably connected to the I/O interface 14 so that information can be exchanged with the various other electronic devices connected to the vehicle display system 10 (I/O interface 14).
- The I/O interface 14 is operably connected to, for example, a vehicle ECU 401 provided in the host vehicle 1, a road information database 403, a host vehicle position detection unit 405, a vehicle exterior sensor 407, a line-of-sight direction detection unit 409, an eye position detection unit 411, a portable information terminal 413, a communication connection device 420 outside the vehicle, and the like.
- the image display unit 11 is operably connected to the processor 16 and the image processing circuit 20.
- the image displayed by the image display unit 11 may be based on the image data received from the processor 16 and / or the image processing circuit 20.
- the processor 16 and the image processing circuit 20 control the image displayed by the image display unit 11 based on the information obtained from the I / O interface 14.
- the I / O interface 14 may include a function of processing (converting, calculating, analyzing) information received from another electronic device or the like connected to the vehicle display system 10.
- The host vehicle 1 includes the vehicle ECU 401, which detects the state of the host vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, and various warning states).
- The vehicle ECU 401 controls each unit of the host vehicle 1 and can transmit, for example, vehicle speed information indicating the current vehicle speed of the host vehicle 1 to the processor 16. It should be noted that, in addition to or instead of simply sending the data detected by a sensor to the processor 16, the vehicle ECU 401 can send the determination result and/or analysis result of that data; for example, information indicating whether the host vehicle 1 is traveling at a low speed or is stopped may be transmitted to the processor 16.
- The vehicle ECU 401 may transmit to the I/O interface 14 an instruction signal instructing display of the image 200 by the vehicle display system 10; at this time, the coordinates of the image 200, the notification necessity degree of the image 200, and/or necessity-degree-related information serving as a basis for determining the notification necessity degree may be added to the instruction signal.
- The host vehicle 1 may include a road information database 403, such as one provided in a navigation system.
- Based on the position of the host vehicle 1 acquired from the host vehicle position detection unit 405 described later, the road information database 403 may read out road information around the vehicle (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, etc.) and feature information (the presence/absence, position (including the distance to the host vehicle 1), direction, shape, type, detailed information, etc. of buildings, bridges, rivers, and the like) and transmit them to the processor 16. Further, the road information database 403 may calculate an appropriate route from the starting point to the destination and send it to the processor 16 as navigation information.
- the host vehicle 1 may include a host vehicle position detection unit 405 such as a GNSS (Global Navigation Satellite System).
- The road information database 403, the portable information terminal 413 described later, and/or the external communication connection device 420 may acquire the position information of the host vehicle 1 from the host vehicle position detection unit 405 continuously, intermittently, or at every predetermined event, and may select/generate information around the host vehicle 1 and transmit it to the processor 16.
- the host vehicle 1 may include one or more vehicle exterior sensors 407 that detect a real object existing in the vicinity of the host vehicle 1 (particularly the foreground 300 in this embodiment).
- the real object detected by the vehicle exterior sensor 407 includes, for example, a pedestrian, a bicycle, a motorcycle, another vehicle (a preceding vehicle, etc.), a road surface, a marking line, a roadside object, and / or a feature (a building etc.).
- The vehicle exterior sensor 407 may include, for example, a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, and/or a camera sensor consisting of a camera and an image processing device; it may be configured as a combination of the radar sensor and the camera sensor, or with only one of them.
- The one or more vehicle exterior sensors 407 detect a real object in front of the host vehicle 1 in each detection cycle of the sensor and transmit real object related information (information such as the presence or absence, position, size, and/or type of each real object) to the processor 16.
- the real object related information may be transmitted to the processor 16 via another device (for example, the vehicle ECU 401).
- When a camera is used as the sensor, an infrared camera or a near-infrared camera is desirable so that a real object can be detected even when the surroundings are dark, such as at night, and a stereo camera that can acquire distance and the like by parallax is desirable.
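As an illustrative sketch only (the data fields and the fusion rule are assumptions, not the specification's), the real object related information produced each detection cycle could be modeled as a small record combining the radar's range measurement with the camera's classification, the two sensor types mentioned above:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class RealObjectInfo:
    """Real object related information sent to the processor 16 each
    detection cycle (field names assumed for illustration)."""
    kind: str                           # e.g. "pedestrian", "vehicle", "sign"
    position_m: Tuple[float, float]     # (lateral, longitudinal) position, metres
    size_m: Tuple[float, float]         # (width, height), metres
    distance_m: Optional[float] = None  # from radar or stereo-camera parallax

def fuse(radar_distance_m: float, camera_kind: str,
         camera_position_m: Tuple[float, float],
         camera_size_m: Tuple[float, float]) -> RealObjectInfo:
    # One possible combination of radar sensor and camera sensor output:
    # the radar supplies an accurate range, the camera supplies the type.
    return RealObjectInfo(kind=camera_kind, position_m=camera_position_m,
                          size_m=camera_size_m, distance_m=radar_distance_m)

# A pedestrian detected 23.5 m ahead, slightly to the right.
obj = fuse(23.5, "pedestrian", (1.2, 23.5), (0.5, 1.7))
```

Such a record could then be passed to the processor 16 directly or, as the text notes, via another device such as the vehicle ECU 401.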
- The host vehicle 1 may include a line-of-sight direction detection unit 409 including an infrared camera that captures the face of the driver 4 in order to detect the gaze direction of the driver 4 (hereinafter also referred to as the "line-of-sight direction").
- The processor 16 can specify the line-of-sight direction of the driver 4 by acquiring an image captured by the infrared camera (an example of information from which the line-of-sight direction can be estimated) and analyzing the captured image. Note that the processor 16 may instead acquire from the I/O interface 14 the line-of-sight direction of the driver 4 specified by the line-of-sight direction detection unit 409 (or another analysis unit) from the image captured by the infrared camera.
- The method for acquiring the line-of-sight direction of the driver 4 of the vehicle 1, or information from which the line-of-sight direction of the driver 4 can be estimated, is not limited to these; it may be acquired using the EOG (electro-oculogram) method, the corneal reflex method, the scleral reflection method, the Purkinje image detection method, the search coil method, the infrared fundus camera method, or other known gaze direction detection (estimation) techniques.
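A minimal sketch of how a detected line-of-sight direction could be tested against an image's display position, as needed to decide whether visibility should change; the angular-window model and the tolerance value are assumptions for illustration, not the specification's method:

```python
import math

def gaze_hits_image(gaze_yaw_deg: float, gaze_pitch_deg: float,
                    image_yaw_deg: float, image_pitch_deg: float,
                    half_angle_deg: float = 2.0) -> bool:
    """Return True when the driver's line of sight falls within a small
    angular window around an image's center, as seen from the eye point.

    Angles are yaw/pitch of the gaze ray and of the image center in the
    same reference frame; half_angle_deg is an assumed tolerance.
    """
    d_yaw = gaze_yaw_deg - image_yaw_deg
    d_pitch = gaze_pitch_deg - image_pitch_deg
    # Angular distance between gaze ray and image center (small-angle approx.).
    return math.hypot(d_yaw, d_pitch) <= half_angle_deg

# Gaze 0.5 degrees off the image center counts as "directed at" the image;
# gaze 5 degrees away does not.
```

In this sketch, a True result for a first information image would trigger the visibility reduction described earlier, while a second information image would keep its visibility (or lose less of it).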
- the host vehicle 1 may include an eye position detection unit 411 including an infrared camera that detects the position of the eyes of the driver 4.
- The processor 16 can specify the eye position of the driver 4 by acquiring an image captured by the infrared camera (an example of information from which the eye position can be estimated) and analyzing the captured image.
- the processor 16 may acquire the information on the position of the eyes of the driver 4 identified from the image captured by the infrared camera from the I / O interface 14.
- the method of acquiring the eye position of the driver 4 of the host vehicle 1, or information from which the eye position of the driver 4 can be estimated, is not limited to these; the eye position may be acquired using known eye-position detection (estimation) techniques.
- the processor 16 may adjust at least the position of the image 200 based on the detected eye position so that the viewer (driver 4) visually recognizes the image 200 superimposed on the desired position of the foreground 300.
- the mobile information terminal 413 is a smartphone, a laptop computer, a smart watch, or other information device that can be carried by the driver 4 (or another occupant of the vehicle 1).
- the I/O interface 14 can communicate with the mobile information terminal 413 by pairing with it, and can acquire data recorded in the mobile information terminal 413 (or on a server accessible via the mobile information terminal).
- the mobile information terminal 413 may have, for example, the same functions as the road information database 403 and the vehicle position detection unit 405 described above, acquire the road information (an example of real-object-related information), and transmit it to the processor 16.
- the mobile information terminal 413 may also acquire commercial information (an example of real object-related information) related to a commercial facility in the vicinity of the vehicle 1, and transmit the commercial information to the processor 16.
- the mobile information terminal 413 may transmit schedule information of its owner (for example, the driver 4), incoming-call information at the mobile information terminal 413, mail reception information, and the like to the processor 16, and the processor 16 and/or the image processing circuit 20 may generate and/or transmit image data regarding these.
- the outside-vehicle communication connection device 420 is a communication device for exchanging information with the host vehicle 1; it includes, for example, another vehicle connected to the host vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), a pedestrian (a portable information terminal carried by a pedestrian) connected by pedestrian-to-vehicle communication (V2P: Vehicle To Pedestrian), and a network communication device connected by road-to-vehicle communication (V2I: Vehicle To road Infrastructure), and in a broad sense includes everything connected by V2X (Vehicle To Everything).
- the outside-vehicle communication connection device 420 may acquire the positions of, for example, a pedestrian, a bicycle, a motorcycle, another vehicle (such as a preceding vehicle), a road surface, a marking line, a roadside object, and/or a feature (such as a building), and transmit them to the processor 16.
- the outside-vehicle communication connection device 420 may have the same function as the own-vehicle position detection unit 405 described above, acquiring the position information of the host vehicle 1 and transmitting it to the processor 16; it may further have the function of the road information database 403 described above, acquiring the road information (an example of real-object-related information) and transmitting it to the processor 16.
- the information acquired from the vehicle exterior communication connection device 420 is not limited to the above.
- the software components stored in the storage unit 18 include the real-object-related information detection module 502, the notification necessity detection module 504, the image type determination module 506, the image position determination module 508, the image size determination module 510, the eye position detection module 512, the visual recognition detection module 514, the behavior determination module 516, and the graphic module 518.
- the real object related information detection module 502 detects the position and size of the real object existing in the foreground 300 of the own vehicle 1, which is the basis for determining the coordinates and size of the image 200 described later.
- the real-object-related information detection module 502 may use, for example, the road information database 403, the vehicle exterior sensor 407, or the outside-vehicle communication connection device 420 to acquire the position of a real object 310 existing in the foreground 300 of the host vehicle 1 (for example, the road surface 311, a preceding vehicle 312, a pedestrian 313, or a building 314 shown in FIG. 5A), that is, its position in the height direction (up-down direction) and the left-right direction (horizontal direction) as the driver 4 views the traveling direction (front) of the host vehicle 1 from the driver's seat, to which the position in the depth direction (forward direction) may be added, and may also acquire the size of the real object 310 (its size in the height and left-right directions).
- the real-object-related information detection module 502 may determine whether the environment is one in which detection accuracy decreases (bad weather such as rain, fog, or snow) when the vehicle exterior sensor 407 detects the position or size of a real object. For example, it may calculate the position detection accuracy of the real object and transmit to the processor 16 the determination result of the degree of decrease in detection accuracy, the position where the detection accuracy decreased, and the environment (bad weather) in which the detection accuracy decreased.
- bad weather may be determined by acquiring weather information for the position where the host vehicle 1 is traveling from the mobile information terminal 413, the outside-vehicle communication connection device 420, or the like.
- the real-object-related information detection module 502 may also detect information about the real object existing in the foreground 300 of the host vehicle 1 (real-object-related information), which is the basis for determining the content of the image 200 described below (hereinafter also referred to as the "image type" as appropriate).
- the real-object-related information is, for example, type information indicating the type of the real object, such as a pedestrian or another vehicle; moving-direction information indicating the moving direction of the real object; distance/time information indicating the distance to the real object or the arrival time; or individual detailed information about the real object, such as the fee of a parking lot (a real object) (but is not limited to these).
- the real-object-related information detection module 502 may acquire the type information, the distance/time information, and/or the individual detailed information from the road information database 403 or the mobile information terminal 413; acquire the type information, the moving-direction information, and/or the distance/time information from the vehicle exterior sensor 407; and acquire the type information, the moving-direction information, the distance/time information, and/or the individual detailed information from the outside-vehicle communication connection device 420.
- the notification necessity detection module 504 detects the degree of necessity (notification necessity) of notifying the driver 4 of the real-object position information and the real-object-related information detected by the real-object-related information detection module 502.
- the notification necessity degree detection module 504 may detect the notification necessity degree from various other electronic devices connected to the I / O interface 14.
- an electronic device connected to the I/O interface 14 in FIG. 2 may transmit information to the vehicle ECU 401, and the notification necessity detection module 504 may detect (acquire) the notification necessity determined by the vehicle ECU 401 based on the received information.
- the "notification necessity" may be determined based on, for example, a danger level derived from the seriousness of what could occur, an urgency level derived from the length of the reaction time required to take a responsive action, an effectiveness level derived from the situation of the host vehicle 1 or the driver 4 (or other occupants of the host vehicle 1), or a combination of these (the indicators of the notification necessity are not limited to these).
- the notification necessity detection module 504 may detect the necessity degree related information that is the basis for estimating the notification necessity degree of the image 200, and estimate the notification necessity degree from this.
- the necessity-related information that is the basis for estimating the notification necessity of the image 200 may be estimated, for example, from the position and type of a real object or from traffic regulations (an example of road information), or it may be estimated based on, or in consideration of, other information input from the various electronic devices connected to the I/O interface 14 described above.
- the vehicle display system 10 need not have the function of estimating the notification necessity; part or all of that function may be provided separately from the display control device 13 of the vehicle display system 10.
- the image type determination module 506 can determine the type of the image 200 to be displayed for the real object based on, for example, the type and position of the real object detected by the real-object-related information detection module 502, the type and number of pieces of real-object-related information detected by that module, and/or the degree of notification necessity detected by the notification necessity detection module 504. Further, the image type determination module 506 may increase or decrease the number of types of the image 200 to be displayed according to the determination result of the visual recognition detection module 514 described later. Specifically, when the real object 310 is in a state where it is difficult for the driver 4 to visually recognize it, the number of types of the image 200 visually recognized by the driver 4 in the vicinity of the real object may be increased.
- the image position determination module 508 determines the coordinates of the image 200 (the left-right direction (X-axis direction) and the up-down direction (Y-axis direction) as the driver 4 views the display area 100 from the driver's seat of the host vehicle 1) based on the position of the real object detected by the real-object-related information detection module 502.
- the image position determination module 508 determines the coordinates of the image 200 so that it has a predetermined positional relationship with a specific real object. For example, the left-right and up-down positions of the image 200 are determined so that the center of the image 200 is visually recognized overlapping the center of the real object.
- the image position determination module 508 can also determine the coordinates of the image 200 so that it has a predetermined positional relationship with reference to a real object to which it is not directly related. For example, as shown in FIG. 5A, the coordinates of the first FCW image 221 described later, which relates to the preceding vehicle 312 (an example of a specific real object), may be determined (or corrected) with reference to the left lane marking 311a (an example of a real object) and the right lane marking 311b (an example of a real object) of the lane (road surface 310) in which the host vehicle 1 is traveling.
- the "predetermined positional relationship" can be adjusted depending on the situation of the real object or the host vehicle 1, the type of the real object, the type of the displayed image, and the like.
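As a loose illustration of the placement logic above (not part of the claimed embodiment), the center-overlap rule and a clamp to the display area 100 might be sketched as follows; the coordinate convention, the offset parameter, and the clamp are assumptions:

```python
def determine_image_position(obj_center, offset=(0.0, 0.0)):
    """Place the image 200 so that its center is visually recognized
    overlapping the center of the real object, shifted by a
    predetermined offset (the 'predetermined positional relationship')."""
    ox, oy = obj_center  # real-object center, display-area coordinates
    dx, dy = offset      # predetermined offset (assumed)
    return (ox + dx, oy + dy)

def clamp_to_display_area(pos, area):
    """Keep the determined coordinates inside the display area 100.
    `area` is (x_min, y_min, x_max, y_max), an assumed representation."""
    x, y = pos
    x_min, y_min, x_max, y_max = area
    return (min(max(x, x_min), x_max), min(max(y, y_min), y_max))
```

Adjusting the positional relationship by situation, as the description allows, would amount to selecting a different `offset` per real-object type or image type.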
- the image size determination module 510 can determine the size of the image 200 displayed in association with a real object based on the type and position of the real object detected by the real-object-related information detection module 502, the type and number of pieces of real-object-related information detected by that module, and/or the degree of notification necessity detected by the notification necessity detection module 504. The image size determination module 510 can also change the image size according to the number of types of the image 200; for example, the image size may be reduced as the number of types of the image 200 increases.
- the eye position detection module 512 detects the position of the eyes of the driver 4 of the vehicle 1.
- the eye position detection module 512 includes various software components for determining in which of a plurality of height regions the eye height of the driver 4 lies, detecting the eye height of the driver 4 (position in the Y-axis direction), detecting the eye height and depth-direction position of the driver 4 (positions in the Y- and Z-axis directions), and/or detecting the eye position of the driver 4 (positions in the X-, Y-, and Z-axis directions), and for performing various operations associated with these.
- the eye position detection module 512 can acquire the eye position of the driver 4 from the eye position detection unit 411, or can receive, from the eye position detection unit 411, information from which the eye position including the eye height of the driver 4 can be estimated, and estimate the eye position including the eye height of the driver 4.
- the information from which the eye position can be estimated may include, for example, the position of the driver's seat of the host vehicle 1, the position of the face of the driver 4, the sitting height, and values input by the driver 4 on an operation unit (not shown).
- the visual recognition detection module 514 detects whether the driver 4 of the vehicle 1 visually recognizes the predetermined image 200.
- the visual recognition detection module 514 includes various software components for executing various operations regarding whether the driver 4 visually recognizes a predetermined image 200 and whether the driver 4 visually recognizes the periphery (vicinity) of the predetermined image 200.
- the visual recognition detection module 514 may compare the gaze position GZ of the driver 4 (described later) acquired from the line-of-sight direction detection unit 409 with the position of the image 200 acquired from the graphic module 518, determine whether the driver 4 has visually recognized the image 200, and transmit to the processor 16 the determination result and information identifying the visually recognized image 200.
- to determine whether the driver 4 visually recognizes the periphery (vicinity) of a predetermined image 200, the visual recognition detection module 514 may set, as the periphery of the image 200, a region of a preset predetermined width extending outward from the outer edge of the image 200, and may determine that the driver 4 of the host vehicle 1 has visually recognized the predetermined image 200 when the gaze position GZ (described later) enters that periphery. The visual recognition determination is not limited to these means.
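The periphery test described above can be sketched as a padded bounding-box check; the center-and-size representation of the image bounds and the coordinate units are assumptions for illustration only:

```python
def gaze_hits_periphery(gz, image_bounds, margin=0.0):
    """Return True when the gaze position GZ falls inside the image 200
    or inside the region of predetermined width (`margin`) extending
    outward from its outer edge (the image's 'periphery')."""
    gx, gy = gz
    cx, cy, w, h = image_bounds  # image center and size, display-area coordinates
    return abs(gx - cx) <= w / 2 + margin and abs(gy - cy) <= h / 2 + margin
```

With `margin=0.0` this reduces to the plain "gaze on the image" test of the preceding paragraph.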
- the visual recognition detection module 514 may also detect what the driver 4 is visually recognizing other than the image 200. For example, it may compare the position of the real object 310 existing in the foreground 300 of the host vehicle 1 detected by the real-object-related information detection module 502 with the gaze position GZ of the driver 4 (described later) acquired from the line-of-sight direction detection unit 409, and transmit to the processor 16 information identifying the real object 310 that is being gazed at (visually recognized).
- the behavior determination module 516 detects behavior of the driver 4 that is not appropriate for the information shown in the first information image 210 described later. For some of the first information images 210, the corresponding inappropriate behaviors of the driver 4 are stored in the storage unit 18 in association with them. The behavior determination module 516 determines, in particular, whether an inappropriate behavior of the driver 4 associated with a first information image 210 whose visibility has been reduced is detected. For example, when the first information image 210 includes route guidance information, it may be determined that an inappropriate behavior of the driver 4 has been detected if the driver 4 is gazing at a branch road different from the direction indicated by the route guidance.
- the first information image 210 includes traffic regulation information
- the information collected by the behavior determination module 516 to determine behavior includes the state of the host vehicle 1 input from the vehicle ECU 401 (for example, mileage, vehicle speed, accelerator pedal opening, engine throttle opening, injector fuel injection amount, engine rotation speed, motor rotation speed, steering angle, shift position, drive mode, and various warning states) and the line-of-sight direction input from the line-of-sight direction detection unit 409, but is not limited to these.
- the graphic module 518 includes various known software components for changing the visual effect (for example, brightness, transparency, saturation, contrast, or other visual characteristics), size, display position, and display distance (the distance from the driver 4 to the image 200) of the displayed image 200.
- the graphic module 518 displays the image 200 so that the driver 4 can visually recognize it with the type set by the image type determination module 506, at the coordinates set by the image position determination module 508 (including at least the left-right direction (X-axis direction) and the up-down direction (Y-axis direction) as the driver 4 views the display area 100 from the driver's seat of the host vehicle 1), and in the image size set by the image size determination module 510.
- the graphic module 518 displays at least a first information image 210 and a second information image 220, which are augmented reality images (AR images) arranged so as to have a predetermined positional relationship with the real object 310 in the foreground 300 of the host vehicle 1.
- the first information image 210 has its visibility reduced (including being hidden) when it is determined to be visually recognized.
- the second information image 220 does not have its visibility reduced in the way the first information image 210 does when it is determined to be visually recognized (this includes reducing its visibility to a lesser degree than the degree of visibility reduction of the first information image 210, not changing its visibility, or increasing its visibility).
- reducing visibility may include reducing brightness, increasing transparency, decreasing saturation, reducing contrast, reducing size, reducing the number of image types, combinations of these, or combinations of these with other elements.
- increasing visibility may include increasing brightness, decreasing transparency, increasing saturation, increasing contrast, increasing size, increasing the number of image types, combinations of these, or combinations of these with other elements.
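The enumerated operations can be combined into a single adjustment step. The sketch below assumes each visual characteristic is normalized to [0, 1] and uses an illustrative factor of 0.5; both are assumptions, not values from the description:

```python
def reduce_visibility(style, factor=0.5):
    """Apply several of the enumerated operations at once: lower
    brightness, saturation, contrast, and size; raise transparency.
    `factor` (assumed) sets how strongly visibility is reduced."""
    out = dict(style)  # do not mutate the caller's style
    for key in ("brightness", "saturation", "contrast", "size"):
        out[key] = style[key] * factor                              # reduce
    out["transparency"] = 1.0 - (1.0 - style["transparency"]) * factor  # increase
    return out
```

Increasing visibility would be the inverse mapping over the same characteristics.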
- the first information image 210 shows information with a relatively low danger level derived from the seriousness of what could occur: for example, an arrow image displayed on the road surface to indicate a route (an example of navigation information), a text image indicating the destination (an example of navigation information), an image showing the distance to the next turning point (an example of navigation information), a POI (Point of Interest) image showing feature information about a store or facility existing in the foreground 300, images related to road signs (guidance signs, warning signs, regulation signs, instruction signs, auxiliary signs), and an image, displayed on the road surface, of the inter-vehicle distance set for ACC (Adaptive Cruise Control) when following a preceding vehicle.
- the second information image 220 shows information with a relatively high danger level derived from the seriousness of what could occur: for example, a forward collision warning (FCW: Forward Collision Warning) image visually recognized near an obstacle existing in the foreground 300 of the host vehicle 1.
- FIG. 3 is a flow diagram of a process for reducing the visibility of an image according to some embodiments.
- the line-of-sight direction (gaze position GZ described later) of the driver 4 of the vehicle 1 is acquired (step S11), and the position of the displayed image 200 is acquired (step S12).
- the processor 16 identifies the target visually recognized by the driver 4 by comparing the line-of-sight direction acquired in step S11 with the position of the image 200 acquired in step S12 (step S13). Specifically, the processor 16 determines whether the first information image 210 is visually recognized, whether the second information image 220 is visually recognized, or whether neither the first information image 210 nor the second information image 220 is visually recognized. If the first information image 210 is visually recognized, the processor 16 identifies which first information image 210 is visually recognized.
- when it is determined in step S13 that the driver 4 visually recognizes the first information image 210, the processor 16 reduces the visibility of the visually recognized first information image 210 (step S14). At this time, the processor 16 may hide the first information image 210. Further, when it is determined in step S13 that the driver 4 visually recognizes the second information image 220, the processor 16 does not reduce the visibility of the second information image 220, or reduces it to a lesser degree than the degree of visibility reduction of the first information image 210.
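Steps S11 to S14 can be sketched as one update pass. The dictionary image records, the bounding-box hit test, and the `second_factor` knob are illustrative assumptions, not the claimed implementation:

```python
def run_fig3_pass(gaze, images, second_factor=1.0):
    """S13: identify which image the gaze falls on; S14: hide a viewed
    first information image, while a viewed second information image
    keeps its visibility (second_factor=1.0) or loses less of it."""
    def viewed(img):
        gx, gy = gaze
        cx, cy, w, h = img["bounds"]
        return abs(gx - cx) <= w / 2 and abs(gy - cy) <= h / 2

    for img in images:
        if not viewed(img):
            continue
        if img["kind"] == "first":
            img["visibility"] = 0.0             # non-display, an example of reduced visibility
        elif img["kind"] == "second":
            img["visibility"] *= second_factor  # reduced less than the first image, or unchanged
    return images
```

In practice this pass would run per frame with the gaze position GZ from the line-of-sight direction detection unit 409.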
- FIG. 4 is a flow diagram of a process of increasing the visibility of the first information image according to some embodiments.
- the processor 16 acquires the behavior of the driver 4 (step S21), detects behavior of the driver 4 that is not appropriate for the information indicated by the first information image 210 whose visibility was reduced in step S14 (step S22), and, when it determines that an inappropriate behavior of the driver 4 has occurred, increases the visibility of the first information image 210 whose visibility had been reduced (step S23).
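Steps S21 to S23 can be sketched in the same style. The image records, the behavior labels (echoing the branch-road example in the description), and the stored-rules mapping are hypothetical:

```python
def run_fig4_pass(behavior, images, inappropriate):
    """S22: check whether the observed behavior is one stored as
    inappropriate for a visibility-reduced first information image;
    S23: if so, raise that image's visibility again."""
    for img in images:
        if img["kind"] != "first" or img["visibility"] >= 1.0:
            continue  # only visibility-reduced first information images
        if behavior in inappropriate.get(img["id"], ()):
            img["visibility"] = 1.0  # restore, e.g. FIG. 5B back to FIG. 5A
    return images
```

The `inappropriate` mapping plays the role of the associations stored in the storage unit 18.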
- the operations of the processes described above can be performed by executing one or more functional modules of an information processing device such as a general-purpose processor or an application-specific chip. All of these modules, combinations of these modules, and/or combinations with general-purpose hardware capable of substituting for their functions are included in the scope of protection of the present invention.
- the functional blocks of the vehicular display system 10 are optionally implemented by hardware, software, or a combination of hardware and software to implement the principles of the various described embodiments.
- the functional blocks described in FIG. 2 may optionally be combined or one functional block may be separated into two or more sub-blocks to implement the principles of the described embodiments.
- the description herein optionally supports any possible combination or division of the functional blocks described herein.
- the first information image 210 includes a navigation image 211 that is visually recognized superimposed on the road surface 311 and indicates a guide route, and a POI image 212 that is an illustration with the notation "P" indicating a parking lot, pointing at the building 314 (real object 310).
- the second information image 220 includes a first FCW image 221 visually recognized linearly (in a line) on the road surface 311 behind the preceding vehicle 312 traveling ahead of the host vehicle 1, and a second FCW image 222 visually recognized in an arc on the road surface 311 around the pedestrian 313 walking on the sidewalk on the opposite-lane side of the traveling lane of the host vehicle 1.
- third information images, which are not augmented reality images (AR images) arranged so as to have a predetermined positional relationship with the real object 310 in the foreground 300 of the host vehicle 1, are also displayed: a road information image 231 that is an illustration including the notation "80" indicating the speed limit, and a speed image 232 displayed as "35 km/h" indicating the speed of the host vehicle 1.
- the display area 100 includes a first display area 110 and a second display area 120 arranged vertically below the first display area 110 (in the Y-axis negative direction) when the front is viewed from the driver's seat of the host vehicle 1.
- the first information image 210 and the second information image 220, which are AR images, are displayed in the first display area 110, and the third information images 231 and 232 are displayed in the second display area 120.
- when the processor 16 executes the instruction of step S14 in FIG. 3, the visually recognized navigation image 211 (first information image 210) is hidden (an example of reduced visibility), as shown in FIG. 5B.
- instead of the navigation image 211, which is an augmented reality image (AR image) whose displayed position is related to the position of the real object 310, a first navigation image 213 (an example of a related image) and a second navigation image 214 (an example of a related image), which are non-AR images whose displayed positions are not related to the position of the real object 310, are displayed in the second display area 120.
- the first navigation image 213 is a simplified image showing the approximate direction of the next fork.
- the second navigation image 214 is text reading "200 m ahead", indicating the distance to the next branch road.
- the processor 16 increases the visibility of the navigation image 211, which is the first information image 210 that has been reduced in visibility. That is, the state shown in FIG. 5B is changed to the state shown in FIG. 5A. This allows the driver 4 to recognize that the vehicle should not turn at the nearest intersection.
- FIG. 6 is a flow diagram of a process for reducing the visibility of an image according to some embodiments.
- the flow diagram of FIG. 6 corresponds to the flow diagram of FIG. 3, and steps S31, S32, and S33 of FIG. 6 correspond to steps S11, S12, and S13 of FIG. 3, respectively.
- step S34, which differs from FIG. 3, will be described.
- when it is determined in step S33 that the driver 4 visually recognizes the first information image 210, the processor 16 reduces the visibility of the visually recognized first information image 210. Further, when it is determined in step S33 that the driver 4 visually recognizes the second information image 220, the processor 16 may change the visually recognized second information image 220 from a still image to a moving image, or may change the visually recognized second information image 220 and other nearby second information images 220 from still images to moving images.
- the processor 16 may also continuously change the number of the visually recognized second information image 220 and other nearby second information images 220. Specifically, when the gaze position GZ is in the vicinity of the first FCW image 221, which is a second information image 220, the processor 16 may change the visually recognized first FCW image 221 and the nearby second FCW image 222 continuously and/or intermittently between a state in which the number of images is small and the state, shown in FIG. 7B, in which the number of images is large.
- the moving image is not particularly limited; for example, the shape of the image may change repeatedly and/or intermittently, the number of images may change repeatedly, the position of the image may change repeatedly, the image may blink repeatedly, or the size may change repeatedly.
- as described above, the display control device 13 of the present embodiment controls the image display unit 11 (12) that displays the image 200 in an area overlapping the foreground 300 as viewed from the driver 4 of the host vehicle 1, and includes one or more I/O interfaces 14, one or more processors 16, a memory 18, and one or more computer programs stored in the memory 18 and configured to be executed by the one or more processors 16. The one or more processors 16 execute instructions to acquire the line-of-sight direction of the driver 4 from the one or more I/O interfaces 14 and, based on the line-of-sight direction, to display a first information image 210 whose visibility is reduced when it is determined to be visually recognized and a second information image 220 whose visibility is not reduced, compared with the first information image 210, when it is determined to be visually recognized.
- in this way, the change in visibility after visual recognition differs depending on the type of image: for the first information image, the visibility is reduced, making it easier to see the real scene ahead in the driver's field of view; for the second information image, the visibility is not reduced as much, so the image remains easy to see even after being visually recognized.
- the degree of change in the visibility of the second information image 220 when the line of sight is directed at it may be determined according to the magnitude of the risk potential of the information indicated by the second information image 220.
- the higher the risk potential of the information indicated by the second information image 220, the smaller the processor 16 may make the degree of reduction in visibility when the line of sight is directed at it. That is, if the risk potential is low, the visibility of the second information image 220 when the line of sight is directed at it is greatly reduced (the degree of reduction is increased).
- the processor 16 may change the degree of reduction in visibility according to a risk potential predetermined according to the type of the second information image 220, or according to a risk potential calculated from information obtained from the I/O interface 14 while the second information image 220 is being displayed.
- the processor 16 may monitor the risk potential of the information indicated by the second information image 220 for a predetermined period and, if the risk potential does not exceed a predetermined threshold value, does not increase significantly, or decreases, determine that the visibility need not be maintained as it is and reduce the visibility of the second information image 220.
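The inverse relationship between risk potential and reduction degree described above could be expressed as a simple monotone mapping; the normalization of the risk potential to [0, 1] and the 0.8 cap are assumptions for illustration:

```python
def reduction_degree(risk_potential, max_reduction=0.8):
    """Degree by which a viewed second information image's visibility is
    reduced: the higher the risk potential, the smaller the reduction;
    at the maximum risk potential the visibility is left unchanged."""
    r = min(max(risk_potential, 0.0), 1.0)  # clamp to the assumed [0, 1] range
    return max_reduction * (1.0 - r)
```

A predetermined per-type risk potential or one calculated from I/O interface 14 information would both feed the same mapping.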
- the second information image 220 may be displayed as a moving image when the line of sight turns toward the second information image 220.
- the processor 16 normally displays the second information image 220 as a still image and, when the line of sight is turned toward it, turns part or all of the second information image 220 into a moving image. Since the first information image 210 lowers its visibility when viewed, whereas the second information image 220 becomes a moving image when viewed, the driver recognizes that its information differs from that indicated by the first information image 210, and attention can be directed to the second information image 220.
- the second information image 220 may be changed back to a still image after being displayed as a moving image for a certain period. Further, the processor 16 may reduce the visibility of the viewed second information image 220 at the same time that it is displayed as a moving image, or may reduce the visibility after the moving image has been displayed.
- when the driver views a predetermined second information image 220, that second information image 220 and other second information images 220 may be displayed as moving images.
- when the driver 4 views the second FCW image 222 displayed corresponding to the pedestrian 313, the second FCW image 222 and the first FCW image 221 displayed corresponding to the other preceding vehicle 312 may both be made moving images. By displaying the other information images in the same manner, visual attention can be directed not only to the viewed image but also to similar images.
- the predetermined second information image 220 and another second information image 220 near the predetermined second information image 220 may be displayed as moving images.
- the predetermined second information image 220 and other second information images 220 may be displayed as moving images at the same cycle. This makes it easy to identify where images similar to the viewed image are displayed.
- while the first information image 210 is displayed, the visibility of the first information image 210 may be increased.
- the related images (213, 214) related to the first information image 210 may be displayed in a second display area 102 different from the first display area 101 in which the first information image 210 or the second information image 220 is displayed. That is, the area of the foreground 300 that overlaps the first information image 210 becomes easy to see, and the information indicated by the first information image 210 can be confirmed in the separate second display area 102.
- the processor 16 may reduce the visibility of one first information image 210 and newly display two or more related images. As a result, more information can be conveyed by the related images, and the loss of information recognition caused by the reduced visibility of the first information image 210, which is an AR image, can be suppressed.
- Road information image (Third information image), 232 ... Velocity image (third information image), 300 ... Foreground, 310 ... Real object, 311 ... Road surface, 311a ... Marking line, 311b ... Marking line, 312 ... Leading vehicle, 313 ... Pedestrian, 314 ... Building, 320 ... Lane, 401 ... Vehicle ECU, 403 ... Road information database, 405 ... Own vehicle position detection unit, 407 ... Exterior sensor, 409 ... Line-of-sight direction detection unit, 411 ... Eye position detection unit, 413 ... Mobile information terminal, 420 ... External communication connection device, 502 ... Real object related information detection module, 504 ... Notification necessity detection module, 506 ... Image type determination module, 508 ... Image position determination module, 510 ... Image size determination module, 512 ... Eye position detection module, 514 ... Visual detection module, 516 ... Behavior determination module, 518 ... Graphic module, GZ ... Gaze position
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Traffic Control Systems (AREA)
- Instrument Panels (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The present invention improves the visibility of a real scene ahead of a driver's field of view, as well as of an image displayed overlapping the real scene. A display control device that controls an image display unit to display an image in a region overlapping a foreground as viewed from a driver of a vehicle displays a first information image (210) and a second information image (220) on the image display unit and detects the line-of-sight direction of the driver of the vehicle; when the first information image (210) is determined to be visually recognized, the device reduces its visibility, and when the second information image (220) is determined to be visually recognized, the device prevents its visibility from becoming lower than that of the first information image (210).
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201980076258.0A CN113165510B (zh) | 2018-11-23 | 2019-11-20 | 显示控制装置、方法和计算机程序 |
| DE112019005849.5T DE112019005849T5 (de) | 2018-11-23 | 2019-11-20 | Anzeigesteuervorrichtung, Anzeigesteuerverfahren und Computerprogramm zur Anzeigesteuerung |
| JP2020557598A JP7255608B2 (ja) | 2018-11-23 | 2019-11-20 | 表示制御装置、方法、及びコンピュータ・プログラム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018219802 | 2018-11-23 | ||
| JP2018-219802 | 2018-11-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020105685A1 true WO2020105685A1 (fr) | 2020-05-28 |
Family
ID=70773132
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/045494 Ceased WO2020105685A1 (fr) | 2018-11-23 | 2019-11-20 | Dispositif, procédé et programme informatique de commande d'affichage |
Country Status (4)
| Country | Link |
|---|---|
| JP (1) | JP7255608B2 (fr) |
| CN (1) | CN113165510B (fr) |
| DE (1) | DE112019005849T5 (fr) |
| WO (1) | WO2020105685A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022095787A (ja) * | 2021-06-25 | 2022-06-28 | 阿波▲羅▼智▲聯▼(北京)科技有限公司 | 表示方法、装置、端末デバイス、コンピュータ可読記憶媒体、およびコンピュータプログラム |
| JP2023015543A (ja) * | 2021-07-20 | 2023-02-01 | 日本精機株式会社 | 表示装置 |
| EP4265463A1 (fr) * | 2022-04-19 | 2023-10-25 | Volkswagen Ag | Véhicule, affichage tête haute, dispositif de réalité augmentée, appareils, procédés et programmes informatiques pour commander un dispositif de réalité augmentée et pour commander un dispositif de visualisation |
| JP2024078189A (ja) * | 2022-11-29 | 2024-06-10 | トヨタ自動車株式会社 | 車両用表示制御装置 |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7487714B2 (ja) * | 2021-08-31 | 2024-05-21 | トヨタ自動車株式会社 | 表示制御装置、表示システム、表示方法及び表示プログラム |
| CN116572837A (zh) * | 2023-04-27 | 2023-08-11 | 江苏泽景汽车电子股份有限公司 | 一种信息显示控制方法及装置、电子设备、存储介质 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007045169A (ja) * | 2005-08-05 | 2007-02-22 | Aisin Aw Co Ltd | 車両用情報処理装置 |
| JP2017039373A (ja) * | 2015-08-19 | 2017-02-23 | トヨタ自動車株式会社 | 車両用映像表示システム |
| JP2017097687A (ja) * | 2015-11-26 | 2017-06-01 | 矢崎総業株式会社 | 車両用情報提示装置 |
| JP2017226272A (ja) * | 2016-06-21 | 2017-12-28 | 日本精機株式会社 | 車両用情報提供装置 |
Family Cites Families (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8330812B2 (en) * | 1995-05-30 | 2012-12-11 | Simulated Percepts, Llc | Method and apparatus for producing and storing, on a resultant non-transitory storage medium, computer generated (CG) video in correspondence with images acquired by an image acquisition device tracked in motion with respect to a 3D reference frame |
| JP3877127B2 (ja) * | 2000-06-15 | 2007-02-07 | マツダ株式会社 | 車両用表示装置 |
| JP4698002B2 (ja) * | 2000-07-11 | 2011-06-08 | マツダ株式会社 | 車両の表示装置 |
| JP3870409B2 (ja) * | 2000-08-03 | 2007-01-17 | マツダ株式会社 | 車両用表示装置 |
| JP2002293162A (ja) * | 2001-03-30 | 2002-10-09 | Yazaki Corp | 車両用表示装置 |
| JP4026144B2 (ja) * | 2004-01-20 | 2007-12-26 | マツダ株式会社 | 車両用画像表示装置 |
| JP4715718B2 (ja) * | 2006-10-24 | 2011-07-06 | 株式会社デンソー | 車両用表示装置 |
| JP2008282168A (ja) * | 2007-05-09 | 2008-11-20 | Toyota Motor Corp | 意識状態検出装置 |
| JP2009292409A (ja) * | 2008-06-09 | 2009-12-17 | Yazaki Corp | ヘッドアップディスプレイ装置 |
| JP5245930B2 (ja) * | 2009-03-09 | 2013-07-24 | 株式会社デンソー | 車載表示装置 |
| JP5842419B2 (ja) * | 2011-07-06 | 2016-01-13 | 日本精機株式会社 | ヘッドアップディスプレイ装置 |
| JP5406328B2 (ja) * | 2012-03-27 | 2014-02-05 | 株式会社デンソーアイティーラボラトリ | 車両用表示装置、その制御方法及びプログラム |
| JP6232691B2 (ja) * | 2012-07-27 | 2017-11-22 | 株式会社Jvcケンウッド | 車両用表示制御装置、車両用表示装置及び車両用表示制御方法 |
| WO2014097404A1 (fr) * | 2012-12-18 | 2014-06-26 | パイオニア株式会社 | Affichage à tête haute, procédé de commande, programme et support d'informations |
| JP6037923B2 (ja) * | 2013-04-08 | 2016-12-07 | 三菱電機株式会社 | 表示情報生成装置および表示情報生成方法 |
| JP6413207B2 (ja) * | 2013-05-20 | 2018-10-31 | 日本精機株式会社 | 車両用表示装置 |
| JP2015041969A (ja) * | 2013-08-23 | 2015-03-02 | ソニー株式会社 | 画像取得装置及び画像取得方法、並びに情報配信システム。 |
| JP6225379B2 (ja) * | 2013-12-23 | 2017-11-08 | 日本精機株式会社 | 車両情報投影システム |
| JP6253417B2 (ja) * | 2014-01-16 | 2017-12-27 | 三菱電機株式会社 | 車両情報表示制御装置 |
| JP6443716B2 (ja) * | 2014-05-19 | 2018-12-26 | 株式会社リコー | 画像表示装置、画像表示方法及び画像表示制御プログラム |
| JP6348791B2 (ja) * | 2014-07-16 | 2018-06-27 | クラリオン株式会社 | 表示制御装置および表示制御方法 |
| JP6379779B2 (ja) * | 2014-07-16 | 2018-08-29 | 日産自動車株式会社 | 車両用表示装置 |
| JP2016031603A (ja) * | 2014-07-28 | 2016-03-07 | 日本精機株式会社 | 車両用表示システム |
| JP2016055801A (ja) * | 2014-09-11 | 2016-04-21 | トヨタ自動車株式会社 | 車載表示装置 |
| JP6504431B2 (ja) * | 2014-12-10 | 2019-04-24 | 株式会社リコー | 画像表示装置、移動体、画像表示方法及びプログラム |
| JP2016107947A (ja) * | 2014-12-10 | 2016-06-20 | 株式会社リコー | 情報提供装置、情報提供方法及び情報提供用制御プログラム |
| US10232772B2 (en) * | 2015-03-26 | 2019-03-19 | Mitsubishi Electric Corporation | Driver assistance system |
| WO2017094427A1 (fr) * | 2015-12-01 | 2017-06-08 | 日本精機株式会社 | Afficheur tête haute |
| JP2017138350A (ja) * | 2016-02-01 | 2017-08-10 | アルプス電気株式会社 | 画像表示装置 |
| JP6272375B2 (ja) * | 2016-03-18 | 2018-01-31 | 株式会社Subaru | 車両用表示制御装置 |
| JP2017200786A (ja) * | 2016-05-02 | 2017-11-09 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
| JP2016193723A (ja) * | 2016-06-24 | 2016-11-17 | パイオニア株式会社 | 表示装置、プログラム、及び記憶媒体 |
| JP2018022958A (ja) * | 2016-08-01 | 2018-02-08 | 株式会社デンソー | 車両用表示制御装置及び車両用モニタシステム |
| JP6643969B2 (ja) * | 2016-11-01 | 2020-02-12 | 矢崎総業株式会社 | 車両用表示装置 |
| JP2018120135A (ja) * | 2017-01-26 | 2018-08-02 | 日本精機株式会社 | ヘッドアップディスプレイ |
2019
- 2019-11-20 CN CN201980076258.0A patent/CN113165510B/zh active Active
- 2019-11-20 DE DE112019005849.5T patent/DE112019005849T5/de active Pending
- 2019-11-20 WO PCT/JP2019/045494 patent/WO2020105685A1/fr not_active Ceased
- 2019-11-20 JP JP2020557598A patent/JP7255608B2/ja active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007045169A (ja) * | 2005-08-05 | 2007-02-22 | Aisin Aw Co Ltd | 車両用情報処理装置 |
| JP2017039373A (ja) * | 2015-08-19 | 2017-02-23 | トヨタ自動車株式会社 | 車両用映像表示システム |
| JP2017097687A (ja) * | 2015-11-26 | 2017-06-01 | 矢崎総業株式会社 | 車両用情報提示装置 |
| JP2017226272A (ja) * | 2016-06-21 | 2017-12-28 | 日本精機株式会社 | 車両用情報提供装置 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022095787A (ja) * | 2021-06-25 | 2022-06-28 | 阿波▲羅▼智▲聯▼(北京)科技有限公司 | 表示方法、装置、端末デバイス、コンピュータ可読記憶媒体、およびコンピュータプログラム |
| JP2023015543A (ja) * | 2021-07-20 | 2023-02-01 | 日本精機株式会社 | 表示装置 |
| EP4265463A1 (fr) * | 2022-04-19 | 2023-10-25 | Volkswagen Ag | Véhicule, affichage tête haute, dispositif de réalité augmentée, appareils, procédés et programmes informatiques pour commander un dispositif de réalité augmentée et pour commander un dispositif de visualisation |
| JP2024078189A (ja) * | 2022-11-29 | 2024-06-10 | トヨタ自動車株式会社 | 車両用表示制御装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN113165510A (zh) | 2021-07-23 |
| JPWO2020105685A1 (ja) | 2021-11-04 |
| JP7255608B2 (ja) | 2023-04-11 |
| CN113165510B (zh) | 2024-01-30 |
| DE112019005849T5 (de) | 2021-09-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7255608B2 (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
| US11803053B2 (en) | Display control device and non-transitory tangible computer-readable medium therefor | |
| US20210104212A1 (en) | Display control device, and nontransitory tangible computer-readable medium therefor | |
| JP2023174676A (ja) | 車両用表示制御装置、方法およびプログラム | |
| JP2015077876A (ja) | ヘッドアップディスプレイ装置 | |
| US20240042857A1 (en) | Vehicle display system, vehicle display method, and computer-readable non-transitory storage medium storing vehicle display program | |
| JP2020032866A (ja) | 車両用仮想現実提供装置、方法、及びコンピュータ・プログラム | |
| JP2024029051A (ja) | 車載表示装置、方法およびプログラム | |
| JP7459883B2 (ja) | 表示制御装置、ヘッドアップディスプレイ装置、及び方法 | |
| US12485755B2 (en) | Display control device, display system, and display control method | |
| JP6186905B2 (ja) | 車載表示装置およびプログラム | |
| JP2020117104A (ja) | 表示制御装置、表示システム、方法、及びコンピュータ・プログラム | |
| JP2020117105A (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
| JP2020199883A (ja) | 表示制御装置、ヘッドアップディスプレイ装置、方法、及びコンピュータ・プログラム | |
| JP2021133874A (ja) | 表示制御装置、ヘッドアップディスプレイ装置、及び方法 | |
| JP7619007B2 (ja) | 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法 | |
| WO2020158601A1 (fr) | Dispositif, procédé et programme informatique de commande d'affichage | |
| JP7255596B2 (ja) | 表示制御装置、ヘッドアップディスプレイ装置 | |
| JP2021160409A (ja) | 表示制御装置、画像表示装置、及び方法 | |
| WO2023213416A1 (fr) | Procédé et dispositif utilisateur pour la détection d'un environnement du dispositif utilisateur | |
| JP2020121607A (ja) | 表示制御装置、方法、及びコンピュータ・プログラム | |
| JP2020121704A (ja) | 表示制御装置、ヘッドアップディスプレイ装置、方法、及びコンピュータ・プログラム | |
| JP7434894B2 (ja) | 車両用表示装置 | |
| JP7738382B2 (ja) | 車両用表示装置 | |
| JP7635559B2 (ja) | 表示制御装置、ヘッドアップディスプレイ装置、及び表示制御方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19887706 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2020557598 Country of ref document: JP Kind code of ref document: A |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19887706 Country of ref document: EP Kind code of ref document: A1 |