
WO2020158601A1 - Display control device, method, and computer program - Google Patents

Display control device, method, and computer program

Info

Publication number
WO2020158601A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
update cycle
real object
display update
setting process
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/002504
Other languages
English (en)
Japanese (ja)
Inventor
誠 秦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Seiki Co Ltd
Original Assignee
Nippon Seiki Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Seiki Co Ltd filed Critical Nippon Seiki Co Ltd
Publication of WO2020158601A1

Classifications

    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/21 Output arrangements from vehicle to user using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • B60K35/23 Head-up displays [HUD]
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60K35/85 Arrangements for transferring vehicle- or driver-related data
    • B60K37/20 Dashboard panels
    • B60R11/02 Arrangements for holding or mounting radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G02B27/01 Head-up displays
    • G08G1/16 Anti-collision systems
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Graphic pattern display with means for controlling the display position

Definitions

  • The present disclosure relates to a display control device, method, and computer program that are used in a vehicle and superimpose an image on the foreground of the vehicle for visual recognition.
  • Patent Document 1 discloses a display device that detects the position of an object (another vehicle) around the vehicle, predicts the size and position of the object after a predetermined time from the relative speed between the vehicle and the object, generates image data accordingly, and performs display that follows the position of the object.
  • The outline of this disclosure relates to promptly directing visual attention to a target. More specifically, it relates to prompting visual attention while smoothing the display transition of an image by using a predicted position of a real object.
  • The display control device described in this specification can execute a first position setting process, which sets the position of the image 200 based on the position of the real object acquired immediately before, and a second position setting process, which sets the position of the image 200 based on the predicted position of the real object in the display update cycle of the image 200, predicted from the positions of one or more real objects acquired in the past, including at least the position acquired immediately before.
  • In a first display update cycle Fα, which comes immediately after the position of the real object is acquired, the first position setting process is executed; in a second display update cycle Fβ, which does not come immediately after the position of the real object is acquired, the second position setting process is executed.
  • Alternatively, the second position setting process may be executed in both the first display update cycle Fα and the second display update cycle Fβ (a sketch of this dispatch appears below).
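  • As a rough illustration of this per-cycle dispatch, the following sketch (not from the patent; the class, method names, and history length are invented for illustration) switches between the two position setting processes:

```python
from collections import deque

class RealObjectPositionSetter:
    """Illustrative sketch: per display update cycle, pick the first or
    second position setting process (hypothetical names throughout)."""

    def __init__(self, history_len=4):
        self.observations = deque(maxlen=history_len)  # past observation positions
        self.fresh = False  # True only in the cycle right after a new observation

    def on_observation(self, position):
        self.observations.append(position)
        self.fresh = True

    def specific_position(self, predict):
        """predict: callable mapping past observations to a predicted position."""
        if self.fresh:
            self.fresh = False
            # first display update cycle F-alpha: use the observation itself
            return self.observations[-1]
        # second display update cycle F-beta: use the predicted position
        return predict(list(self.observations))
```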
  • FIG. 2 is a block diagram of a vehicular display system according to some embodiments.
  • FIG. 3 is a diagram showing a first AR image displayed in association with a first real object and a second AR image displayed in association with a second real object, according to some embodiments.
  • FIG. 4 is a diagram explaining how the specific position of the first real object and the specific position of the second real object are set for each display update cycle of the AR image, according to some embodiments.
  • The present invention is not limited to the following embodiments (including the contents of the drawings); changes (including deletion of components) can of course be made to the following embodiments. In the following description, descriptions of known technical matters are omitted as appropriate to facilitate understanding of the present invention.
  • The image display unit 11 of the vehicle display system 10 is a head-up display (HUD) device provided in the dashboard 5 of the host vehicle 1. The HUD device emits display light 11a toward the front windshield 2 (an example of a projection-target member), displaying the image 200 in the virtual display area 100, so that the image 200 is visually recognized overlaid on the foreground 300, the real space seen through the front windshield 2.
  • In the following description, with the driver 4 seated in the driver's seat of the host vehicle 1 and facing the front of the host vehicle 1, the left-right direction is the X axis (the left direction being the positive X direction), the up-down direction is the Y axis (the upward direction being the positive Y direction), and the front-back direction is the Z axis (the forward direction being the positive Z direction).
  • The image display unit 11 may instead be a head-mounted display (hereinafter, HMD) device. The driver 4 wears the HMD device on the head and, seated in the seat of the host vehicle 1, visually recognizes the displayed image 200 superimposed on the foreground 300 through the front windshield 2 of the host vehicle 1.
  • The display area 100 in which the vehicle display system 10 displays the predetermined image 200 is fixed at a specific position in the coordinate system of the host vehicle 1; when the driver 4 looks in that direction, the image 200 displayed in the display area 100 fixed at that specific position can be visually recognized.
  • Under the control of a display control device 13 described later, the image display unit 11 displays the image 200 near a real object 310 existing in the foreground 300, the real space (real scene) visually recognized through the front windshield 2 of the host vehicle 1, such as an obstacle (pedestrian, bicycle, motorcycle, other vehicle, etc.), a road surface, a road sign, or a feature (building, bridge, etc.); at a position overlapping the real object 310; or at a position set with reference to the real object 310 (each being an example of a specific positional relationship between the image and the real object).
  • The image display unit 11 can display an AR (augmented reality) image 210, whose display position changes according to the position of the real object 310 (described in detail later), and a non-AR image (not shown), whose display position does not change according to the position of the real object 310.
  • FIG. 2 is a block diagram of a vehicle display system 10 according to some embodiments.
  • the vehicle display system 10 includes an image display unit 11 and a display control device 13 that controls the image display unit 11.
  • the display controller 13 comprises one or more I/O interfaces 14, one or more processors 16, one or more memories 18, and one or more image processing circuits 20.
  • the various functional blocks depicted in FIG. 2 may be implemented in hardware, software, or a combination of both.
  • FIG. 2 shows only one example of an implementation; the illustrated components may be combined into fewer components, or additional components may be present.
  • The image processing circuit 20 may be, for example, a graphics processing unit (GPU).
  • The processor 16 and the image processing circuit 20 are operably connected to the memory 18. More specifically, by executing programs stored in the memory 18, the processor 16 and the image processing circuit 20 can operate the vehicle display system 10, for example generating and/or transmitting image data.
  • The processor 16 and/or the image processing circuit 20 can include at least one general-purpose microprocessor (e.g., a central processing unit (CPU)), at least one application-specific integrated circuit (ASIC), at least one field-programmable gate array (FPGA), or any combination thereof.
  • The memory 18 includes any type of magnetic media such as hard disks, any type of optical media such as CDs and DVDs, and any type of semiconductor memory, both volatile and non-volatile. Volatile memory may include DRAM and SRAM, and non-volatile memory may include ROM and NVROM.
  • the processor 16 is operably connected to the I/O interface 14.
  • The I/O interface 14 may include a wireless communication interface for connecting the vehicle display system 10 to a personal area network (PAN) such as a Bluetooth (registered trademark) network, a local area network (LAN) such as an 802.11x Wi-Fi (registered trademark) network, and/or a wide area network (WAN) such as a 4G or LTE cellular network.
  • the I/O interface 14 may include a wired communication interface such as, for example, a USB port, a serial port, a parallel port, an OBDII, and/or any other suitable wired communication port.
  • The processor 16 is operably connected to the I/O interface 14 so that information can be exchanged with the various other electronic devices connected to the vehicle display system 10 (I/O interface 14).
  • To the I/O interface 14 are operably connected, for example, a vehicle ECU 401 provided in the host vehicle 1, a road information database 403, a host vehicle position detection unit 405, a vehicle exterior sensor 407, a line-of-sight direction detection unit 409, an eye position detection unit 411, a portable information terminal 413, and a vehicle exterior communication connection device 420.
  • the image display unit 11 is operably connected to the processor 16 and the image processing circuit 20.
  • the image displayed by the image display unit 11 may be based on the image data received from the processor 16 and/or the image processing circuit 20.
  • the processor 16 and the image processing circuit 20 control the image displayed by the image display unit 11 based on the information obtained from the I/O interface 14.
  • the I/O interface 14 may include a function of processing (converting, calculating, analyzing) information received from another electronic device or the like connected to the vehicle display system 10.
  • The host vehicle 1 includes a vehicle ECU 401 that acquires the state of the host vehicle 1 (for example, mileage, vehicle speed, accelerator pedal opening, engine throttle opening, injector fuel injection amount, engine speed, motor speed, steering angle, shift position, drive mode, etc.). The vehicle ECU 401 controls each unit of the host vehicle 1 and can, for example, transmit vehicle speed information indicating the current vehicle speed of the host vehicle 1 to the processor 16.
  • vehicle ECU 401 may transmit the determination result of the data detected by the sensor and/or the analysis result to processor 16 in addition to or instead of simply transmitting the data detected by the sensor to processor 16. For example, information indicating whether the host vehicle 1 is traveling at a low speed or stopped may be transmitted to the processor 16.
  • The vehicle ECU 401 may transmit to the I/O interface 14 an instruction signal specifying the image 200 to be displayed by the vehicle display system 10; in that case, the coordinates of the image 200, the notification necessity degree of the image 200, and/or necessity-degree-related information serving as a basis for determining the notification necessity degree may be added to the instruction signal.
  • the host vehicle 1 may include a road information database 403 including a navigation system and the like.
  • Based on the position of the host vehicle 1 acquired from the host vehicle position detection unit 405 described later, the road information database 403 may read out road information around the host vehicle 1 (lanes, white lines, stop lines, crosswalks, road width, number of lanes, intersections, curves, branch roads, traffic regulations, etc.), as well as the presence or absence, position (including the distance to the host vehicle 1), direction (with respect to the host vehicle 1), shape, type, and detailed information of features (buildings, bridges, rivers, etc.), and transmit them to the processor 16. The road information database 403 may also calculate an appropriate route from the departure point to the destination and transmit it to the processor 16 as navigation information.
  • the host vehicle 1 may include a host vehicle position detection unit 405 such as a GNSS (Global Navigation Satellite System).
  • The road information database 403, the portable information terminal 413 described later, and/or the vehicle exterior communication connection device 420 can acquire the position information of the host vehicle 1 from the host vehicle position detection unit 405 continuously, intermittently, or at every predetermined event, and can select and/or generate information around the host vehicle 1 and transmit it to the processor 16.
  • the host vehicle 1 may include one or more vehicle exterior sensors 407 that detect the real objects 310 existing around the host vehicle 1 (front, side, and rear).
  • The real objects 310 detected by the vehicle exterior sensor 407 may include, for example, pedestrians, bicycles, motorcycles, other vehicles (such as the preceding vehicle 320), road surfaces (such as the traveling lane 330), marking lines, roadside objects, and/or features (such as buildings).
  • The vehicle exterior sensor 407 may consist of, for example, a radar sensor such as a millimeter-wave radar, an ultrasonic radar, or a laser radar, and/or a camera sensor consisting of a camera and an image processing device; it may be configured as a combination of a radar sensor and a camera sensor, or as only one of them.
  • A conventionally known method is applied to object detection by the radar sensor or camera sensor.
  • By these sensors, the position of a real object (the relative distance from the host vehicle 1, its left-right position with respect to the traveling direction of the host vehicle 1, its vertical position, etc.), its size (horizontal (left-right) dimension, height (up-down) dimension, etc.), its movement direction (lateral (left-right) and depth (front-back)), its movement speed (lateral (left-right) and depth (front-back)), and/or its type may be detected.
  • The one or more vehicle exterior sensors 407 detect real objects in front of the host vehicle 1 in each detection cycle of each sensor, and transmit real-object-related information (an example being the presence or absence of a real object and, for each real object that exists, information such as its position, size, and/or type) to the processor 16.
  • The real-object-related information may also be transmitted to the processor 16 via another device (for example, the vehicle ECU 401).
  • When a camera is used as the sensor, an infrared or near-infrared camera is desirable so that real objects can be detected even when the surroundings are dark, such as at night; in addition, a stereo camera capable of acquiring distance and the like by parallax is desirable.
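  • By way of illustration only (the patent does not prescribe a data format), real-object-related information such as the above might be represented as a record like the following; every field name here is an assumption:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RealObjectInfo:
    """Hypothetical record for real-object-related information from a sensor."""
    object_type: str                 # e.g. "pedestrian", "vehicle", "lane"
    x: float                         # lateral position (m), left positive
    y: float                         # vertical position (m), up positive
    z: float                         # depth position (m), forward positive
    width: Optional[float] = None    # lateral size (m), if available
    height: Optional[float] = None   # vertical size (m), if available
    vx: Optional[float] = None       # lateral speed (m/s), if available
    vz: Optional[float] = None       # depth speed (m/s), if available
```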
  • The host vehicle 1 may include a line-of-sight direction detection unit 409, comprising an infrared camera that captures the face of the driver 4, for detecting the gaze direction of the driver 4 (hereinafter also referred to as the "line-of-sight direction").
  • the processor 16 can specify the line-of-sight direction of the driver 4 by acquiring an image captured by the infrared camera (an example of information that can estimate the line-of-sight direction) and analyzing the captured image. Note that the processor 16 may acquire the line-of-sight direction of the driver 4 specified by the line-of-sight direction detection unit 409 (or another analysis unit) from the image captured by the infrared camera from the I/O interface 14.
  • The method of acquiring the line-of-sight direction of the driver 4 of the host vehicle 1, or information from which the line-of-sight direction can be estimated, is not limited to these; other known gaze direction detection (estimation) techniques may be used, such as the EOG (electro-oculogram) method, the corneal reflection method, the scleral reflection method, the Purkinje image detection method, the search coil method, and the infrared fundus camera method.
  • the host vehicle 1 may include an eye position detection unit 411 including an infrared camera that detects the position of the eyes of the driver 4.
  • the processor 16 can specify the eye position of the driver 4 by acquiring an image (an example of information that can estimate the eye position) captured by the infrared camera and analyzing the captured image.
  • the processor 16 may acquire the information on the position of the eyes of the driver 4 identified from the image captured by the infrared camera from the I/O interface 14.
  • The method of acquiring the position of the eyes of the driver 4 of the host vehicle 1, or information from which the eye position can be estimated, is not limited to these; the eye position may be acquired using other known eye position detection (estimation) techniques.
  • The processor 16 may adjust at least the position of the image 200 based on the detected position of the eyes of the driver 4, so that the image 200 is visually recognized by the viewer (driver 4) superimposed on a desired position of the foreground 300.
  • the mobile information terminal 413 is a smartphone, a laptop computer, a smart watch, or other information device that can be carried by the driver 4 (or another occupant of the vehicle 1).
  • The I/O interface 14 can communicate with the mobile information terminal 413 and acquires data recorded in the mobile information terminal 413 (or in a server accessed via the mobile information terminal).
  • The mobile information terminal 413 may, for example, have the same functions as the road information database 403 and the host vehicle position detection unit 405 described above, acquire the road information (an example of real-object-related information), and transmit it to the processor 16.
  • the mobile information terminal 413 may also acquire commercial information (an example of real object-related information) related to a commercial facility in the vicinity of the host vehicle 1 and send it to the processor 16.
  • The portable information terminal 413 may transmit schedule information of its owner (for example, the driver 4), incoming call information, mail reception information, and the like to the processor 16, and the processor 16 and/or the image processing circuit 20 may generate and/or transmit image data relating to these.
  • The vehicle exterior communication connection device 420 is a communication device for exchanging information with the host vehicle 1; it includes, for example, other vehicles connected to the host vehicle 1 by vehicle-to-vehicle communication (V2V: Vehicle To Vehicle), pedestrians (portable information terminals carried by pedestrians) connected by vehicle-to-pedestrian communication (V2P: Vehicle To Pedestrian), and network communication devices connected by road-to-vehicle communication (V2I: Vehicle To road Infrastructure), and broadly includes everything connected by V2X (Vehicle To Everything).
  • The vehicle exterior communication connection device 420 may acquire the positions of, for example, pedestrians, bicycles, motorcycles, other vehicles (such as a preceding vehicle), road surfaces, marking lines, roadside objects, and/or features (such as buildings), and transmit them to the processor 16.
  • The vehicle exterior communication connection device 420 may have the same function as the host vehicle position detection unit 405 described above, acquiring the position information of the host vehicle 1 and transmitting it to the processor 16; it may further have the function of the road information database 403 described above, acquiring road information (an example of real-object-related information) and transmitting it to the processor 16.
  • the information acquired from the vehicle exterior communication connection device 420 is not limited to the above.
  • The software components stored in the memory 18 include a real object related information detection module 502, a real object position setting module 504, a difference determination module 506, a distance determination module 508, a speed determination module 510, a notification necessity determination module 512, an image position determination module 514, an image size determination module 516, and a graphics module 518.
  • the real object related information detection module 502 acquires information (also called real object related information) including at least the position of the real object 310 existing in front of the host vehicle 1.
  • For example, the real object related information detection module 502 may use the vehicle exterior sensor 407 to acquire the position of a real object 310 existing in the foreground 300 of the host vehicle 1 (its position in the height direction (up-down direction) and lateral direction (left-right direction) as seen by the driver 4 in the driver's seat looking in the traveling direction (forward) of the host vehicle 1, to which the position in the depth direction (forward direction) may be added), together with its size and type, as real-object-related information.
  • The real object related information detection module 502 may also use the vehicle exterior communication connection device 420 to acquire information (examples of real-object-related information) indicating the position, relative speed, and type of a real object (another vehicle), the lighting state of the direction indicator of the other vehicle, its steering operation state, and/or its planned traveling route and traveling schedule provided by its driving support system.
  • The real object related information detection module 502 may detect the positions of the left lane marking 331 (see FIG. 3) and the right lane marking 332 (see FIG. 3) of the traveling lane 330 (see FIG. 3) of the host vehicle, and recognize the area between the left and right lane markings 331 and 332 as the traveling lane 330.
  • the real object position setting module 504 acquires an observation position indicating the current position of the real object 310 from the road information database 403, the vehicle exterior sensor 407, the portable information terminal 413, or the vehicle exterior communication connection device 420 via the I/O interface 14. Alternatively, the observation position of the real object obtained by fusing these two or more observation positions is acquired, and the position of the real object 310 (also referred to as a specific position) is set based on the acquired observation position.
  • An image position determination module 514 which will be described later, determines the position of the AR image 210 based on the specific position of the real object 310 set by the real object position setting module 504.
  • The real object position setting module 504 can execute a first position setting process, which sets the specific position of the real object 310 based on the observation position of the real object 310 acquired immediately before, and a second position setting process, which sets the position of the AR image 210 based on the predicted position of the real object 310 in the display update cycle of the AR image 210, predicted from the observation positions of one or more real objects 310 acquired in the past, including at least the observation position acquired immediately before.
  • In each display update cycle, the real object position setting module 504 sets the specific position of the real object 310, which serves as the reference of the display position, by the first position setting process or the second position setting process; in the second position setting process, this specific position is calculated as a predicted position.
  • FIG. 3 is a diagram showing a first AR image 220 displayed in association with the first real object 320 and a second AR image 230 displayed in association with the second real object 330.
  • The first AR image 220 is an arc-shaped warning image that draws attention to the preceding vehicle (first real object) 320 traveling ahead in the traveling lane 330 of the host vehicle 1 and is visually recognized so as to surround the preceding vehicle 320 from behind.
  • The second AR image 230 is a route image in the shape of a single arrow that shows the planned route of the host vehicle 1 and is visually recognized overlapping the traveling lane (second real object) 330 of the host vehicle 1.
  • Using the real object related information detection module 502, the processor 16 acquires the observation position Im of the preceding vehicle (first real object) 320, which serves as the reference for the position where the warning image (first AR image) 220 is displayed, and the observation position In indicating the position of the traveling lane (second real object) 330, which serves as the reference for the position where the route image (second AR image) 230 is displayed.
  • From the observation positions Im to Im-5 of the first real object 320 acquired in the past, including the observation position Im acquired immediately before, the processor 16 sets the specific positions Pk+2 to Pk-2 of the first real object 320 for each display update cycle k+2 to k-2 of the AR image 210, and likewise sets the specific positions Qk+2 to Qk-2 of the second real object 330 from the observation positions In to In-5 for each display update cycle k+2 to k-2 of the AR image 210.
  • FIG. 4 is a diagram for explaining how the specific position of the first real object and the specific position of the second real object are set for each display update cycle of the AR image.
  • the display update cycle k+2 is the newest display update cycle, and becomes older in the order of k+1, k, k-1, k-2.
  • Pk+2 (Qk+2) is the newest specific position in the figure.
  • Im (In) is the newest observation position of the first real object 320 (second real object 330) in the figure, and Im-1 (In-1), Im-2 (In-2), and so on are successively older.
  • In the figure, the cycle in which the observation position Im of the first real object 320 is acquired and the cycle in which the observation position In of the second real object 330 is acquired are described as different, but the acquisition cycles may be aligned.
  • the setting of the specific position Pk of the first real object 320 will be described.
  • In setting the specific position Pk of the first real object 320, the processor 16 executes the first position setting process in the first display update cycle Fα, which comes immediately after the observation position Im of the first real object 320 existing in front of the host vehicle 1 is acquired from the one or more I/O interfaces 14, and executes the second position setting process in the second display update cycle Fβ, which does not come immediately after an observation position of the first real object 320 is acquired.
  • In the example of FIG. 4, the display update cycles k-1 and k+1 are first display update cycles Fα, coming immediately after the observation positions Im-1 and Im are acquired, while the display update cycles k-2, k, and k+2 are second display update cycles Fβ, which do not come immediately after an observation position is acquired.
  • Since the display update cycle k is a second display update cycle Fβ, the specific position Pk in the display update cycle k is determined by the second position setting process, specifically by prediction from the four preceding observation positions Im-1, Im-2, Im-3, and Im-4. Since the next display update cycle k+1 is a first display update cycle Fα, the specific position Pk+1 in the display update cycle k+1 is determined by the first position setting process and is set to the observation position Im of the first real object 320 acquired immediately before.
  • The specific position Pk+2 in the display update cycle k+2 is determined by the second position setting process, specifically by prediction from the four preceding observation positions Im, Im-1, Im-2, and Im-3.
  • Any method of calculating the predicted position may be used, as long as the real object position setting module 504 predicts the position in the display update cycle being processed (e.g., the display update cycle k in FIG. 4) based on observation positions acquired in the past (the observation positions Im-1, Im-2, ... in FIG. 4).
  • For example, the real object position setting module 504 may predict the next value from one or more past observation positions using the least squares method or a prediction algorithm such as a Kalman filter, an α-β filter, or a particle filter.
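  • As one concrete possibility (an illustrative sketch only: the patent names these algorithms but does not fix gains or step size, so the values below are assumptions), an α-β filter over past observation positions could produce the predicted position:

```python
def predict_position(observations, alpha=0.85, beta=0.005, dt=1.0):
    """Alpha-beta filter: smooth past observations, then extrapolate one step.

    observations -- past 1-D observation positions, oldest first
                    (e.g. [Im-4, Im-3, Im-2, Im-1]); gains are illustrative.
    Returns the predicted position for the next display update cycle.
    """
    x = observations[0]  # position estimate
    v = 0.0              # velocity estimate
    for z in observations[1:]:
        x_pred = x + v * dt           # extrapolate to the new observation time
        residual = z - x_pred         # innovation: observation minus prediction
        x = x_pred + alpha * residual # correct position with fixed gain
        v = v + (beta / dt) * residual  # correct velocity with fixed gain
    return x + v * dt                 # predicted position one cycle ahead

# Example: lateral positions (m) of a preceding vehicle over four sensor cycles
print(predict_position([0.00, 0.12, 0.26, 0.41]))  # ~ next lateral position
```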
  • In setting the specific position Qk of the second real object 330, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
  • For the second real object 330, the display update cycles k-2, k, and k+2 are first display update cycles Fα, coming immediately after the observation positions In-2, In-1, and In are acquired, and the display update cycles k-1 and k+1 are second display update cycles Fβ, which do not come immediately after an observation position is acquired. That is, although the display update cycle k is a first display update cycle Fα, the specific position Qk in the display update cycle k is determined by the second position setting process.
  • The specific position Qk+1 in the next display update cycle k+1 is also determined by the second position setting process, specifically by prediction from the four preceding observation positions In-1, In-2, In-3, and In-4.
  • The next display update cycle k+2 is a first display update cycle Fα, but the specific position Qk+2 in the display update cycle k+2 is likewise determined by the second position setting process, specifically by prediction from the four preceding observation positions.
  • The difference determination module 506 of FIG. 2 compares, in the first display update cycle Fα, the observation position acquired immediately before with the predicted position, predicted from one or more observation positions including at least the observation position acquired immediately before, and determines whether the difference between them is larger than a predetermined difference threshold stored in the memory 18. When the difference is larger than the predetermined threshold, the processor 16 executes the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; after continuing this for a predetermined number of display update cycles, it may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
  • When the difference is not larger than the predetermined threshold, the processor 16 executes the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ.
  • The memory 18 may store two or more difference thresholds, and the difference determination module 506 may determine the degree of difference between the observation position acquired immediately before and the predicted position in three or more stages. These difference thresholds may also be variable; for example, they may be changed according to the relative speed between the real object 310 and the host vehicle 1, in which case the difference threshold may be set larger as the relative speed increases.
  • The difference determination module 506 may also determine whether the observation position is closer to the host vehicle 1 than the predicted position.
  • When the observation position acquired immediately before is closer to the host vehicle 1 than the predicted position, the processor 16 executes the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; when it is not closer, the processor 16 may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the real object can be assumed to be closer than the predicted position, the AR image 210 can be quickly displayed based on the observation position where the real object is most likely to exist. A sketch of this decision follows.
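  • The following sketch combines the two tests above into one decision function; this combination, the scalar 1-D distances, and all names are assumptions made for illustration, not the patent's specification:

```python
def choose_process_for_f_alpha(observed_dist, predicted_dist, diff_threshold):
    """Decide which process to run in the first display update cycle F-alpha.

    observed_dist, predicted_dist -- distances (m) from the host vehicle to
    the real object: freshly observed vs. predicted for this cycle.
    """
    large_jump = abs(observed_dist - predicted_dist) > diff_threshold
    closer_than_predicted = observed_dist < predicted_dist
    if large_jump or closer_than_predicted:
        # deviation is significant, or the object may be approaching fast:
        # display promptly at the observed position (first process)
        return "first_position_setting"
    # otherwise keep the smooth, prediction-based display (second process)
    return "second_position_setting"
```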
  • The distance determination module 508 determines the degree of distance between the real object 310 and the host vehicle 1. For example, the distance determination module 508 may determine whether the distance between the real object 310 and the host vehicle 1, obtainable by executing the real object related information detection module 502, is longer than a predetermined distance threshold stored in the memory 18.
  • The memory 18 may store two or more distance thresholds, and the distance determination module 508 may determine the degree of distance between the real object 310 and the host vehicle 1 in three or more stages. These distance thresholds may also be variable; for example, the distance determination module 508 may change them according to the relative speed between the real object 310 and the host vehicle 1, in which case the distance threshold may be set longer as the relative speed increases.
  • The speed determination module 510 determines the degree of relative speed between the real object 310 and the host vehicle 1. For example, the speed determination module 510 may calculate the relative speed from the time change of the distance between the real object 310 and the host vehicle 1, obtainable by executing the real object related information detection module 502, and determine whether it is faster than a predetermined relative speed threshold stored in the memory 18.
  • The memory 18 may store two or more relative speed thresholds, and the speed determination module 510 may determine the degree of relative speed between the real object 310 and the host vehicle 1 in three or more stages. These relative speed thresholds may also be variable; for example, the speed determination module 510 may change them according to the distance between the real object 310 and the host vehicle 1, in which case the relative speed threshold may be set higher as the distance increases.
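  • The variable thresholds described above might look like this in code (a sketch only; the linear scaling, base values, and gains are assumptions, not taken from the patent):

```python
def difference_threshold(relative_speed, base=0.5, gain=0.05):
    """Difference threshold (m): grows with relative speed (m/s)."""
    return base + gain * max(relative_speed, 0.0)

def relative_speed_threshold(distance, base=2.0, gain=0.1):
    """Relative speed threshold (m/s): grows with distance to the object (m)."""
    return base + gain * max(distance, 0.0)

# Example: a fast-closing object tolerates larger observed-vs-predicted gaps
print(difference_threshold(relative_speed=10.0))   # 1.0 m
print(relative_speed_threshold(distance=50.0))     # 7.0 m/s
```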
  • the notification necessity degree determination module 512 determines whether or not each image 200 displayed by the vehicle display system 10 is the content to be notified to the driver 4.
  • the notification necessity degree determination module 512 may obtain information from various other electronic devices connected to the I/O interface 14 and calculate the notification necessity degree.
  • Alternatively, an electronic device connected to the I/O interface 14 in FIG. 2 may transmit information to the vehicle ECU 401, and the notification necessity determination module 512 may detect (acquire) the notification necessity degree determined by the vehicle ECU 401 based on the received information.
  • The "notification necessity degree" can be determined based on, for example, a danger degree derived from the seriousness of what could occur, an urgency degree derived from the length of the reaction time required to take countermeasures, an effectiveness derived from the circumstances of the host vehicle 1 or the driver 4 (or other occupants of the host vehicle 1), or a combination of these (the indicators of the notification necessity degree are not limited to these). That is, the notification necessity determination module 512 may determine whether to notify the driver 4, and may choose not to display the warning image 220, the route image 230 described above, or both.
  • The vehicle display system 10 need not itself have the function of estimating (calculating) the notification necessity degree; part or all of that function may be provided separately from the display control device 13 of the vehicle display system 10.
  • The image position determination module 514 determines the coordinates of the image 200 (including at least the left-right direction (X-axis direction) and the up-down direction (Y-axis direction) as seen by the driver 4 looking toward the display area 100 from the driver's seat of the host vehicle 1) based on the determined position (observation position or predicted position) of the real object 310 set by the real object position setting module 504, so that the image 200 is visually recognized in a specific positional relationship with the real object 310. In addition, the image position determination module 514 may determine the front-back direction (Z-axis direction) position based on the determined position of the real object 310 set by the real object position setting module 504.
  • the image position determination module 514 adjusts the position of the image 200 based on the position of the eyes of the driver 4 detected by the eye position detection unit 411. For example, the image position determination module 514 determines the horizontal position and the vertical position of the image 200 so that the center of the image 200 is visually recognized so as to overlap with the center of the real object.
  • the “specific positional relationship” can be adjusted depending on the situation of the real object or the host vehicle 1, the type of the real object, the type of the displayed image, or the like.
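  • As a purely illustrative sketch of such position determination (the patent does not specify this math; the pinhole-style projection and all names below are assumptions), the eye-relative position of a real object could be mapped to display-plane coordinates like this:

```python
def to_display_coords(obj_xyz, eye_xyz, plane_z, scale=1.0):
    """Project a real object's 3-D position onto a virtual display plane.

    obj_xyz, eye_xyz -- (x, y, z) in the vehicle frame (z forward)
    plane_z          -- depth of the virtual display area ahead of the eye
    Returns (u, v) on the display plane, centered on the eye axis.
    """
    dx = obj_xyz[0] - eye_xyz[0]
    dy = obj_xyz[1] - eye_xyz[1]
    dz = obj_xyz[2] - eye_xyz[2]
    if dz <= 0:
        raise ValueError("object must be in front of the eye")
    # similar triangles: intersect the eye-to-object ray with the display plane
    u = scale * dx * (plane_z / dz)
    v = scale * dy * (plane_z / dz)
    return u, v

# Example: object 20 m ahead, 1 m left, viewed against a plane 2.5 m ahead
print(to_display_coords((-1.0, 0.2, 20.0), (0.0, 1.2, 0.0), plane_z=2.5))
```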
  • the image size determination module 516 may change the size of the AR image 210 in accordance with the position and/or size of the real object 310 to be associated. For example, the image size determination module 516 can reduce the size of the AR image 210 if the position of the real object 310 to be associated is distant. Further, the image size determination module 516 can increase the size of the AR image 210 if the size of the real object 310 to be associated is large.
  • The image size determination module 516 may determine the size of the image 200 based on the type and number of real objects with which the image 200 is displayed in association, as detected by the real object related information detection module 502, and/or the notification necessity degree detected (estimated) by the notification necessity determination module 512.
  • the image size determination module 516 may have a function of predicting and calculating the size of the AR image 210 to be displayed in the display update cycle of this time, based on the size of the real object a predetermined number of times in the past.
  • For example, the image size determination module 516 may track pixels of the real object 310 between two past images captured by the camera (an example of the vehicle exterior sensor 407) using the Lucas-Kanade method, predict the size of the real object in the current display update cycle, and determine the size of the AR image according to the predicted size of the real object.
  • Alternatively, the rate of change of the size of the real object may be obtained from the change in its size between the two past captured images, and the size of the AR image may be determined according to that rate of change.
  • The method of estimating the time-series change in the apparent size of the real object is not limited to the above; known methods may be used, including optical flow estimation algorithms such as the Horn-Schunck method, the Buxton-Buxton method, and the Black-Jepson method.
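  • For instance, using OpenCV's pyramidal Lucas-Kanade tracker, the apparent scale change of a real object between two frames could be estimated roughly as follows (a sketch, not the patent's implementation; the bounding box is assumed to come from an upstream detector):

```python
import cv2
import numpy as np

def estimate_scale_change(prev_gray, curr_gray, bbox):
    """Estimate the apparent size change of an object between two frames.

    prev_gray, curr_gray -- 8-bit grayscale frames
    bbox -- (x, y, w, h) of the object in the previous frame
    Tracks corner features inside the box with pyramidal Lucas-Kanade, then
    compares the spread of the tracked points around their centroid.
    """
    x, y, w, h = bbox
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                   qualityLevel=0.01, minDistance=3, mask=mask)
    if pts0 is None:
        return 1.0  # nothing to track: assume unchanged size
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    ok = status.ravel() == 1
    p0, p1 = pts0[ok].reshape(-1, 2), pts1[ok].reshape(-1, 2)
    if len(p0) < 2:
        return 1.0
    # Ratio of mean distances from the centroid approximates the scale change.
    spread0 = np.mean(np.linalg.norm(p0 - p0.mean(axis=0), axis=1))
    spread1 = np.mean(np.linalg.norm(p1 - p1.mean(axis=0), axis=1))
    return spread1 / max(spread0, 1e-6)
```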
  • The graphics module 518 includes various known software components for modifying the visual effects (e.g., brightness, transparency, saturation, contrast, or other visual characteristics), size, display position, and distance (the distance from the driver 4 to the image 200) of the displayed image 200.
  • The graphics module 518 displays the image 200 at the coordinates set by the image position determination module 514 (in the left-right direction (X-axis direction) and up-down direction (Y-axis direction) as seen by the driver 4 looking toward the display area 100 from the driver's seat of the host vehicle 1) and with the image size set by the image size determination module 516, so that the image 200 is visually recognized by the driver 4.
  • As described above, the display control device 13 of the present embodiment controls the image display unit 11, which superimposes the image 200 at a position associated with a real object existing in the foreground as seen from the driver 4 of the host vehicle 1, and comprises one or more I/O interfaces 14, one or more processors 16, a memory 18, and one or more computer programs stored in the memory 18 and configured to be executed by the one or more processors 16.
  • The one or more processors 16 can execute the first position setting process, which sets the position of the image 200 based on the position of the real object acquired immediately before, and the second position setting process, which sets the position of the image 200 based on the predicted position of the real object in the display update cycle of the image, predicted from the positions of one or more real objects acquired in the past, including at least the position acquired immediately before.
  • In the first display update cycle Fα, which comes immediately after the position of the real object existing in front of the host vehicle 1 is acquired from the one or more I/O interfaces 14, the first position setting process is executed; in the second display update cycle Fβ, which does not come immediately after the position of the real object is acquired, the second position setting process is executed. In this way, the display position of the AR image is normally determined from the position of the real object predicted from past observation positions, which prevents the display position of the image from changing abruptly, while in the display update cycle immediately after the observation position of the real object is acquired, the display position of the AR image is determined from that observation position, so that the viewer can quickly grasp the accurate position of the real object.
  • The image 200 may include a first AR image 220 and a second AR image 230 of a different type from the first AR image 220. For the first AR image 220, the one or more processors 16 may execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ, while for the second AR image 230 they may execute the second position setting process in both the first display update cycle Fα and the second display update cycle Fβ. In this way, for one type of displayed image, the position of the image can be updated using only the predicted position of the real object, and a smoothly changing image in which abrupt changes are suppressed can be displayed.
  • The first AR image 220 may be a warning image that calls attention to the real object, and the second AR image 230 may be a route image that shows the route of the host vehicle 1.
  • The first AR image 220 is an image with a relatively high notification necessity, and may be, for example, an image showing a sign such as "stop" on the road surface, a road surface condition that may cause slipping (wet or frozen), or a switching point between automatic and manual driving of the host vehicle 1.
  • The second AR image 230 is an image with a lower notification necessity than the first AR image 220, and may be an image showing road signs, signboards, POI information, information about the direction of the final destination, and the like, which are less critical than a "stop" sign.
  • In the first display update cycle Fα, the one or more processors 16 may compare the position of the real object acquired immediately before with the predicted position of the real object in the display update cycle of the image, predicted from the positions of one or more real objects acquired in the past, including at least the position acquired immediately before. If the difference between them is larger than a predetermined threshold, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the difference is not larger than the predetermined threshold, the second position setting process may be executed in both the first display update cycle Fα and the second display update cycle Fβ.
  • In this way, when the observation position deviates greatly from the predicted position, the image is quickly displayed at a position based on the latest observation position, and the visual attention of the driver 4 can be promptly directed to the AR image and to the real object associated with it.
  • Also, in the first display update cycle Fα, the one or more processors 16 may compare the position of the real object acquired immediately before with the predicted position of the real object in the display update cycle of the image, predicted from the positions of one or more real objects acquired in the past, including at least the position acquired immediately before. If the position of the real object acquired immediately before is closer to the host vehicle 1 than the predicted position, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if it is not closer, the second position setting process may be executed in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the latest observation position of the real object is closer to the host vehicle 1 than the predicted position and a rapid approach of the real object can be assumed, the image is displayed at a position based on the latest observation position, so that the visual attention of the driver 4 can be promptly directed to the AR image and to the real object associated with it.
  • The one or more processors 16 may also acquire, from the one or more I/O interfaces 14, the relative speed between the host vehicle 1 and the real object that serves as the reference for setting the position of the image 200. If the relative speed is faster than a predetermined threshold, the first position setting process is executed in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the relative speed is not faster than the predetermined threshold, the second position setting process may be executed in both the first display update cycle Fα and the second display update cycle Fβ.
  • Likewise, if the real object serving as the reference for setting the position of the image 200 is determined to be approaching, the one or more processors 16 may execute the first position setting process in the first display update cycle Fα and the second position setting process in the second display update cycle Fβ; if the real object is not determined to be approaching, the second position setting process may be executed in both the first display update cycle Fα and the second display update cycle Fβ. In this way, when the real object can be assumed to be approaching, the image is displayed at a position based on the latest observation position, and the visual attention of the driver 4 can be promptly directed to the AR image and to the real object associated with it.
  • the display area 100 is not limited to an arrangement that is substantially along a plane (XY plane) consisting of up, down, left and right as seen from the driver 4.
  • the display region 100 may be rotated about the left-right direction (X-axis direction) viewed from the driver 4 and arranged substantially along the traveling lane 330 (ZX plane).
  • the display area 100 may be a curved surface instead of a flat surface.
  • a stereoscopic display may be adopted for the image display unit 11 and the image 200 may be displayed in the display area 100 which is a three-dimensional area.
  • the second AR image 230 may be a route image with two or more illustrations or the like that are visually recognized to be superimposed on the traveling lane 330 of the host vehicle 1.
  • 409... Line-of-sight direction detection unit, 411... Eye position detection unit, 413... Portable information terminal, 420... Vehicle exterior communication connection device, 502... Real object related information detection module, 504... Real object position setting module, 506... Difference determination module, 508... Distance determination module, 510... Speed determination module, 512... Notification necessity determination module, 514... Image position determination module, 516... Image size determination module, 518... Graphics module, Fα... First display update cycle, Fβ... Second display update cycle

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention smooths the display transition of an image based on a predicted position, which is a predicted position of a real object, and promptly directs visual attention. A display control device can execute a first position setting process for setting a position of an image (210) based on the most recently acquired position of a real object (310), and a second position setting process for setting a position of an image (220) based on a predicted position of the real object (310) in a display update cycle of the image (210), predicted on the basis of one or more previously acquired positions of the real object (310), including at least the most recently acquired position of the real object (310). The display control device executes the first position setting process in a first display update cycle (Fα) immediately after the position of the real object (310) existing in front of the host vehicle is acquired, and executes the second position setting process in a second display update cycle (Fβ) that is not immediately after the position of the real object (310) is acquired.
PCT/JP2020/002504 2019-01-29 2020-01-24 Display control device, method, and computer program Ceased WO2020158601A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019013681 2019-01-29
JP2019-013681 2019-01-29

Publications (1)

Publication Number Publication Date
WO2020158601A1 (fr) 2020-08-06

Family

ID=71840946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/002504 Ceased WO2020158601A1 (fr) Display control device, method, and computer program

Country Status (1)

Country Link
WO (1) WO2020158601A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011119917A * 2009-12-02 2011-06-16 Denso Corp Vehicle display device
JP2015141155A * 2014-01-30 2015-08-03 パイオニア株式会社 Virtual image display device, control method, program, and storage medium
WO2017069038A1 * 2015-10-22 2017-04-27 日本精機株式会社 In-vehicle display system
WO2018105052A1 * 2016-12-07 2018-06-14 三菱電機株式会社 Display control device, display system, and display control method
JP2018151903A * 2017-03-14 2018-09-27 アイシン・エィ・ダブリュ株式会社 Virtual image display device and computer program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023010236A1 * 2021-07-31 2023-02-09 华为技术有限公司 Display method, device, and system

Similar Documents

Publication Publication Date Title
JP7255608B2 (ja) Display control device, method, and computer program
EP3339124B1 (fr) Autonomous driving system
US11803053B2 (en) Display control device and non-transitory tangible computer-readable medium therefor
JP6223630B2 (ja) Display control device, display system, display control method, and display control program
JP2020032866A (ja) Virtual reality providing device for vehicles, method, and computer program
JP2020086884A (ja) Lane marking estimation device, display control device, method, and computer program
US12162354B2 (en) Display control device, head-up display device, and display control method
JP7459883B2 (ja) Display control device, head-up display device, and method
US12485755B2 (en) Display control device, display system, and display control method
WO2016056199A1 (fr) Head-up display device and display method for head-up display
JP2020117104A (ja) Display control device, display system, method, and computer program
WO2020158601A1 (fr) Display control device, method, and computer program
JP2020199883A (ja) Display control device, head-up display device, method, and computer program
JP2020117105A (ja) Display control device, method, and computer program
JP7619007B2 (ja) Display control device, head-up display device, and display control method
JP2020121607A (ja) Display control device, method, and computer program
WO2021200914A1 (fr) Display control device, head-up display device, and method
JP2021160409A (ja) Display control device, image display device, and method
JP2020121704A (ja) Display control device, head-up display device, method, and computer program
JP7635559B2 (ja) Display control device, head-up display device, and display control method
JP7738382B2 (ja) Vehicle display device
JP7434894B2 (ja) Vehicle display device
JP7655103B2 (ja) Display control device, head-up display device, and display control method
JP2020106911A (ja) Display control device, method, and computer program
JP2020086882A (ja) Display control device, method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20749375

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20749375

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP