
WO2020121810A1 - Display control device, display control program, and computer-readable non-transitory tangible recording medium - Google Patents


Info

Publication number
WO2020121810A1
Authority
WO
WIPO (PCT)
Prior art keywords
map information
display
information
precision map
display mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/046318
Other languages
English (en)
Japanese (ja)
Inventor
智 堀畑
祐介 近藤
猛 羽藤
一輝 小島
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019196468A external-priority patent/JP7052786B2/ja
Application filed by Denso Corp filed Critical Denso Corp
Priority to DE112019006171.2T priority Critical patent/DE112019006171T5/de
Publication of WO2020121810A1 publication Critical patent/WO2020121810A1/fr
Priority to US17/222,259 priority patent/US20210223058A1/en

Classifications

    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
        • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
        • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
            • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
                • B60K35/22 Display screens
                • B60K35/23 Head-up displays [HUD]
        • B60K35/80 Arrangements for controlling instruments
            • B60K35/81 Arrangements for controlling instruments for controlling displays
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
        • G01C21/26 Navigation specially adapted for navigation in a road network
            • G01C21/34 Route searching; Route guidance
                • G01C21/36 Input/output arrangements for on-board computers
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
        • G02B27/01 Head-up displays

Definitions

  • the present disclosure relates to a display control device for displaying a virtual image, a display control program, and a computer-readable persistent tangible recording medium.
  • Patent Document 1 discloses a head-up display device that uses map information for display control of a virtual image. This device displays the shape of the road ahead of the vehicle as a virtual image based on the current position of the vehicle and map information.
  • The map information includes high-precision map information and low-precision map information that is relatively less accurate than the high-precision map information.
  • However, Patent Document 1 does not consider making effective use of such map information.
  • the present disclosure aims to provide a display control device, a display control program, and a computer-readable persistent tangible recording medium that can effectively use map information.
  • A display control device that is used in a vehicle and controls the display of a virtual image superimposed on the foreground of an occupant includes: a vehicle position acquisition unit that acquires the position of the vehicle; a map information acquisition unit that acquires high-accuracy map information corresponding to the position, or low-accuracy map information that is less accurate than the high-accuracy map information; and a display generation unit that, when the high-accuracy map information can be acquired, generates the virtual image in a first display mode based on the high-accuracy map information, and when the high-accuracy map information cannot be acquired, generates the virtual image in a second display mode, different from the first display mode, based on the low-accuracy map information.
  • A display control program that is used in a vehicle and controls the display of a virtual image superimposed on the foreground of an occupant causes at least one processing unit to function as: a vehicle position acquisition unit that acquires the position of the vehicle; a map information acquisition unit that acquires high-accuracy map information corresponding to the position, or low-accuracy map information that is less accurate than the high-accuracy map information; and a display generation unit that, when the high-accuracy map information can be acquired, generates the virtual image in the first display mode based on the high-accuracy map information, and when the high-accuracy map information cannot be acquired, generates the virtual image in the second display mode, different from the first display mode, based on the low-accuracy map information.
  • A computer-readable persistent tangible recording medium includes computer-implemented instructions that are used in a vehicle to control the display of a virtual image superimposed on the foreground of an occupant. The instructions are to: acquire the position of the vehicle; acquire high-precision map information corresponding to the position, or low-precision map information that is less accurate than the high-precision map information; generate the virtual image in the first display mode based on the high-precision map information when the high-precision map information can be acquired; and generate the virtual image in a second display mode, different from the first display mode, based on the low-precision map information when the high-precision map information cannot be acquired.
  • A display control device that is used in a vehicle and controls the display of a virtual image superimposed on the foreground of an occupant includes at least one processing unit. The at least one processing unit acquires the position of the vehicle, acquires high-precision map information corresponding to the position or low-precision map information that is less accurate than the high-precision map information, generates the virtual image in the first display mode based on the high-precision map information when the high-precision map information can be acquired, and generates the virtual image in a second display mode, different from the first display mode, based on the low-precision map information when the high-precision map information cannot be acquired.
  • In each of the above aspects, the high-precision map information is used to generate the virtual image when it can be acquired, and the low-precision map information is used when it cannot. The virtual image can thus be displayed by selectively using the high-precision map information and the low-precision map information. Therefore, it is possible to provide a display control device, a display control program, and a computer-readable persistent tangible recording medium that can effectively use map information.
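  • The selection described above can be sketched as a short Python fragment. The function and structure names below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass


@dataclass
class MapInfo:
    precision: str  # "high" or "low"
    data: dict


def acquire_map_info(position, high_precision_db, nav_db):
    """Return high-precision map information when the position is covered
    by the high-precision map; otherwise fall back to the low-precision
    (navigation) map information."""
    if position in high_precision_db:
        return MapInfo("high", high_precision_db[position])
    return MapInfo("low", nav_db[position])


def generate_virtual_image(map_info):
    """Generate the virtual image in the first display mode with
    high-precision data, and in the second display mode otherwise."""
    if map_info.precision == "high":
        return "first display mode"
    return "second display mode"
```

  For example, a position covered only by the navigation map yields the second display mode.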
  • In the drawings: FIG. 1 is a schematic diagram of a vehicle system including an HCU according to a first embodiment; FIG. 2 shows an example of mounting the HUD in a vehicle; FIG. 3 is a block diagram showing a schematic structure of the HCU; FIGS. 4 and 5 show examples of superimposed display; FIG. 6 shows an example of non-superimposed display; FIG. 7 shows a display gap caused by the superimposed display of a modification; FIG. 8 is a conceptual diagram showing an example of display switching timing; FIG. 9 is a flowchart showing an example of processing performed by the HCU; FIG. 10 is a schematic diagram of a vehicle system including an HCU according to a second embodiment; and FIG. 11 is a block diagram showing the schematic structure of the HCU of the second embodiment.
  • the display control device of the first embodiment will be described with reference to FIGS. 1 to 9.
  • the display control device of the first embodiment is provided as an HCU (Human Machine Interface Control Unit) 20 used in the vehicle system 1.
  • The vehicle system 1 is used in a vehicle A, such as an automobile, that travels on a road.
  • the vehicle system 1 includes, for example, an HMI (Human Machine Interface) system 2, a locator 3, a periphery monitoring sensor 4, a driving support ECU 6, and a navigation device 7.
  • The HMI system 2, the locator 3, the peripheral monitoring sensor 4, the driving support ECU 6, and the navigation device 7 are connected to each other via, for example, an in-vehicle LAN.
  • the locator 3 includes a GNSS (Global Navigation Satellite System) receiver 30, an inertial sensor 31, a high precision map database (hereinafter, high precision map DB) 32, and a locator ECU 33.
  • the GNSS receiver 30 receives positioning signals from a plurality of artificial satellites.
  • the inertial sensor 31 includes, for example, a gyro sensor and an acceleration sensor.
  • the high-precision map DB 32 is a non-volatile memory and stores high-precision map data (high-precision map information).
  • the high precision map DB 32 is provided by a memory device of a locator ECU 33 described later.
  • the high-precision map data has information about roads, information about marking lines such as white lines and road markings, information about structures, and the like.
  • the information about roads includes, for example, position information for each point, curve curvature and slope, and shape information such as connection relationship with other roads.
  • the information about the lane markings and road markings includes, for example, type information of lane markings and road markings, position information, and three-dimensional shape information.
  • the information about the structure includes, for example, type information, position information, and shape information of each structure.
  • the structures are road signs, traffic lights, street lights, tunnels, overpasses, buildings facing roads, and the like.
  • the high-precision map data has the above-mentioned various position information and shape information as point cloud data and vector data of feature points represented by three-dimensional coordinates. That is, it can be said that the high-precision map data is a three-dimensional map that includes altitude in addition to latitude and longitude with respect to position information.
  • the high-precision map data has such positional information with a relatively small error (for example, on the order of centimeters).
  • The high-precision map data is "high-precision" both in that it has position information based on three-dimensional coordinates including height information, and in that the error in the position information is relatively small.
  • High-precision map data is created based on information collected by surveying vehicles that actually travel on the road. Therefore, the high-precision map data exists only for areas where such information has been collected; areas where no information has been collected are outside its coverage.
  • high-precision map data is currently prepared with a relatively wide coverage for highways and motorways, and with a relatively narrow coverage for general roads.
  • the locator ECU 33 is mainly composed of a microcomputer including a processor, a RAM, a memory device, an I/O, and a bus connecting these.
  • the locator ECU 33 is connected to the GNSS receiver 30, the inertial sensor 31, and the in-vehicle LAN.
  • The locator ECU 33 sequentially measures the position of the vehicle A by combining the positioning signals received by the GNSS receiver 30 with the measurement results of the inertial sensor 31.
  • The locator ECU 33 may also use the traveling distance obtained from detection results sequentially output by the vehicle speed sensor mounted on the own vehicle for positioning. In addition, the locator ECU 33 may identify the vehicle position using the high-precision map data described below and the detection results of a peripheral monitoring sensor 4, such as LIDAR, that detects point clouds of road shapes and structure feature points. The locator ECU 33 outputs the vehicle position information to the in-vehicle LAN.
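  • The combination of GNSS fixes and inertial measurements can be illustrated with a minimal dead-reckoning sketch. A real locator ECU would fuse these continuously (for example with a Kalman filter); the names below are hypothetical:

```python
def update_position(last_pos, gnss_fix, velocity, dt):
    """Prefer a fresh GNSS fix when one is available; otherwise advance
    the last known position using the velocity derived from the inertial
    sensor (or the vehicle speed sensor) over the interval dt."""
    if gnss_fix is not None:
        return gnss_fix
    x, y = last_pos
    vx, vy = velocity
    return (x + vx * dt, y + vy * dt)
```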
  • the locator ECU 33 has a map notification unit 301 as a functional block, as shown in FIG.
  • The map notification unit 301 determines, based on the measured vehicle position information and the high-accuracy map data in the high-accuracy map DB 32, whether information corresponding to the current position of the vehicle A is included in the high-accuracy map data.
  • the map notification unit 301 executes, for example, so-called map matching processing in which the traveling locus of the vehicle A is calculated from the vehicle position information and is superimposed on the road shape of the high precision map data.
  • the map notification unit 301 determines whether the current vehicle position is included in the high-precision map data based on the result of the map matching process.
  • The map notification unit 301 makes this determination using not only the two-dimensional position information (for example, longitude and latitude) of the vehicle A but also the height information contained in the own vehicle position information.
  • By the map matching process or the process using height information described above, the map notification unit 301 can determine which road the vehicle A is traveling on even when roads with different heights (for example, an elevated road and a ground-level road) are close to each other, which improves the determination accuracy.
  • Based on the determination result, the map notification unit 301 outputs to the HCU 20 notification information indicating whether or not information about the vehicle position is included in the high-precision map data.
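  • The height-aware coverage check performed by the map notification unit can be sketched as follows. The coverage representation and the height tolerance are assumptions for illustration only:

```python
def in_high_precision_coverage(position, coverage_areas, height_tol=5.0):
    """A 3D position (x, y, z) counts as covered only if some
    high-precision map area contains its 2D location AND matches its
    height within a tolerance, so that an elevated road can be told
    apart from the ground-level road beneath it."""
    x, y, z = position
    for (x0, x1), (y0, y1), area_z in coverage_areas:
        if x0 <= x <= x1 and y0 <= y <= y1 and abs(z - area_z) <= height_tol:
            return True
    return False
```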
  • the peripheral monitoring sensor 4 is an autonomous sensor that monitors the surrounding environment of the vehicle.
  • The perimeter monitoring sensor 4 detects objects around the vehicle: moving dynamic targets such as pedestrians, animals other than humans, and vehicles other than the own vehicle, and stationary static targets such as fallen objects on the road, guardrails, curbstones, road surface markings such as lane lines, and trees.
  • Examples of the peripheral monitoring sensor 4 include a periphery monitoring camera that images a predetermined range around the own vehicle, and search wave sensors, such as a millimeter wave radar, sonar, and LIDAR, that transmit search waves to a predetermined range around the own vehicle.
  • the perimeter monitoring camera sequentially outputs captured images that are sequentially captured as sensing information to the in-vehicle LAN.
  • The search wave sensor sequentially outputs, as sensing information to the in-vehicle LAN, scanning results based on the received signals obtained when reflected waves from objects are received.
  • the perimeter monitoring sensor 4 of the first embodiment includes at least a front camera 41 whose imaging range is a predetermined range in front of the vehicle.
  • the front camera 41 is provided, for example, on the rearview mirror of the own vehicle, the upper surface of the instrument panel, or the like.
  • The driving support ECU 6 executes automatic driving functions that perform driving operations on behalf of the occupant.
  • The driving support ECU 6 recognizes the traveling environment of the own vehicle based on the own vehicle position and map data acquired from the locator 3 and the sensing information from the periphery monitoring sensor 4.
  • An example of the automatic driving function executed by the driving support ECU 6 is an ACC (Adaptive Cruise Control) function that controls the traveling speed of the own vehicle by adjusting the driving force and braking force so as to maintain a target distance from the preceding vehicle. Another example is an AEB (Autonomous Emergency Braking) function that forcibly decelerates the own vehicle by generating a braking force based on forward sensing information.
  • the driving support ECU 6 may have other functions as a function of automatic driving.
  • the navigation device 7 includes a navigation map database (hereinafter, navigation map DB) 70 that stores navigation map data.
  • the navigation device 7 searches for a route that satisfies conditions such as time priority and distance priority to the set destination, and provides route guidance according to the searched route.
  • the navigation device 7 outputs the searched route to the in-vehicle LAN as scheduled route information.
  • the navigation map DB 70 is a non-volatile memory and stores navigation map data such as link data, node data, and road shapes.
  • Navigation map data is prepared in a relatively wider area than high-precision map data.
  • The link data includes a link ID identifying the link, a link length indicating the length of the link, a link azimuth, a link travel time, node coordinates of the start and end of the link, road attributes, and the like.
  • The node data includes a node ID uniquely numbered for each node on the map, node coordinates, a node name, a node type, connection link IDs listing the links connected to the node, an intersection type, and the like.
  • the navigation map data has node coordinates as two-dimensional position coordinate information. That is, it can be said that the navigation map data is a two-dimensional map including the latitude and longitude with respect to the position information.
  • The navigation map data is "low-precision" relative to the high-precision map data both in that its position information lacks height information and in that the error in the position information is relatively large.
  • the navigation map data is an example of low-precision map information.
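  • The structural difference between the two kinds of map data can be sketched with hypothetical record types (the field names are illustrative, not taken from the disclosure):

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class HighPrecisionFeature:
    """High-precision map entry: 3D feature points
    (latitude, longitude, altitude) with centimeter-order error."""
    kind: str  # e.g. "lane_marking", "road_sign", "traffic_light"
    points: List[Tuple[float, float, float]]


@dataclass
class NavNode:
    """Navigation map node: 2D coordinates only (latitude, longitude)."""
    node_id: int
    coord: Tuple[float, float]


@dataclass
class NavLink:
    """Navigation map link connecting two nodes."""
    link_id: int
    length_m: float
    start_node: int
    end_node: int
```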
  • the HMI system 2 includes an operation device 21, a display device 23, and an HCU 20, and receives an input operation from an occupant who is a user of the own vehicle and presents information to the occupant of the own vehicle.
  • the operation device 21 is a switch group operated by an occupant of the vehicle.
  • the operation device 21 is used to make various settings. For example, as the operation device 21, there is a steering switch or the like provided on the spoke portion of the steering of the vehicle.
  • the display device 23 includes, for example, a head-up display (hereinafter referred to as HUD) 230, a multi-information display (MID) 231 provided on the meter, and a center information display (CID) 232.
  • the HUD 230 is provided on the instrument panel 12 of the own vehicle.
  • the HUD 230 forms a display image based on the image data output from the HCU 20 by a liquid crystal type or scanning type projector 230a, for example.
  • the navigation device 7 displays navigation map data, route information for the destination, and the like.
  • the HUD 230 projects the display image formed by the projector 230a onto a projection area PA defined by the front windshield WS as a projection member through an optical system 230b such as a concave mirror.
  • the projection area PA is assumed to be located in front of the driver's seat.
  • the luminous flux of the display image reflected by the front windshield WS toward the vehicle interior is perceived by an occupant sitting in the driver's seat.
  • the light flux from the foreground as a landscape existing in front of the vehicle, which is transmitted through the front windshield WS formed of translucent glass, is also perceived by the occupant sitting in the driver's seat.
  • the occupant can visually recognize the virtual image Vi of the display image formed in front of the front windshield WS, overlapping a part of the foreground.
  • the HUD 230 superimposes and displays the virtual image Vi on the foreground of the vehicle A.
  • the HUD 230 superimposes the virtual image Vi on a specific superimposition target in the foreground to realize so-called AR (Augmented Reality) display.
  • the HUD 230 realizes a non-AR display in which the virtual image Vi is not superimposed on a specific superimposition target, but simply superimposed and displayed on the foreground.
  • the projection member on which the HUD 230 projects the display image is not limited to the front windshield WS and may be a translucent combiner.
  • the HCU 20 is mainly composed of a microcomputer having a processor 20a, a RAM 20b, a memory device 20c, an I/O 20d, and a bus connecting these, and is connected to the HUD 230 and the in-vehicle LAN.
  • the HCU 20 controls the display by the HUD 230 by executing the display control program stored in the memory device 20c.
  • the HCU 20 is an example of a display control device, and the processor 20a is an example of a processing unit.
  • the memory device 20c is a non-transitional tangible storage medium that non-temporarily stores computer-readable programs and data.
  • the non-transitional physical storage medium is realized by a semiconductor memory or a magnetic disk.
  • the HCU 20 generates an image of the content displayed as the virtual image Vi on the HUD 230 and outputs it to the HUD 230.
  • the HCU 20 generates a route guidance image that guides the scheduled traveling route of the vehicle A to the occupants, as shown in FIGS. 4 to 6.
  • the HCU 20 generates an AR guide image Gi1 superimposed on the road surface as shown in FIGS. 4 and 5.
  • The AR guidance image Gi1 is generated, for example, in a display mode in which image elements are arranged continuously on the road surface along the planned travel route (hereinafter, three-dimensional display mode).
  • FIG. 4 is an example in which the AR guidance image Gi1 is displayed in a superimposed manner on a road with a slope.
  • FIG. 5 is an example in which the AR guidance image Gi1 is displayed in a superimposed manner along the road shape in which the number of lanes is increasing at the destination.
  • the HCU 20 generates a non-AR guidance image Gi2 simply displayed in the foreground as a route guidance image as shown in FIG.
  • The non-AR guidance image Gi2 is displayed in a display mode that is fixed with respect to the front windshield WS (hereinafter, two-dimensional display mode), such as an image highlighting the lane to be traveled or an image of an intersection showing the traveling route. That is, the non-AR guidance image Gi2 is a virtual image Vi that is not superimposed on a specific superimposition target in the foreground but is simply superimposed on the foreground.
  • the three-dimensional display mode is an example of the first display mode
  • the two-dimensional display mode is an example of the second display mode.
  • The HCU 20 has, as functional blocks related to the generation of the AR guidance image Gi1 and the non-AR guidance image Gi2, a vehicle position acquisition unit 201, a map determination unit 202, a map information acquisition unit 203, a sensor information acquisition unit 204, a display mode determination unit 205, and a display generation unit 206.
  • the vehicle position acquisition unit 201 acquires vehicle position information from the locator 3.
  • The vehicle position acquisition unit 201 is an example of a position acquisition unit.
  • the map determination unit 202 determines, based on the notification information or the like acquired from the locator 3, which of high-precision map data and navigation map data is to be acquired as the map information used to generate the virtual image Vi.
  • the map determination unit 202 determines whether or not high precision map data can be acquired.
  • the map determination unit 202 determines that the high-precision map data can be acquired when the current vehicle position of the vehicle A is included in the high-precision map data.
  • the map determination unit 202 performs this determination process based on the notification information output from the locator ECU 33.
  • the vehicle position used in the determination processing here may include an area around the vehicle A on which the virtual image Vi can be superimposed.
  • The map determination unit 202 may itself determine whether or not the high-accuracy map data can be acquired, based on the own vehicle position information and the high-accuracy map data acquired from the locator 3, without relying on the notification information from the locator 3.
  • the map determination unit 202 may continuously perform the above-described determination processing during traveling, or may intermittently perform the determination processing every predetermined traveling section.
  • the map determination unit 202 also determines whether or not the high-precision map data includes information about the future traveling section GS of the vehicle A (section determination processing).
  • the future traveling section GS is, for example, the latest traveling section of the planned traveling route of the vehicle A for which the route guidance image needs to be displayed.
  • the display section in which the route guidance image needs to be displayed is, for example, a section including a point where a plurality of roads are connected, such as an intersection, or a section in which the lane needs to be changed.
  • the map determination unit 202 determines whether or not the entire range of the future traveling section GS as shown in FIG. 8 is included in the high precision map data.
  • FIG. 8 shows a situation in which the vehicle A is about to enter a general road from a highway through a rampway. In FIG. 8, it is assumed that the vehicle A turns left at the intersection CP where the rampway and the general road are connected.
  • The road in FIG. 8 is divided, with the two-dot chain line shown on the rampway as the boundary, into an area where both high-precision map data and navigation map data are provided and an area where only navigation map data is provided. Both types of map data are provided in the section from the start point ps (for example, a point 300 m before the intersection CP) to the boundary, while only navigation map data is provided in the section from the boundary to the end point pf (for example, the exit point of the intersection). In this case, the map determination unit 202 determines that the high-precision map data does not include information about the future traveling section GS of the vehicle A.
  • the map determination unit 202 executes this section determination processing based on, for example, the planned route information provided by the navigation device 7 and the high-precision map data provided by the locator 3.
  • the map determination unit 202 executes this section determination processing at the timing when the vehicle A reaches the starting point ps or when the vehicle A approaches the starting point ps.
  • the map determination unit 202 may be configured to acquire the determination result of the above-described section determination processing executed by the locator ECU 33.
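  • The section determination can be sketched as an all-points coverage test over the future traveling section GS. The sampling of the section and the coverage predicate are assumptions for illustration:

```python
def section_fully_covered(section_points, covered):
    """The future traveling section GS counts as covered only if every
    sampled point of the section lies inside the high-precision map
    coverage; a single uncovered point (e.g. a point beyond the boundary
    on the rampway in FIG. 8) makes the whole section count as not
    covered."""
    return all(covered(p) for p in section_points)
```

  For example, with high-precision coverage ending at a boundary 300 m along the route, a section extending past that boundary fails the test.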
  • The map determination unit 202 also determines whether a shape condition under which generation of the AR guidance image Gi1 is unnecessary for the road shape on which the vehicle A is traveling, that is, a shape condition for stopping generation of the AR guidance image Gi1, is satisfied (shape determination process).
  • The shape condition is satisfied, for example, when the road shape is evaluated to be one for which the planned traveling route can be accurately conveyed to the occupant by the non-AR guidance image Gi2. Conversely, if displaying the non-AR guidance image Gi2 instead of the AR guidance image Gi1 could cause the occupant to misidentify the planned traveling route, the shape condition is not satisfied.
  • the road shape includes the number of lanes the road has, the gradient and the curvature, the connection relationship with other roads, and the like.
  • for example, when the traveling road has only one lane, the lane in which the vehicle travels is uniquely determined, so the planned traveling route can be accurately conveyed by the non-AR guidance image Gi2, and the shape condition is satisfied.
  • similarly, when the intersection at which the right/left turn guidance is performed is uniquely determined for the vehicle A, the planned traveling route can be accurately conveyed by the non-AR guidance image Gi2, and the shape condition is satisfied.
  • in addition, when the road is a flat road having substantially no gradient, the traveling destination of the vehicle A is visible, so the planned traveling route can be accurately conveyed by the non-AR guidance image Gi2, and the shape condition is satisfied.
  • satisfaction of the shape condition may be determined by a combination of a plurality of the cases described above, for example, when the road is flat and has only one lane.
  • the map determination unit 202 determines whether or not the shape condition is satisfied, based on the high-precision map data provided from the locator 3, the detection information from the perimeter monitoring sensor 4, and the like. Alternatively, the map determination unit 202 may be configured to acquire the determination result of the above-described shape determination process executed by the locator ECU 33.
  • the map determination unit 202 determines to acquire the high-precision map data when the high-precision map data can be acquired at the current vehicle position. However, when the high-precision map data does not include information about the future traveling section GS, or when the shape condition is satisfied, the map determination unit 202 determines to acquire the navigation map data even if the high-precision map data can be acquired at the current vehicle position.
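The rule applied by the map determination unit 202 can be summarized as the following decision sketch (function and parameter names are illustrative assumptions, not identifiers from the patent):

```python
def select_map_data(hd_available_at_position, hd_covers_future_section,
                    shape_condition_satisfied):
    """Decide which map data to acquire: high-precision map data is
    chosen only when it is available at the current vehicle position,
    it covers the future traveling section GS, and the shape condition
    is not satisfied; otherwise navigation map data is used."""
    if not hd_available_at_position:
        return "navigation"
    if not hd_covers_future_section or shape_condition_satisfied:
        return "navigation"
    return "high_precision"
```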
  • the map information acquisition unit 203 acquires either high-accuracy map data or navigation map data based on the determination result of the map determination unit 202.
  • the map information acquisition unit 203 acquires high-accuracy map data when it is determined that high-accuracy map data can be acquired.
  • the map information acquisition unit 203 acquires the navigation map data instead of the high accuracy map data when it is determined that the high accuracy map data cannot be acquired.
  • when it is determined that the high-precision map data does not include information about the future traveling section GS, the map information acquisition unit 203 acquires the navigation map data even when it is determined that the high-precision map data can be acquired. Likewise, when it is determined that the shape condition is satisfied, the map information acquisition unit 203 acquires the navigation map data even when it is determined that the high-precision map data can be acquired.
  • the map information acquisition unit 203 sequentially outputs the acquired map information to the display mode determination unit 205.
  • the sensor information acquisition unit 204 acquires detection information regarding a detected object in front of the vehicle A.
  • the detection information includes the height information of the road surface on which the AR guide image Gi1 is to be superimposed, or the height information of the detected object from which the height information can be estimated.
  • the detected objects include road markings such as stop lines, center markings at intersections, and lane markings, road markings, curbs, and road installations such as traffic lights.
  • the detection information is information for supplementing the navigation map data or for correcting the superimposed position of the AR guidance image Gi1 when the AR guidance image Gi1 is generated using the navigation map data.
  • the detection information may include shape information of the traveling road, information about the number of lanes on the traveling road, information about the lane in which the vehicle A is currently traveling, and the like.
  • the sensor information acquisition unit 204 attempts to acquire the detection information, and when the detection information is acquired, sequentially outputs the detection information to the display mode determination unit 205.
  • the display mode determination unit 205 determines in which of the three-dimensional display mode and the two-dimensional display mode the display generation unit 206 generates the route guidance image, that is, which of the AR guidance image Gi1 and the non-AR guidance image Gi2 is generated as the route guidance image.
  • when the AR guidance image Gi1 is displayed based on the navigation map data, the AR guidance image Gi1 may appear to float above the road surface, as in the modification shown in FIG. 7, or may appear to be buried in the road surface.
  • such a deviation of the superimposition position occurs because the navigation map data has particularly low accuracy of height information compared with the high-precision map data, or has no height information at all, so the AR guidance image Gi1 cannot be generated so as to reflect the gradient shape of the road.
  • in order to suppress the generation of the AR guidance image Gi1 with a shifted superimposition position, the display mode determination unit 205 selects whether to generate the route guidance image as the AR guidance image Gi1 or as the non-AR guidance image Gi2, based on the availability of the high-precision map data.
  • the display mode determination unit 205 determines the display mode of the route guidance image to be the three-dimensional display mode when the high-precision map data is acquired by the map information acquisition unit 203.
  • on the other hand, when the navigation map data is acquired by the map information acquisition unit 203, the display mode determination unit 205 determines the display mode of the route guidance image to be the two-dimensional display mode.
  • however, when the detection information is acquired together with the navigation map data, the display mode of the route guidance image is determined to be the three-dimensional display mode.
  • the display mode determination unit 205 outputs the determined display mode to the display generation unit 206.
  • the display generation unit 206 generates a route guidance image in the display mode determined by the display mode determination unit 205 based on the acquired various information.
  • when the display mode is determined to be the three-dimensional display mode, the display generation unit 206 identifies the three-dimensional position coordinates of the road surface on which the AR guidance image Gi1 is to be superimposed, based on the three-dimensional position coordinate information of the high-precision map data.
  • the display generation unit 206 specifies the relative three-dimensional position (relative position) of the road surface with respect to the vehicle A based on the position coordinates of the road surface and the vehicle position coordinates.
  • the display generation unit 206 also calculates or acquires road surface gradient information based on the high-precision map data.
  • the display generation unit 206 calculates the gradient information by, for example, a geometric calculation using position coordinates of two points that define a slope. Alternatively, the display generation unit 206 may calculate the gradient information based on the three-dimensional shape information of the lane markings. Alternatively, the display generation unit 206 may estimate the gradient information based on the information that can estimate the gradient information among the information included in the high-precision map data.
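As one hedged illustration of the "geometric calculation using position coordinates of two points that define a slope", the gradient can be obtained from the height difference and the horizontal distance between the two points. The axes and function names below are assumptions for illustration only:

```python
import math

def gradient_from_two_points(p1, p2):
    """Slope angle in radians between two 3-D points (x, y, z), where
    z is height; positive means an uphill grade toward p2."""
    dx, dy, dz = (p2[0] - p1[0]), (p2[1] - p1[1]), (p2[2] - p1[2])
    horizontal = math.hypot(dx, dy)   # horizontal run between the points
    return math.atan2(dz, horizontal)
```

For two points 100 m apart horizontally with a 10 m rise, this yields roughly 0.1 rad (about 5.7 degrees).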
  • the display generation unit 206 calculates the projection position and projection shape of the AR guidance image Gi1 by geometric calculation based on the specified relative position, the occupant's viewpoint position acquired from the DSM 22, the position of the projection area PA, the road surface gradient at the relative position, and the like.
  • the display generation unit 206 generates the AR guide image Gi1 based on the calculation result, outputs the data to the HUD 230, and displays the AR guide image Gi1 as the virtual image Vi.
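The geometric calculation of the projection position can be illustrated with a simplified pinhole model: the road-surface point is projected onto a vertical image plane a fixed distance ahead of the occupant's viewpoint. This is only a sketch under assumed axes (x forward, y left, z up); the actual HUD 230 would additionally account for windshield curvature, mirror optics, and head motion.

```python
def project_to_image_plane(eye, world_point, plane_x):
    """Intersect the sight line from the eye to a road-surface point
    with a vertical plane plane_x meters ahead of the eye; returns the
    (lateral, vertical) coordinates on that plane."""
    ex, ey, ez = eye
    wx, wy, wz = world_point
    t = plane_x / (wx - ex)            # fraction of the sight line
    return (ey + t * (wy - ey), ez + t * (wz - ez))
```

For an eye 1.2 m above a point on the road 10 m ahead, the projected point lands slightly below eye height on the plane, which is how the image appears laid on the road surface.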
  • when the three-dimensional display mode is determined based on the fact that the sensor information acquisition unit 204 has acquired the detection information, the display generation unit 206 combines the two-dimensional position coordinates of the navigation map with the peripheral information to generate the AR guidance image Gi1. For example, the display generation unit 206 identifies the three-dimensional position coordinates of the road surface on which the AR guidance image Gi1 is to be superimposed from the height information acquired or estimated from the detection information and the two-dimensional position coordinates of the navigation map. The display generation unit 206 then calculates the projection position and projection shape of the AR guidance image Gi1 using the identified position coordinates, as in the case of using the high-precision map data.
  • when the detection information includes shape information of the traveling road and the like, the display generation unit 206 may use these pieces of information to correct the superimposed position of the AR guidance image Gi1.
  • the display generation unit 206 acquires information on the two-dimensional position coordinates of the navigation map and generates a route guidance image.
  • since only two-dimensional position coordinates are acquired, the display generation unit 206 sets the superimposed position of the route guidance image with respect to the foreground to a preset position.
  • the display generation unit 206 determines the projected shape of the route guidance image based on the two-dimensional position coordinates and generates the route guidance image.
  • the display generation unit 206 outputs the generated data to the HUD 230 and displays the route guidance image as a virtual image Vi for non-AR display.
  • the display generation unit 206 generates a mode presentation image Ii for presenting the display mode of the displayed route guidance image to the occupant.
  • the display generation unit 206 generates the mode presentation image Ii as, for example, a character image.
  • when the AR guidance image Gi1 is displayed, the display generation unit 206 generates the mode presentation image Ii indicating the three-dimensional display mode with the character image of “3D”.
  • when the non-AR guidance image Gi2 is displayed, the display generation unit 206 generates the character image of “2D” as the mode presentation image Ii indicating the two-dimensional display mode.
  • the display generation unit 206 may present the mode presentation image Ii using information other than characters, such as symbols and designs. Further, the display generation unit 206 may display the mode presentation image Ii on a display device other than the HUD 230, such as the CID 232 or the MID 231. In this case, the display generation unit 206 can reduce the amount of information in the projection area PA of the HUD 230 while still presenting the display mode to the occupant, thereby reducing annoyance to the occupant.
  • the “display generation unit 206” is an example of the “mode presentation unit”.
  • the HCU 20 starts the process of FIG. 9 when the destination is set in the navigation device 7 and the planned travel route is set.
  • step S10 it is determined whether or not the route guidance display is started. For example, in step S10, when the distance between the guidance point and the vehicle A is less than a threshold value (for example, 300 m), it is determined to start the route guidance display. When it is determined that the route guidance display is started, the process proceeds to step S20, and the vehicle position information is acquired from the locator 3.
  • step S30 notification information about the vehicle position and its surroundings is acquired from the locator 3, and the process proceeds to step S40.
  • step S40 it is determined based on the notification information or the like whether or not the high-precision map data can be acquired. If it is determined that it can be acquired, the process proceeds to step S42.
  • step S42 based on the information from the locator 3, it is determined whether or not there is high-precision map data in the future traveling section GS. If it is determined that there is high-precision map data in the future traveling section GS, the process proceeds to step S44, and it is determined whether or not the shape condition is satisfied. If it is determined that the shape condition is not satisfied, the process proceeds to step S50.
  • step S50 the map information acquisition unit 203 acquires high precision map data.
  • step S60 a route guidance image in a three-dimensional display mode is generated based on the acquired three-dimensional coordinates of the high-precision map data, and the process proceeds to step S120.
  • step S120 the generated route guidance image is output to the HUD 230, and the HUD 230 is caused to generate the route guidance image as the virtual image Vi.
  • if it is determined in step S40 that the high-precision map data cannot be acquired, the process proceeds to step S70.
  • step S70 it is determined whether the detection information can be acquired from the vehicle-mounted sensor. If it is determined that the detection information cannot be acquired, the process proceeds to step S80.
  • step S80 the navigation map data is acquired from the navigation device 7, and the process proceeds to step S90.
  • step S90 a route guidance image is generated in a two-dimensional display mode based on the navigation map data. After that, the process proceeds to step S120, and the generated route guidance image is output to the HUD 230.
  • if it is determined in step S42 that the future traveling section GS is not included in the high-precision map data, the process proceeds to step S80. Likewise, if it is determined in step S44 that the shape condition is satisfied, the process proceeds to step S80.
  • if it is determined in step S70 that the detection information can be acquired from the peripheral monitoring sensor 4, the process proceeds to step S100.
  • step S100 navigation map data and detection information are acquired.
  • step S110 a route guidance image in a three-dimensional display mode is generated based on the navigation map data and the detection information.
  • step S120 the generated image data is output to the HUD 230.
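The branching of FIG. 9 (steps S10 to S120) condenses to the following sketch, where the boolean inputs stand in for the determinations described above. The names are illustrative assumptions; the actual HCU 20 processing repeats these determinations as the vehicle travels.

```python
def choose_guidance_generation(distance_to_guidance_m, hd_map_ok,
                               hd_covers_gs, shape_condition_met,
                               detection_ok, threshold_m=300.0):
    """Return which route guidance image to generate, or None if the
    route guidance display has not started yet (S10)."""
    if distance_to_guidance_m >= threshold_m:
        return None                                   # S10: not started
    if hd_map_ok:                                     # S40
        if hd_covers_gs and not shape_condition_met:  # S42, S44
            return "3D from high-precision map"       # S50, S60
        return "2D from navigation map"               # S80, S90
    if detection_ok:                                  # S70
        return "3D from navigation map + detection"   # S100, S110
    return "2D from navigation map"                   # S80, S90
```

Note that when the high-precision map data is available but does not cover the section GS (or the shape condition holds), the flow goes directly to the two-dimensional mode without consulting the detection information, matching steps S42/S44 to S80.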
  • the HCU 20 includes a map information acquisition unit 203 that acquires map information relating to the superimposed position of the virtual image Vi in the foreground as high-precision map data or navigation map data, and a display generation unit 206 that generates the virtual image Vi based on the map information.
  • when the high-precision map data can be acquired, the high-precision map data is used to generate the virtual image Vi, and when the high-precision map data cannot be acquired, the low-precision map information is used to generate the virtual image Vi. As a result, the virtual image Vi can be displayed by selectively using the high-precision map data and the low-precision map information. In this way, it is possible to provide the HCU 20 and the display control program that can make effective use of the map information.
  • the display generation unit 206 superimposes the virtual image Vi on the road surface in the foreground, which is the specific superimposition target, in the three-dimensional display mode, and does not superimpose the virtual image Vi on the road surface in the two-dimensional display mode.
  • the HCU 20 can avoid superimposing the virtual image Vi on the road surface based on the navigation map data of relatively low accuracy. Therefore, the HCU 20 can suppress the occurrence of the displacement of the display position due to the superimposed display of the virtual image Vi based on the map information with low accuracy.
  • when the high-precision map data does not include information about the future traveling section GS, the display generation unit 206 generates the virtual image Vi in the two-dimensional display mode even if the high-precision map data can be acquired. According to this, even if the high-precision map data can be obtained at the current position, the virtual image Vi is not generated in the three-dimensional display mode if the high-precision map data does not exist at the guide point. Therefore, it is possible to avoid changing the display mode of the virtual image Vi from the three-dimensional display mode to the two-dimensional display mode near the guide point, and the HCU 20 can prevent the occupant from being annoyed by a change in the display mode of the virtual image Vi.
  • when the shape condition for stopping the generation of the virtual image Vi in the three-dimensional display mode is satisfied for the road shape on which the vehicle A is traveling, the display generation unit 206 generates the virtual image Vi in the two-dimensional display mode even when the high-precision map data can be acquired.
  • according to this, the HCU 20 can generate the virtual image Vi in the two-dimensional display mode when the traveling road has a road shape for which the virtual image Vi in the two-dimensional display mode can relatively easily convey information to the occupant.
  • the HCU 20 can suppress the complication of processing caused by using the high-precision map data while still conveying the information of the virtual image Vi to the occupant.
  • the HCU 20 includes a sensor information acquisition unit 204 that acquires detection information from the peripheral monitoring sensor 4.
  • when the high-precision map data cannot be acquired, the display generation unit 206 generates the virtual image Vi in the three-dimensional display mode based on the combination of the navigation map data and the detection information. According to this, even when the HCU 20 cannot acquire the high-precision map data, the HCU 20 can combine the detection information with the navigation map data to generate the virtual image Vi in the same display mode as when the high-precision map data is used.
  • the display generation unit 206 indicates to the occupant in which of the three-dimensional display mode and the two-dimensional display mode the virtual image Vi is generated. According to this, the HCU 20 can more directly present the display mode of the virtual image Vi to the occupant. Therefore, the HCU 20 can make it easier for the occupant to understand the information indicated by the virtual image Vi.
  • the map information acquisition unit 203 acquires map information including at least one of road gradient information, lane marking three-dimensional shape information, and road gradient estimation information as high-precision map data. According to this, the HCU 20 can obtain or estimate the road gradient information and generate the virtual image Vi in the three-dimensional display mode. Therefore, the HCU 20 can more reliably suppress the shift in the display position of the virtual image Vi in the three-dimensional display mode.
  • the HCU 20 acquires the high precision map data stored in the locator 3. Instead of this, the HCU 20 may acquire the probe map data as high precision map information.
  • the center 9 receives the probe information transmitted from the plurality of probe vehicles M at the communication unit 91 and stores it in the control unit 90.
  • the probe information is information acquired by the perimeter monitoring sensor 4, the locator 3, and the like in each probe vehicle M, and includes the traveling locus of the probe vehicle M, road shape information, and the like represented by three-dimensional position coordinates.
  • the control unit 90 is mainly composed of a microcomputer including a processor, a RAM, a memory device, an I/O, and a bus connecting these.
  • the control unit 90 includes a map generation unit 90a as a functional block.
  • the map generator 90a generates probe map data based on the acquired probe information. Since the probe information is data including three-dimensional position coordinates, the generated probe map data is three-dimensional map data including height information of each point.
  • the vehicle system 1 communicates with the center 9 via the wireless communication network at the communication unit 8 and acquires probe map data.
  • the communication unit 8 stores the acquired probe map data in the driving support ECU 6.
  • the driving support ECU 6 has a map notification unit 601 as a functional block. Similar to the map notification unit 301 of the locator 3 in the first embodiment, the map notification unit 601 determines, based on the measured vehicle position and the information acquired from the navigation device 7, whether or not information regarding the own vehicle position and its surrounding area is included in the probe map data. When the map notification unit 601 determines that the probe map data includes information about the vehicle position and the area around it, it outputs this to the HCU 20 as notification information.
  • the map information acquisition unit 203 of the HCU 20 acquires the probe map data from the driving support ECU 6 when the map determination unit 202 determines that the probe map data that is the high-accuracy map information can be acquired.
  • the display generation unit 206 generates the AR guide image Gi1 based on the probe map data.
  • the HCU 20 of the third embodiment superimposes the route guidance image on the road surface at a superimposition position based on the high-precision map data in the first display mode, and superimposes the route guidance image on the road surface at a superimposition position based on the navigation map data in the second display mode.
  • the route guidance image in the first display mode will be referred to as a first AR guidance image CT1
  • the route guidance image in the second display mode will be referred to as a second AR guidance image CT2.
  • when the high-precision map data can be acquired, the display mode determination unit 205 determines to display the first AR guidance image CT1, and when the high-precision map data cannot be acquired and the navigation map data can be acquired, it determines to display the second AR guidance image CT2.
  • when the freshness condition is satisfied, the display mode determination unit 205 determines to display the second AR guidance image CT2 even if the high-precision map data can be acquired.
  • the freshness condition is satisfied, for example, when the high-precision map data is older than the navigation map data.
  • the display mode determination unit 205 evaluates the magnitude of the superimposed position shift when displaying in the second display mode based on the acquired various information.
  • the display mode determination unit 205 evaluates the magnitude of the superimposed position shift, for example, based on the positioning accuracy of the vehicle position and the presence/absence of feature recognition information.
  • first, the display mode determination unit 205 determines whether or not the positioning accuracy of the vehicle position is at a predetermined level or higher. Specifically, the display mode determination unit 205 evaluates the vehicle position acquired from the locator 3 based on the detection information acquired from the surroundings monitoring sensor 4. For example, the display mode determination unit 205 detects the intersection CP from the image captured by the front camera 41 and analyzes the relative position of the vehicle A with respect to the intersection CP. Then, the display mode determination unit 205 determines, based on the magnitude of the deviation between the position of the vehicle A specified from the relative position and the map data and the own vehicle position acquired from the locator 3, whether or not the positioning accuracy is at the predetermined level or higher.
  • the display mode determination unit 205 may detect an object other than the intersection CP capable of specifying the position of the vehicle A from the captured image and perform the above-described processing.
  • the display mode determination unit 205 may acquire the analysis result of the captured image from another ECU such as the driving assistance ECU 6.
  • alternatively, the display mode determination unit 205 may determine whether the evaluation value of the positioning accuracy, based on the residual of the pseudorange, the number of positioning satellites captured by the locator 3, the S/N ratio of the positioning signal, or the like, is equal to or higher than a predetermined level.
  • the display mode determination unit 205 determines whether the feature recognition information is acquired from the surroundings monitoring sensor 4.
  • the feature recognition information is recognition information of the feature by the peripheral monitoring sensor 4, and is information that can be used to correct the overlapping position of the vehicle A in the front, rear, left, and right directions.
  • the features include, for example, road markings such as stop lines, intersection central markings, and lane markings. By correcting the own vehicle position on the map data based on the relative positions of these features with respect to the vehicle A, it is possible to correct the overlapping position of the second AR guide image CT2 in the front-rear and left-right directions.
  • road boundaries such as curbs and road installations such as signs may be included in the features that can be used to correct the vehicle position.
  • the display mode determination unit 205 evaluates the magnitude of the superimposition position shift of the displayed second AR guidance image CT2 based on the combination of the above various types of information, that is, the combination of the positioning accuracy of the vehicle position and the presence or absence of the feature recognition information. For example, the display mode determination unit 205 classifies the magnitude of the superimposition position shift into three levels of “small”, “medium”, and “large” according to the combination.
  • for example, when the positioning accuracy is at the predetermined level or higher and the feature recognition information is present, the display mode determination unit 205 determines that the magnitude of the deviation is small.
  • when the positioning accuracy is at the predetermined level or higher but there is no feature recognition information, the display mode determination unit 205 determines that the magnitude of the deviation is medium.
  • the display mode determination unit 205 also determines that the magnitude of the deviation is medium when the positioning accuracy is less than the predetermined level but the feature recognition information is present.
  • the display mode determination unit 205 determines that the magnitude of the deviation is large when the positioning accuracy is less than the predetermined level and there is no feature recognition information.
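The three-level classification above amounts to a small truth table over the two inputs. The sketch below assumes each input has already been reduced to a boolean by the determinations described earlier (names are illustrative):

```python
def superimposition_shift_level(positioning_ok, feature_recognized):
    """Classify the expected superimposition position shift of the
    second AR guidance image CT2 from the combination of positioning
    accuracy and feature recognition availability."""
    if positioning_ok and feature_recognized:
        return "small"
    if positioning_ok or feature_recognized:
        return "medium"
    return "large"
```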
  • the display mode determination unit 205 provides the display generation unit 206 with the determination result of the display mode and the magnitude of the deviation evaluated in the case of the second display mode together with the information necessary for generating the route guidance image.
  • the display generation unit 206 generates either the first AR guidance image CT1 or the second AR guidance image CT2 based on the information provided by the display mode determination unit 205.
  • Each AR guidance image CT1, CT2 shows the planned traveling route of the vehicle A at the guidance point by AR display.
  • Each of the AR guide images CT1 and CT2 is an AR virtual image in which the road surface is to be superimposed, as in the first embodiment.
  • each AR guidance image CT1, CT2 includes an approach route content CTa indicating an approach route to the intersection CP, and an exit route content CTe indicating an exit route from the intersection CP.
  • the approach route content CTa is, for example, a plurality of triangular objects arranged along the planned travel route.
  • the exit route content CTe is a plurality of arrow-shaped objects arranged along the planned travel route.
  • the display generation unit 206 determines the superposition position and the superimposition shape of the first AR guidance image CT1 using the high-precision map data. Specifically, the display generation unit 206 provides various position information such as the road surface position based on the high-precision map data, the own vehicle position by the locator 3, the occupant's viewpoint position by the DSM 22, and the positional relationship of the set projection area PA. To use. The display generation unit 206 calculates the superimposed position and the superimposed shape of the first AR guide image CT1 by geometrical calculation based on the various position information.
  • the display generation unit 206 reproduces the current traveling environment of the vehicle A in the virtual space based on the vehicle position information based on the high precision map data, the high precision map data, the detection information, and the like. Specifically, as shown in FIG. 12, the display generation unit 206 sets the own vehicle object AO at the reference position in the virtual three-dimensional space. The display generation unit 206 maps the road model having the shape indicated by the map data in the three-dimensional space in association with the own vehicle object AO based on the own vehicle position information.
  • the display generation unit 206 sets the virtual camera position VP and the superposition range SA in association with the own vehicle object AO.
  • the virtual camera position VP is a virtual position corresponding to the viewpoint position of the occupant.
  • the display generation unit 206 sequentially corrects the virtual camera position VP for the own vehicle object AO based on the latest viewpoint position coordinates acquired from the DSM 22.
  • the superposition range SA is a range in which the virtual image Vi can be superposed and displayed. Based on the virtual camera position VP and the outer edge position (coordinate) information of the projection area PA stored in advance in the storage unit 13 (see FIG. 1), the display generation unit 206 sets, as the superposition range SA, the forward area inside the projection plane as seen from the virtual camera position VP.
  • the superposition range SA corresponds to the projection area PA and the angle of view of the HUD 230.
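Whether a road-surface point falls inside the superposition range SA can be approximated with an angle-of-view test from the virtual camera position VP. The frustum below is a simplification (symmetric half-angles, assumed axes x forward, y left, z up), and the names are assumptions for illustration:

```python
import math

def in_superposition_range(point, camera_vp, half_fov_h, half_fov_v):
    """True if the point lies ahead of the virtual camera and within
    the horizontal/vertical half-angles (radians) of the HUD angle of
    view corresponding to the projection area PA."""
    dx = point[0] - camera_vp[0]
    if dx <= 0.0:
        return False                     # behind the virtual camera
    yaw = math.atan2(point[1] - camera_vp[1], dx)
    pitch = math.atan2(point[2] - camera_vp[2], dx)
    return abs(yaw) <= half_fov_h and abs(pitch) <= half_fov_v
```

A narrow vertical angle of view is why near road-surface points can fall outside the range, which motivates floating the exit route content CTe to the visible upper edge as described below.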
  • the display generation unit 206 arranges a virtual object VO imitating the first AR guide image CT1 in the virtual space.
  • the virtual object VO is arranged along the planned traveling route on the road surface of the road model in the three-dimensional space.
  • the virtual object VO is set in the virtual space when displaying the first AR guide image CT1 as a virtual image.
  • the virtual object VO defines the position and shape of the first AR guide image CT1. That is, the shape of the virtual object VO viewed from the virtual camera position VP becomes the virtual image shape of the first AR guide image CT1 visually recognized from the viewpoint position.
  • the display generation unit 206 arranges the virtual object VO in the central portion Lc of the own lane Lns in the lane width direction.
  • the central portion Lc is, for example, the midway point between the lane boundary lines on both sides, defined by the lane markings or road edges of the own lane Lns.
  • the superposition position of the approach route content CTa is set to the approximate central portion Lc of the own lane Lns (see FIG. 3).
  • the approach route content CTa may be displayed so as to extend from the central portion of the own lane Lns along the central portion of the approach lane.
  • the exit route content CTe is arranged so as to be lined up following the approach route content CTa along the planned travel route.
  • the exit route content CTe is superimposed on the intersection CP and at a position floating above the road surface at the center of the exit lane. Note that, as shown in FIG. 13, when the road surface to be superimposed on is not visible, the superimposed position of the exit route content CTe is determined so that the content floats above the upper end of the road surface within the angle of view and remains visible.
  • the display generation unit 206 starts displaying the above first AR guidance image CT1 when the remaining distance to the intersection CP is below a threshold value (for example, 300 m).
  • the display generation unit 206 sequentially updates the superimposed position and superimposed shape of the first AR guide image CT1 so that the first AR guide image CT1 is displayed as if relatively fixed to the road surface. That is, the display generation unit 206 displays the first AR guide image CT1 so that, to the occupant, it appears to follow the road surface, which moves relatively as the vehicle A travels.
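The sequential update can be illustrated by re-projecting the same road-fixed point every frame: as the vehicle closes in, the point's screen position shifts, which is what makes the content appear glued to the road rather than to the windshield. Eye height, focal length, and the function name below are illustrative assumptions.

```python
def road_point_on_screen(distance_ahead, eye_height=1.4, focal=1.0):
    """Vertical image-plane coordinate of a point on a flat road
    `distance_ahead` metres in front of the viewpoint (pinhole model)."""
    return -focal * eye_height / distance_ahead

# Re-computed each frame as the vehicle approaches the same road point:
trace = [road_point_on_screen(d) for d in (100.0, 50.0, 25.0)]
# The point drops lower in the projection area frame by frame, so the
# superimposed content appears relatively fixed to the road surface.
```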
  • the display generation unit 206 determines the superimposed position and the superimposed shape of the second AR guidance image CT2 using the navigation map data instead of the high accuracy map data.
  • the display generation unit 206 sets the road surface position on the assumption that the road surface to be superimposed on is a flat road surface without undulations.
  • the display generation unit 206 sets a horizontal road surface as the virtual road surface to be superimposed on, and calculates the superimposed position and superimposed shape of the second AR guide image CT2 by geometric calculation based on the virtual road surface position and other position information.
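The geometric calculation against a horizontal virtual road surface amounts to intersecting the line of sight with an assumed flat plane. A minimal sketch, in which the function name and parameter values are illustrative rather than from the embodiment:

```python
def flat_road_intersection(eye_height, screen_y, focal=1.0):
    """Distance ahead at which the line of sight through vertical
    image-plane coordinate `screen_y` meets an assumed horizontal
    road surface (screen_y < 0 looks downward)."""
    if screen_y >= 0:
        return None  # line of sight never meets the flat road surface
    return focal * eye_height / -screen_y

# Where on the assumed flat road does this line of sight land?
d = flat_road_intersection(eye_height=1.4, screen_y=-0.028)
```

If the actual road slopes upward, the true intersection is nearer than this flat-road result, which is the source of the positional deviation described below.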
  • the virtual road surface set by the display generation unit 206 in this way may be less accurate than a virtual road surface set based on the high-precision map data.
  • the virtual road surface at the intersection CP portion may be displaced from the actual road surface.
  • in the figure, the shape of the virtual road surface reflects the upward slope in order to clearly show the deviation of the virtual road surface at the intersection CP portion; in reality, however, the upward slope is not necessarily reflected on the virtual road surface.
  • the display generation unit 206 determines the horizontal position of the virtual object VO in the virtual space based on the magnitude of the shift. Specifically, when the magnitude of the shift is at a small level, the display generation unit 206 arranges the virtual object VO at the vehicle center position Vc, which is a position within the superposition range SA corresponding to the center of the vehicle A.
  • the vehicle center position Vc is the position, within the superposition range SA, of a virtual straight line assumed on the virtual road surface that passes through the center of the vehicle A in the vehicle width direction and extends in the front-rear direction of the vehicle A.
  • the entry route contents CTa are arranged obliquely with respect to the vertical direction of the projection area PA, as shown in FIG.
  • the display generation unit 206 arranges the second AR guide image CT2 in the central portion Ac in the left-right direction of the projection area PA.
  • the approach route contents CTa are displayed in a state of being arranged side by side in the vertical direction of the projection area PA, as shown in FIG.
  • the display generation unit 206 corrects the superimposed position based on the feature recognition information. For example, the display generation unit 206 corrects the vehicle position in the front-rear and left-right directions on the virtual road surface set based on the navigation map data using the feature recognition information, and then calculates the superimposed position and superimposed shape of the second AR guidance image CT2.
  • the display generation unit 206 corrects the superposition position based on the height correction information.
  • the height correction information is, for example, three-dimensional position information of the roadside device acquired by road-to-vehicle communication.
  • the display generation unit 206 may acquire the information via the V2X communication device mounted on the vehicle A.
  • alternatively, the height correction information may be height information of an object detected by the periphery monitoring sensor 4. That is, when three-dimensional position information of a road installation, a road marking, or the like can be specified by analyzing the detection information of the periphery monitoring sensor 4, the height information included in that three-dimensional position information may be included in the height correction information.
  • the display generation unit 206 changes the position and shape of the virtual road surface from the horizontal road surface based on the height correction information, so that the height direction of the second AR guide image CT2 virtually arranged on the virtual road surface, for example. Correct the superimposed position of.
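One way the virtual road surface could be adjusted from the horizontal assumption using height correction information (for example, the known height of a roadside device) is sketched below. The linear-tilt model and all names here are illustrative assumptions, not the embodiment's actual correction.

```python
def corrected_road_height(flat_height, reference_distance,
                          reference_height, query_distance):
    """Tilt the virtual road surface so it passes through a reference
    point of known height (e.g. a roadside device obtained via
    road-to-vehicle communication), instead of staying flat, and
    return the corrected road height at `query_distance`."""
    slope = (reference_height - flat_height) / reference_distance
    return flat_height + slope * query_distance

# Flat assumption says 0 m everywhere; a roadside unit 100 m ahead
# reports a road height of +2 m. Corrected height at 50 m:
h = corrected_road_height(0.0, 100.0, 2.0, 50.0)
```

Content virtually arranged on this tilted surface is then shifted in the height direction accordingly.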
  • the display generation unit 206 limits the superimposed display of the second AR guidance image CT2 to the near side of the planned traveling route relative to the first AR guidance image CT1. Specifically, the display generation unit 206 hides the portion of the exit route content CTe of the second AR guide image CT2 that would be superimposed on the side of the planned traveling route farther from the vehicle A than the first AR guide image CT1, and displays only the portion superimposed on the near side. In the example shown in FIGS. 15 and 16, three exit route contents CTe are displayed when the first AR guidance image CT1 is displayed, whereas when the second AR guidance image CT2 is displayed, the exit route content CTe is limited to only the one on the near side. That is, the second AR guide image CT2 is content that presents the exit direction from the intersection CP without presenting the course of the exit route, and is simpler than the first AR guide image CT1.
  • the display generation unit 206 starts displaying the second AR guidance image CT2 described above at a timing different from that of the first AR guidance image CT1. Specifically, the display generation unit 206 displays the non-AR guidance image Gi2 instead of the second AR guidance image CT2 when the remaining distance to the intersection CP falls below the first threshold. The display generation unit 206 then switches the display from the non-AR guidance image Gi2 to the second AR guidance image CT2 when the remaining distance falls below the second threshold (for example, 100 m), which is smaller than the first threshold. That is, the display generation unit 206 starts displaying the second AR guide image CT2 at a stage closer to the intersection CP than when displaying the first AR guidance image CT1.
  • the threshold value for displaying the non-AR guidance image Gi2 may not be the first threshold value as long as it is a value larger than the second threshold value.
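The display-start timing described above reduces to a pair of distance checks. The threshold values below are the examples given in the description (300 m and 100 m); the function name and return values are illustrative assumptions.

```python
FIRST_THRESHOLD_M = 300.0   # example value from the description
SECOND_THRESHOLD_M = 100.0  # example value from the description

def guidance_for(remaining_distance_m, high_precision_map_available):
    """Select the route guidance image for the remaining distance to
    the intersection CP (simplified sketch of the described timing)."""
    if high_precision_map_available:
        if remaining_distance_m < FIRST_THRESHOLD_M:
            return "first AR guidance image CT1"
        return None  # too far from the intersection CP
    # Navigation map data only:
    if remaining_distance_m < SECOND_THRESHOLD_M:
        return "second AR guidance image CT2"
    if remaining_distance_m < FIRST_THRESHOLD_M:
        return "non-AR guidance image Gi2"
    return None
```

For example, at 250 m with navigation map data only, the non-AR guidance image Gi2 is shown; the AR content appears only once the vehicle is within 100 m of the intersection CP.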
  • in step S44, the HCU 20 determines whether the shape condition is satisfied. If it is determined that the shape condition is not satisfied, the HCU 20 proceeds to step S46.
  • in step S46, the display mode determination unit 205 determines the freshness condition of the high-precision map data. If it is determined that the freshness condition is not satisfied, the process proceeds to step S50; if it is determined that the freshness condition is satisfied, the process proceeds to step S80.
  • when the high-precision map data is acquired in step S50, the process proceeds to step S65, and the display generation unit 206 generates the first AR guidance image CT1. On the other hand, when the navigation map data is acquired in step S80, the process proceeds to step S81.
  • in step S81, the display generation unit 206 determines whether the remaining distance to the intersection CP is less than the second threshold value. When it is determined that the remaining distance is not less than the second threshold value, the process proceeds to step S82, the non-AR guidance image Gi2 is generated, and the process then proceeds to step S120. On the other hand, when it is determined in step S81 that the remaining distance is less than the second threshold, the process proceeds to step S83.
  • in step S83, the display mode determination unit 205 or the like acquires the correction information for the superimposed position via the sensor information acquisition unit 204. If no correction information can be acquired, step S83 is skipped.
  • in step S84, the display mode determination unit 205 evaluates the magnitude of the positional deviation of the second AR guide image CT2, and the process proceeds to step S95.
  • in step S95, the display generation unit 206 generates the second AR guide image CT2 based on the acquired navigation map data, the correction information, the information regarding the magnitude of the positional deviation, and the like.
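The S44–S95 flow above can be condensed into a single selection function. This is only a sketch of the branching exactly as described (in particular, the "freshness condition satisfied" branch leading to the navigation map follows the text above); the function name and return strings are illustrative assumptions.

```python
def select_route_guidance(shape_condition, freshness_condition,
                          remaining_m, second_threshold_m=100.0):
    """Sketch of the S44-S95 flow: shape condition, then freshness
    condition, then high-precision vs navigation map branch."""
    if shape_condition:                        # S44: satisfied
        return "non-AR guidance image Gi2"
    if not freshness_condition:                # S46 -> S50: high-precision map
        return "first AR guidance image CT1"   # S65
    # S80: navigation map data acquired
    if remaining_m < second_threshold_m:       # S81: close to intersection CP
        return "second AR guidance image CT2"  # S83-S95
    return "non-AR guidance image Gi2"         # S82
```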
  • the HCU 20 can superimpose and display the virtual image Vi on the specific superimposition target while properly using the map data in the area where the high-precision map data can be used and the area where the high-precision map data cannot be used.
  • the display generation unit 206 starts displaying the second AR guidance image CT2 when the remaining distance to the intersection CP reaches the second threshold, which is shorter than the first threshold at which the first AR guidance image CT1 is displayed. Since the area around the intersection CP is often relatively flat terrain, starting the display of the second AR guidance image CT2 at a stage closer to the intersection CP than the display scene of the first AR guidance image CT1 suppresses the magnitude of the positional deviation of the second AR guide image CT2. Alternatively, the display generation unit 206 can shorten the traveling section in which the positional deviation of the second AR guide image CT2 becomes large.
  • the disclosure herein is not limited to the illustrated embodiments.
  • the disclosure encompasses the illustrated embodiments and modifications based on them.
  • the disclosure is not limited to the combination of parts and/or elements shown in the embodiments.
  • the disclosure can be implemented in various combinations.
  • the disclosure may have additional parts that may be added to the embodiments.
  • the disclosure includes omissions of parts and/or elements of the embodiments.
  • the disclosure includes replacements or combinations of parts and/or elements between one embodiment and another.
  • the disclosed technical scope is not limited to the description of the embodiments.
  • the display generation unit 206 generates the AR guidance image Gi1 as the route guidance image based on the high-precision map information and the non-AR guidance image Gi2 as the route guidance image based on the navigation map data.
  • the display generation unit 206 may be configured to generate a virtual image Vi other than the route guidance image in a different display mode depending on the acquired map information. For example, when the high-precision map information can be acquired, the display generation unit 206 may superimpose, on an object to be watched (for example, a preceding vehicle, a pedestrian, a road sign, etc.), an image prompting the occupant to gaze at that object; when the high-precision map information cannot be acquired, the superimposition on the object may be stopped.
  • the display generation unit 206 displays the mode presentation image Ii together with the route guidance image, but the mode presentation image Ii may be displayed before the route guidance image is displayed.
  • the HCU 20 is supposed to display the non-AR guidance image Gi2 based on the navigation map data when the shape condition is satisfied.
  • the HCU 20 may display the non-AR guidance image Gi2 based on the high-precision map data when the shape condition is satisfied and the high-precision map data can be acquired.
  • the display generation unit 206 selects either the vehicle center position Vc or the central portion Ac of the projection area PA as the superimposed position of the second AR guide image CT2, in accordance with the magnitude of the superimposed position shift of the second AR guide image CT2. Instead, the display generation unit 206 may be configured to superimpose on only one of them.
  • the display generation unit 206 switches from the non-AR guidance image Gi2 to the second AR guidance image CT2 based on the remaining distance to the intersection CP, but the switching condition is not limited to this.
  • the display generation unit 206 may be configured to switch at the time when the correction information regarding the superimposed position of the second AR guide image CT2 can be acquired.
  • the correction information is information that can be used to correct the superimposed position of the second AR guide image CT2, and is, for example, position information of the stop line of the intersection CP, the center marking of the intersection CP, or another road marking of the own vehicle lane Lns.
  • the correction information is acquired as an analysis result of the detection information of the peripheral monitoring sensor 4.
  • the display generation unit 206 generates the route guidance image in the second display mode when the high-precision map data does not include the information about the future traveling section GS. Then, the display generation unit 206 may be configured to generate the route guidance image in the first display mode as long as the high-precision map data corresponding to the current position of the vehicle can be acquired. In this case, the display generation unit 206 may switch from the first display mode to the second display mode when the high-precision map data corresponding to the current position of the vehicle cannot be acquired.
  • the display generation unit 206 of the third embodiment may display the route guidance image so that it moves continuously from the superimposing position of the first AR guide image CT1 to the superimposing position of the second AR guide image CT2. As a result, the display generation unit 206 can reduce the occupant's discomfort caused by an instantaneous switch of the superimposed position. It should be noted that the moving speed of the route guidance image at this time is preferably slow enough not to draw the occupant's attention to the movement itself of the route guidance image.
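The gradual movement between the two superimposing positions can be realized with a per-frame step limit. This is a 1-D sketch under assumed names and units, not the embodiment's implementation; the step size would be tuned small enough not to attract the occupant's attention.

```python
def step_toward(current, target, max_step):
    """Move the superimposed position toward `target` by at most
    `max_step` per frame, so the switch is gradual rather than
    instantaneous."""
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step

# Moving from the CT1 position (0.0) to the CT2 position (10.0)
# at 2.0 units per frame takes five frames, then holds the target.
path = []
pos = 0.0
for _ in range(6):
    pos = step_toward(pos, 10.0, 2.0)
    path.append(pos)
```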
  • the display generation unit 206 displays the entry route content CTa and the exit route content CTe of the first AR guidance image CT1 in different shapes.
  • the display generation unit 206 may set the contents CTa and CTe as contents having substantially the same shape, as shown in FIG. In the example shown in FIG. 18, each of the contents CTa and CTe is in the shape of a plurality of triangles arranged along the planned travel route.
  • the display generation unit 206 may change the exit route content CTe to an arrow-shaped image indicating the exit direction in the display of the second AR guide image CT2 (see FIG. 19).
  • the display generation unit 206 may display the route guidance image as a strip of content that extends continuously along the planned travel route.
  • in that case, the second AR guide image CT2 may be displayed in a mode in which its length is limited to the near side of the planned traveling route relative to the first AR guide image CT1.
  • the processor of the above-described embodiment is a processing unit including one or more CPUs (Central Processing Units).
  • a processor may be a processing unit including a GPU (Graphics Processing Unit) and a DFP (Data Flow Processor) in addition to the CPU.
  • the processor may be a processing unit including an FPGA (Field-Programmable Gate Array) and an IP core specialized in specific processing such as learning and inference of AI.
  • each arithmetic circuit unit of such a processor may be mounted individually on a printed circuit board, or may be implemented in an ASIC (Application Specific Integrated Circuit) or an FPGA.
  • non-transitory tangible storage media such as a flash memory and a hard disk can be adopted as the storage medium.
  • the form of such a storage medium may be appropriately changed.
  • the storage medium may be in the form of a memory card or the like, and may be configured to be inserted into a slot portion provided in the vehicle-mounted ECU and electrically connected to the control circuit.
  • the control unit and the method thereof described in the present disclosure may be realized by a dedicated computer comprising a processor programmed to execute one or more functions embodied by a computer program.
  • apparatus and method described in the present disclosure may be realized by a dedicated hardware logic circuit.
  • device and method described in the present disclosure may be implemented by one or more dedicated computers configured by a combination of a processor that executes a computer program and one or more hardware logic circuits.
  • the computer program may be stored in a computer-readable non-transition tangible recording medium as an instruction executed by a computer.
  • the control unit and the method thereof described in the present disclosure may be realized by a dedicated computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program.
  • control unit and the method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor with one or more dedicated hardware logic circuits.
  • the control unit and the method thereof described in the present disclosure may be realized by one or more dedicated computers configured by a combination of a processor and a memory programmed to execute one or more functions and a processor configured by one or more hardware logic circuits.
  • each section is denoted, for example, as S10. Further, each section can be divided into multiple subsections, while multiple sections can be combined into one section. Each section configured in this way can be referred to as a device, a module, or a means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Navigation (AREA)

Abstract

The present invention controls the display of a virtual image (Vi) superimposed on the scenery in front of an occupant of a vehicle (A). The invention acquires the position of the vehicle; acquires high-precision map information corresponding to the position, or low-precision map information corresponding to the position and having lower precision than the high-precision map information; generates the virtual image in a first display mode based on the high-precision map information if the high-precision map information can be acquired; and generates the virtual image in a second display mode, different from the first display mode and based on the low-precision map information, if the high-precision map information cannot be acquired.
PCT/JP2019/046318 2018-12-14 2019-11-27 Display control device, display control program, and computer-readable non-transitory tangible recording medium Ceased WO2020121810A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112019006171.2T DE112019006171T5 (de) 2018-12-14 2019-11-27 Display control device, display control program, and non-transitory tangible computer-readable storage medium
US17/222,259 US20210223058A1 (en) 2018-12-14 2021-04-05 Display control device and non-transitory computer-readable storage medium for the same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-234566 2018-12-14
JP2018234566 2018-12-14
JP2019196468A JP7052786B2 (ja) 2018-12-14 2019-10-29 Display control device and display control program
JP2019-196468 2019-10-29

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/222,259 Continuation US20210223058A1 (en) 2018-12-14 2021-04-05 Display control device and non-transitory computer-readable storage medium for the same

Publications (1)

Publication Number Publication Date
WO2020121810A1 true WO2020121810A1 (fr) 2020-06-18

Family

ID=71076404

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/046318 Ceased WO2020121810A1 (fr) 2018-12-14 2019-11-27 Dispositif de commande d'affichage, programme de commande d'affichage et support d'enregistrement tangible non transitoire lisible par ordinateur

Country Status (2)

Country Link
JP (1) JP7416114B2 (fr)
WO (1) WO2020121810A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114993326A (zh) * 2022-05-25 2022-09-02 阿波罗智联(北京)科技有限公司 Navigation processing method and apparatus, and electronic device
US20220308240A1 (en) * 2021-03-25 2022-09-29 Casio Computer Co., Ltd. Information processing device, information processing system, information processing method and storage medium
US20230021643A1 (en) * 2019-12-10 2023-01-26 Audi Ag Method for providing a three-dimensional map in a motor vehicle
CN115917616A (zh) * 2020-06-23 2023-04-04 株式会社电装 Obstacle information management device, obstacle information management method, and vehicle device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011135660A1 (fr) * 2010-04-26 2011-11-03 パイオニア株式会社 Système de navigation, procédé de navigation, programme de navigation, et support de stockage
JP2017167053A (ja) * 2016-03-17 2017-09-21 株式会社デンソー 車両位置決定装置
JP2018133031A (ja) * 2017-02-17 2018-08-23 オムロン株式会社 運転切替支援装置、及び運転切替支援方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5382356B2 (ja) * 2010-03-25 2014-01-08 株式会社エクォス・リサーチ Driving assist system
JP7009747B2 (ja) * 2017-02-20 2022-01-26 株式会社Jvcケンウッド Terminal device, control method, and program


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230021643A1 (en) * 2019-12-10 2023-01-26 Audi Ag Method for providing a three-dimensional map in a motor vehicle
CN115917616A (zh) * 2020-06-23 2023-04-04 株式会社电装 Obstacle information management device, obstacle information management method, and vehicle device
US20220308240A1 (en) * 2021-03-25 2022-09-29 Casio Computer Co., Ltd. Information processing device, information processing system, information processing method and storage medium
US12140685B2 (en) * 2021-03-25 2024-11-12 Casio Computer Co., Ltd. Information processing device, information processing system, information processing method and storage medium
CN114993326A (zh) * 2022-05-25 2022-09-02 阿波罗智联(北京)科技有限公司 Navigation processing method and apparatus, and electronic device

Also Published As

Publication number Publication date
JP2022079590A (ja) 2022-05-26
JP7416114B2 (ja) 2024-01-17

Similar Documents

Publication Publication Date Title
JP7052786B2 (ja) Display control device and display control program
US11996018B2 Display control device and display control program product
US10293748B2 Information presentation system
US20230191911A1 Vehicle display apparatus
US11535155B2 Superimposed-image display device and computer program
JP6566132B2 (ja) Object detection method and object detection device
JP6775188B2 (ja) Head-up display device and display control method
JP7251582B2 (ja) Display control device and display control program
JP7416114B2 (ja) Display control device and display control program
JP7420165B2 (ja) Display control device and display control program
WO2020208989A1 (fr) Display control device and program
JP7001169B2 (ja) Driving plan display method and driving plan display device
WO2020246113A1 (fr) Display control device and display control program
JP2020199839A (ja) Display control device
JP7294091B2 (ja) Display control device and display control program
JP2020138609A (ja) Vehicle display control device, vehicle display control method, and vehicle display control program
WO2020246114A1 (fr) Display control device and display control program
JP2021028587A (ja) In-vehicle display control device
JP7487713B2 (ja) Vehicle display control device, vehicle display device, vehicle display control method, and program
JP2020091148A (ja) Display control device and display control program
JP2020118545A (ja) Display control device and display control program
JP7151653B2 (ja) In-vehicle display control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19894773

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19894773

Country of ref document: EP

Kind code of ref document: A1