
WO2021033312A1 - Information output device, automated driving device, and information output method - Google Patents

Information output device, automated driving device, and information output method

Info

Publication number
WO2021033312A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane
information
shape
map
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/032816
Other languages
English (en)
Japanese (ja)
Inventor
雄治 五十嵐
優子 大田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2019/032816 priority Critical patent/WO2021033312A1/fr
Publication of WO2021033312A1 publication Critical patent/WO2021033312A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to an information output device, an automated driving device, and an information output method.
  • Patent Document 1 proposes a travel control device that realizes automated driving by using lane marking positions detected by a lane marking position sensor device. Based on the shape of the lane in the map data and the current position of the vehicle, this travel control device can continue automated driving even on roads where lane markings cannot be seen, for example because the road is covered with snow.
  • The present invention has been made in view of the above problem, and an object of the present invention is to provide a technique capable of appropriately using map-related information.
  • The information output device includes: a first acquisition unit that acquires first lane shape information, which is generated based on a result of detecting the surroundings of a vehicle and which indicates the shape of a first lane around the vehicle; a second acquisition unit that acquires the position of the vehicle; a first generation unit that, based on the position of the vehicle acquired by the second acquisition unit and map data, generates map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data; a second generation unit that generates map accuracy information indicating the accuracy of map-related information, which includes at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit; and an output unit that outputs the map-related information and the map accuracy information.
  • According to the present invention, map accuracy information indicating the accuracy of the map-related information is generated based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit, and the map-related information and the map accuracy information are output. As a result, the map-related information can be used appropriately.
  • FIG. 1 is a block diagram showing the configuration of the information output device according to Embodiment 1.
  • FIG. 2 is a block diagram showing the configuration of the information output device according to Embodiment 2.
  • FIG. 3 is a flowchart showing the operation of the information output device according to Embodiment 2.
  • FIG. 4 is a diagram for explaining the operation of the information output device according to Embodiment 2.
  • FIG. 5 is a diagram for explaining the operation of the information output device according to Embodiment 2.
  • FIG. 6 is a diagram for explaining the operation of the information output device according to Modification 1.
  • A further figure is a block diagram showing the hardware configuration of the information output device according to another modification.
  • A further figure is a block diagram showing the configuration of the communication terminal according to another modification.
  • FIG. 1 is a block diagram showing a configuration of an information output device 30 according to a first embodiment of the present invention.
  • Hereinafter, the vehicle on which the information output device 30 is mounted, and which is the vehicle of interest, is referred to as the "own vehicle".
  • The information output device 30 of FIG. 1 includes a first acquisition unit 31, a second acquisition unit 32, a first generation unit 34, a second generation unit 35, and an output unit 36.
  • The first acquisition unit 31 acquires detected lane shape information, which is the first lane shape information.
  • The detected lane shape information is generated based on a result of detecting the surroundings of the own vehicle, and indicates the shape of a first lane (hereinafter referred to as the "detected lane") around the own vehicle.
  • A lane such as the first lane may be one of various lines on the road, such as a lane marking, or may be the portion of the road between adjacent lane markings.
  • The periphery of the own vehicle may be, for example, the range from the own vehicle to a position several tens to several hundreds of meters ahead in the traveling direction of the own vehicle, or may be some other range.
  • For the first acquisition unit 31, for example, an interface that receives the detected lane shape information from an external device is used.
  • Alternatively, the first acquisition unit 31 may include, for example, a detection device that detects the surroundings of the own vehicle and an analysis device that analyzes the detection result of the detection device. For the detection device, for example, an image sensor such as a camera or an optical range finder such as LiDAR (Light Detection and Ranging) is used.
  • The second acquisition unit 32 acquires the position of the own vehicle.
  • The position of the own vehicle includes, for example, latitude, longitude, altitude, head direction, and the like.
  • The head direction is the angle between a specific direction in the world geodetic system (for example, due north) and the direction in which the vehicle head points, expressed as an angle from 0 to 360 degrees measured clockwise from the specific direction.
  • For the second acquisition unit 32, for example, a device that detects the position of the own vehicle using satellite positioning results obtained from a GNSS (Global Navigation Satellite System) or the like and inertial navigation (dead reckoning) based on the detection results of a camera, a sensor, or the like, or an interface to such a device, is used.
  • The first generation unit 34 generates the map vehicle position information and map lane shape information, which is the second lane shape information, based on the position of the own vehicle acquired by the second acquisition unit 32 and the map data.
  • The map vehicle position information is information indicating the position of the own vehicle on the map data.
  • The map lane shape information is information indicating the shape of a second lane (hereinafter referred to as the "map lane") around the own vehicle among the lanes on the map data.
  • The map data used in the first generation unit 34 is data indicating the shapes and connections of lanes.
  • This map data may be data stored in a memory of the information output device 30, or, when the information output device 30 includes a communication device, it may be data that the communication device receives from a device external to the information output device 30.
  • The second generation unit 35 calculates the difference between the shape of the detected lane indicated by the detected lane shape information acquired by the first acquisition unit 31 and the shape of the map lane indicated by the map lane shape information generated by the first generation unit 34. Based on the calculated difference, the second generation unit 35 then generates map accuracy information indicating the accuracy of map-related information that includes at least one of the map vehicle position information and the map lane shape information.
  • The output unit 36 outputs the map-related information and the map accuracy information.
  • According to the information output device 30 configured as described above, the map-related information and the map accuracy information indicating the accuracy of the map-related information are output.
  • The output destination of the information output device 30 can therefore, for example, use at least a part of the map-related information, or refrain from using it, depending on the accuracy indicated by the map accuracy information. That is, since the output destination of the information output device 30 can use the map-related information appropriately, robustness can be enhanced.
  • In the following, the output destination of the information output device 30 is described as a travel control device that controls the travel of the own vehicle, but the present invention is not limited to this.
  • For example, the output destination of the information output device 30 may be a communication device or the like that transmits the map-related information to a server or the like when the accuracy indicated by the map accuracy information is determined to be low.
  • The server referred to here includes, for example, a server that manages map data that needs to be updated from time to time because of road construction or the like.
  • Alternatively, the output destination of the information output device 30 may be a display device or the like that indicates that automated driving or the like is not appropriate when the accuracy indicated by the map accuracy information is determined to be low.
  • FIG. 2 is a block diagram showing the configuration of an automated driving device including the information output device 30 according to Embodiment 2 of the present invention.
  • Among the components according to Embodiment 2, components that are the same as or similar to those described above are designated by the same or similar reference numerals, and the description below focuses on the differences.
  • The automated driving device of FIG. 2 includes a vehicle-mounted sensor device group 10, a lane shape measuring device 20, the information output device 30, and a travel control device 40.
  • The information output device 30 of FIG. 2 is the same device as the information output device 30 described in Embodiment 1, and outputs the map-related information and the map accuracy information.
  • The information output device 30 of FIG. 2 is connected to the vehicle-mounted sensor device group 10, the lane shape measuring device 20, and the travel control device 40 by wire or wirelessly.
  • The vehicle-mounted sensor device group 10 outputs vehicle-mounted sensor information, including at least one of vehicle speed, yaw rate, satellite positioning position (latitude, longitude, altitude), and time information, to the information output device 30 and the travel control device 40.
  • In the following, it is assumed that the vehicle-mounted sensor device group 10 outputs all of the vehicle speed, yaw rate, satellite positioning position (latitude, longitude, altitude), and time information to the information output device 30 and the travel control device 40 according to specifications predetermined for each information item.
  • The predetermined specifications specify, for example, that the vehicle speed and the yaw rate are output in a 20-millisecond cycle and that the satellite positioning position and the time information are output in a 100-millisecond cycle.
  • The lane shape measuring device 20 generates the detected lane shape information indicating the shape of the detected lane described in Embodiment 1, based on the result of detecting the surroundings of the own vehicle with a camera mounted on the own vehicle, a LiDAR, or the like. The lane shape measuring device 20 then outputs the detected lane shape information to the information output device 30 and the travel control device 40 in a fixed cycle (for example, a 100-millisecond cycle).
  • The travel control device 40 controls the travel of the own vehicle based on the vehicle-mounted sensor information output from the vehicle-mounted sensor device group 10, the detected lane shape information output from the lane shape measuring device 20, and the map-related information and map accuracy information output from the information output device 30.
  • The information output device 30 of FIG. 2 includes a lane shape acquisition unit 31a, a position calculation unit 32a, a map data storage unit 33, a map vehicle position generation unit 34a, a map lane shape generation unit 34b, a map accuracy generation unit 35a, and an information output unit 36a.
  • The lane shape acquisition unit 31a is included in the concept of the first acquisition unit 31 of FIG. 1, the position calculation unit 32a is included in the concept of the second acquisition unit 32 of FIG. 1, the map vehicle position generation unit 34a and the map lane shape generation unit 34b are included in the concept of the first generation unit 34 of FIG. 1, the map accuracy generation unit 35a is included in the concept of the second generation unit 35 of FIG. 1, and the information output unit 36a is included in the concept of the output unit 36 of FIG. 1.
  • The lane shape acquisition unit 31a acquires the detected lane shape information output from the lane shape measuring device 20.
  • The position calculation unit 32a calculates the position (latitude, longitude, altitude, head direction) of the own vehicle at the current time in a fixed cycle (for example, a 100-millisecond cycle) based on the vehicle-mounted sensor information output from the vehicle-mounted sensor device group 10. For this calculation, for example, satellite positioning results or inertial navigation are used.
  • The position calculation unit 32a may also calculate the position of the own vehicle based on the detected lane shape information acquired by the lane shape acquisition unit 31a.
  • The in-lane position of the own vehicle, in particular its position in the left-right direction, which is obtained by a camera or a sensor and included in the detected lane shape information, is more accurate than the satellite positioning position. Therefore, if the detected lane shape information is used to calculate the position of the own vehicle, the accuracy of the own vehicle position can be expected to improve, for example through a dead-reckoning update with a lateral correction as sketched below.
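  • The sketch below illustrates one way such a position update could be implemented, assuming a simple dead-reckoning model driven by vehicle speed and yaw rate with an optional lateral correction from the detected lane shape; the class, function, and parameter names (Pose, dead_reckon, correct_lateral, gain) are illustrative assumptions, not names taken from this document.

    import math
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float        # east [m] in a local planar frame
        y: float        # north [m]
        heading: float  # [rad], 0 = due north, clockwise positive

    def dead_reckon(pose: Pose, speed: float, yaw_rate: float, dt: float) -> Pose:
        """Propagate the pose over one sensor cycle (e.g. dt = 0.02 s) from speed and yaw rate."""
        heading = (pose.heading + yaw_rate * dt) % (2.0 * math.pi)
        return Pose(
            x=pose.x + speed * dt * math.sin(heading),
            y=pose.y + speed * dt * math.cos(heading),
            heading=heading,
        )

    def correct_lateral(pose: Pose, measured_offset: float, predicted_offset: float,
                        gain: float = 0.5) -> Pose:
        """Blend in the in-lane lateral offset measured by the camera/LiDAR, which is
        usually more accurate than the satellite position (gain is an illustrative choice)."""
        err = measured_offset - predicted_offset
        # shift the pose sideways (to the right of the heading for positive err)
        return Pose(
            x=pose.x + gain * err * math.cos(pose.heading),
            y=pose.y - gain * err * math.sin(pose.heading),
            heading=pose.heading,
        )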
  • The map data storage unit 33 stores map data indicating the shapes and connections of lanes.
  • The map vehicle position generation unit 34a generates the map vehicle position information indicating the position of the own vehicle on the map data, for example by performing map matching based on the position of the own vehicle calculated by the position calculation unit 32a and the map data stored in the map data storage unit 33.
  • From among the lanes in the map data stored in the map data storage unit 33, the map lane shape generation unit 34b retrieves, as the shape of the map lane, the shape of the lanes around the position indicated by the map vehicle position information generated by the map vehicle position generation unit 34a. The map lane shape generation unit 34b then generates the map lane shape information indicating the shape of the retrieved map lane (one possible retrieval is sketched below).
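  • A minimal sketch of such a retrieval, under the assumption that the map data stores each lane marking as a polyline of planar map coordinates; the data layout, the LaneMarking class, and the search radius are assumptions made for illustration and are not details given in this document.

    from dataclasses import dataclass

    @dataclass
    class LaneMarking:
        lane_id: str
        points: list[tuple[float, float]]   # (x, y) polyline in map coordinates [m]

    def lanes_around(map_lanes: list[LaneMarking],
                     vehicle_xy: tuple[float, float],
                     radius: float = 150.0) -> list[LaneMarking]:
        """Return the lane markings having at least one polyline point within
        `radius` metres of the map-matched vehicle position."""
        vx, vy = vehicle_xy
        return [lane for lane in map_lanes
                if any((px - vx) ** 2 + (py - vy) ** 2 <= radius ** 2
                       for px, py in lane.points)]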
  • The map accuracy generation unit 35a generates the map accuracy information indicating the accuracy of the map-related information, which includes at least one of the map vehicle position information and the map lane shape information, based on the difference between the shape of the detected lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b.
  • The map accuracy generation unit 35a generates the map accuracy information, for example, at fixed time intervals, every fixed travel distance, or every time the detected lane shape information is output from the lane shape measuring device 20.
  • The information output unit 36a outputs the map-related information and the map accuracy information to the travel control device 40.
  • The map-related information may be generated by the map accuracy generation unit 35a or by the information output unit 36a.
  • FIG. 3 is a flowchart showing the operation of the information output device 30 according to the second embodiment.
  • In step S1, the lane shape acquisition unit 31a acquires the detected lane shape information from the lane shape measuring device 20.
  • In step S2, the position calculation unit 32a calculates the position of the own vehicle at the current time based on the vehicle-mounted sensor information output from the vehicle-mounted sensor device group 10.
  • In step S3, the map vehicle position generation unit 34a generates the map vehicle position information based on the position of the own vehicle calculated by the position calculation unit 32a and the map data.
  • In step S4, the map lane shape generation unit 34b generates the map lane shape information based on the map vehicle position information generated by the map vehicle position generation unit 34a and the map data.
  • In step S5, the map accuracy generation unit 35a generates the map accuracy information based on the difference between the shape of the detected lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b. The generation of the map accuracy information by the map accuracy generation unit 35a is described in detail later.
  • In step S6, the information output unit 36a outputs the map-related information and the map accuracy information to the travel control device 40.
  • The operation of the travel control device 40 is described in detail later. The operation of FIG. 3 then ends; after step S6, the process may return to step S1 as appropriate. The processing cycle of steps S1 to S6 could be organized as in the sketch below.
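  • A minimal sketch of how the cycle of steps S1 to S6 could be organized in software; the objects and method names passed in are hypothetical stand-ins that merely mirror the flowchart and are not an API defined in this document.

    def run_cycle(lane_shape_acq, position_calc, map_vehicle_pos_gen,
                  map_lane_shape_gen, map_accuracy_gen, info_output, map_data):
        # S1: acquire the detected lane shape from the lane shape measuring device
        detected_lane = lane_shape_acq.acquire()
        # S2: calculate the current own-vehicle position from vehicle-mounted sensor information
        vehicle_pos = position_calc.calculate()
        # S3: generate the map vehicle position information (e.g. by map matching)
        map_vehicle_pos = map_vehicle_pos_gen.generate(vehicle_pos, map_data)
        # S4: generate the map lane shape information around the matched position
        map_lane = map_lane_shape_gen.generate(map_vehicle_pos, map_data)
        # S5: generate map accuracy information from the shape difference
        accuracy = map_accuracy_gen.generate(detected_lane, map_lane)
        # S6: output the map-related information and the map accuracy information
        info_output.output({"map_vehicle_position": map_vehicle_pos,
                            "map_lane_shape": map_lane}, accuracy)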
  • FIG. 4 is a conceptual diagram for explaining the process in which, in step S5, the map accuracy generation unit 35a calculates the difference between the shape of the detected lane indicated by the acquired detected lane shape information and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b.
  • The detected lane shape information acquired by the lane shape acquisition unit 31a from the lane shape measuring device 20 indicates the shape of the detected lane as a plurality of points in a coordinate system.
  • For this coordinate system, an XY coordinate system is used in which the lane shape measuring device 20 is the origin, the traveling direction of the own vehicle is the Y axis (positive in the traveling direction), and the lateral direction with respect to the traveling direction is the X axis (positive to the right of the traveling direction).
  • Accordingly, the position of each point of the detected lane shape is a relative position with the lane shape measuring device 20 as the origin.
  • The map accuracy generation unit 35a converts the coordinate system of the map lane shape information generated by the map lane shape generation unit 34b into the coordinate system of the detected lane shape information generated by the lane shape measuring device 20.
  • Here, the coordinate system of the map lane shape information is the coordinate system of the world geodetic system (values expressed by latitude, longitude, altitude, and head orientation), whereas the coordinate system of the detected lane shape information is the above-mentioned XY coordinate system with the own vehicle as the origin.
  • In this case, the coordinate system of the world geodetic system may be converted into the XY coordinate system as follows.
  • Specifically, a translation of coordinates such that the coordinates (latitude, longitude, altitude) of the own vehicle position used when generating the map lane shape information become the origin of the XY coordinate system may be performed in combination with a rotation of coordinates such that the vehicle head orientation indicated by the map lane shape information points in the Y-axis direction of the XY coordinate system.
  • By this conversion, the position of each point of the map lane shape becomes a relative position with the lane shape measuring device 20 as the origin, just like the position of each point of the detected lane shape.
  • In the above description, the absolute coordinate system of the map lane shape information is converted into the relative coordinate system of the detected lane shape information; conversely, the relative coordinate system of the detected lane shape information may be converted into the absolute coordinate system of the map lane shape information.
  • However, in view of the fact that relative coordinates with the own vehicle as the origin are generally used, it is preferable that the absolute coordinate system of the map lane shape information be converted into the relative coordinate system of the detected lane shape information. A sketch of this conversion follows.
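  • The sketch below illustrates this translation-plus-rotation under the simplifying assumption that latitude and longitude have already been projected to planar east/north offsets in metres; the function name and the projection step are assumptions made for illustration.

    import math

    def world_to_vehicle_xy(points_en: list[tuple[float, float]],
                            vehicle_en: tuple[float, float],
                            heading_deg: float) -> list[tuple[float, float]]:
        """Convert map-lane points from a local east/north frame [m] into the vehicle
        XY frame (X: right of the traveling direction, Y: traveling direction).

        heading_deg: vehicle head orientation, 0-360 degrees clockwise from due north.
        """
        h = math.radians(heading_deg)
        ve, vn = vehicle_en
        out = []
        for e, n in points_en:
            de, dn = e - ve, n - vn                   # translation: the vehicle becomes the origin
            x = de * math.cos(h) - dn * math.sin(h)   # rotation: component to the right of travel
            y = de * math.sin(h) + dn * math.cos(h)   # rotation: component along the travel direction
            out.append((x, y))
        return out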
  • FIG. 4 shows the lane marking 51, which is the detected lane, and the lane marking 52, which is the map lane, after the above coordinate system conversion is performed.
  • FIG. 4 also shows a calculation range L, which is the range of the lane shapes over which the difference is to be calculated. If the calculation range L is too large, the detection accuracy of the shape of the detected lane tends to be low; therefore, the maximum distance of the calculation range L is set to, for example, about 100 [m].
  • The map accuracy generation unit 35a calculates the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52 within the calculation range L.
  • The map accuracy generation unit 35a may calculate these values for the lane marking 51 and the lane marking 52 on only one of the left and right sides within the calculation range L, or for the lane markings 51 and 52 on both the left and right sides within the calculation range L.
  • The map accuracy generation unit 35a then determines the accuracy of the map-related information based on the calculated average value Davg and maximum value Dmax (a sketch of the Davg/Dmax computation follows).
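  • A minimal sketch of computing the average value Davg and the maximum value Dmax within the calculation range L, assuming both lane markings are given as point lists in the vehicle XY frame and that the difference D at each detected point is taken as the distance to the nearest map point; that nearest-point definition is an assumption, since the exact point-wise definition of D is not fixed in this text.

    import math

    def lane_difference_stats(detected: list[tuple[float, float]],
                              map_lane: list[tuple[float, float]],
                              calc_range: float = 100.0) -> tuple[float, float]:
        """Return (Davg, Dmax) over the detected-lane points whose longitudinal
        coordinate Y lies within [0, calc_range]."""
        diffs = []
        for (xd, yd) in detected:
            if not (0.0 <= yd <= calc_range):
                continue
            d = min(math.hypot(xd - xm, yd - ym) for (xm, ym) in map_lane)
            diffs.append(d)
        if not diffs:
            raise ValueError("no detected points inside the calculation range")
        return sum(diffs) / len(diffs), max(diffs)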
  • FIG. 5 is a diagram for explaining the determination made by the map accuracy generation unit 35a.
  • In FIG. 5, determination conditions that classify the average value Davg and the maximum value Dmax are associated with accuracy levels that classify the accuracy of the map-related information.
  • The map accuracy generation unit 35a searches the information of FIG. 5 for the determination condition to which the calculated average value Davg and maximum value Dmax correspond, and determines the accuracy level corresponding to that determination condition as the accuracy of the map-related information.
  • The accuracy level "high" means that the accuracy of the map-related information is high, and the accuracy level "low" means that the accuracy of the map-related information is low.
  • The accuracy level "undetermined" means that the information on the lane marking 51 and the lane marking 52 was not obtained normally; it applies, for example, when, for at least one of the lane marking 51 and the lane marking 52, the information output device 30 obtains only one of the left and right lane markings, or obtains neither of them.
  • In FIG. 5, the accuracy level is classified into four levels (high, medium, low, undetermined), but the number of levels is not limited to four.
  • The map accuracy generation unit 35a generates information indicating the accuracy determined as described above as the map accuracy information indicating the accuracy of the map-related information (a sketch of such a table-driven determination follows).
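  • A sketch of such a table-driven determination; the concrete thresholds (0.3 m and 1.0 m) are assumptions chosen only to make the example runnable, since FIG. 5 itself is not reproduced here, while the four levels follow the text above.

    from typing import Optional

    def accuracy_level(davg: Optional[float], dmax: Optional[float],
                       avg_high: float = 0.3, max_high: float = 1.0) -> str:
        """Classify map accuracy into "high" / "medium" / "low" / "undetermined".
        davg and dmax are None when the lane markings could not be obtained normally."""
        if davg is None or dmax is None:
            return "undetermined"
        if davg <= avg_high and dmax <= max_high:
            return "high"
        if davg <= 2 * avg_high and dmax <= 2 * max_high:
            return "medium"
        return "low"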
  • The travel control device 40 receives the map-related information and the map accuracy information output from the information output device 30 in step S6 of FIG. 3.
  • The travel control device 40 determines, based on the accuracy indicated by the received map accuracy information, whether or not to execute automated driving using the received map-related information, and performs travel control related to automated driving according to the determination result. For example, when the map accuracy information indicates "high", the travel control device 40 performs travel control related to automated driving using the map-related information, and when the map accuracy information indicates "low", it stops travel control related to automated driving that uses the map-related information.
  • <Modification 1> For the calculation range L in FIG. 4, it is desirable to use a range in which the accuracy of the lane shape measuring device 20 is best. For example, when the lane shape measuring device 20 generates the detected lane shape information based on camera images, the detection accuracy of the detected lane shape at a place where the road has curvature, such as a curve, tends to be lower than the detection accuracy of the detected lane shape on a straight road.
  • Therefore, the map accuracy generation unit 35a may change the calculation range L based on the curvature of the shape of the lane marking 51, which is the detected lane.
  • The curvature here includes not only the curvature itself but also the radius of curvature, which is the reciprocal of the curvature.
  • In the following, the case where the map accuracy generation unit 35a changes the calculation range L based on the radius of curvature of the shape of the lane marking 51 is described as an example.
  • FIG. 6 is a diagram for explaining the change of the calculation range L by the map accuracy generation unit 35a.
  • In FIG. 6, the radius of curvature R [m] of the shape of the lane marking 51 is associated with the calculation range L, shown as a range in the Y-axis direction.
  • The map accuracy generation unit 35a calculates the radius of curvature of the shape of the lane marking 51, looks up the calculation range L corresponding to the calculated radius of curvature in the information of FIG. 6, and uses it to calculate the difference between the shape of the lane marking 51 and the shape of the lane marking 52.
  • With the above configuration, it is possible to suppress the influence that the decrease in detection accuracy due to a curve or the like has on the accuracy indicated by the map accuracy information.
  • For example, when the radius of curvature R of the lane marking 51 is 1000 m or more, the road up to 50 m ahead of the own vehicle is regarded as a straight road, and the accuracy evaluation of the lane marking 51 is performed from the position of the own vehicle to 50 m ahead.
  • It is preferable that the information output device 30 notify the travel control device 40, which performs automated driving of the own vehicle, that the accuracy of the map-related information is low before the own vehicle enters a section where the accuracy of the map-related information is low. For this purpose, the near end of the calculation range L on the own-vehicle side may be set apart from the own vehicle. With such a configuration, the travel control device 40 can take appropriate measures, such as shifting to an automated driving mode that does not use the map-related information, before the own vehicle enters the section.
  • In the above description, the map accuracy generation unit 35a changes the calculation range L based on the curvature (radius of curvature) of the shape of the lane marking 51, which is the detected lane, but the calculation range L may instead be changed based on the curvature of the shape of the lane marking 52, which is the map lane. A sketch of such a curvature-dependent range selection follows.
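  • A sketch of a curvature-dependent choice of the calculation range; the table entries other than the 1000 m / 50 m pair mentioned above are illustrative placeholders standing in for FIG. 6, and the three-point circumradius estimate is one common way to approximate the radius of curvature of a polyline.

    import math

    # (minimum radius of curvature R [m], calculation range L [m]); only the first row
    # reflects the example in the text, the remaining rows are assumed placeholders
    RANGE_TABLE = [(1000.0, 50.0), (500.0, 35.0), (250.0, 20.0), (0.0, 10.0)]

    def circumradius(p1, p2, p3) -> float:
        """Radius of the circle through three points; large values indicate a near-straight road."""
        a = math.dist(p2, p3)
        b = math.dist(p1, p3)
        c = math.dist(p1, p2)
        cross2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1]) - (p3[0] - p1[0]) * (p2[1] - p1[1]))
        return float("inf") if cross2 == 0 else (a * b * c) / (2.0 * cross2)

    def calc_range_for(detected: list[tuple[float, float]]) -> float:
        """Pick the calculation range L from the approximate radius of curvature of the detected marking."""
        r = circumradius(detected[0], detected[len(detected) // 2], detected[-1])
        for r_min, calc_range in RANGE_TABLE:
            if r >= r_min:
                return calc_range
        return RANGE_TABLE[-1][1]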
  • The representation format in which the detected lane shape information indicates the shape of the detected lane and the representation format in which the map lane shape information indicates the shape of the map lane may differ.
  • For the position of the detected lane indicated by the detected lane shape information output from the lane shape measuring device 20, various positions may be used, such as the position of the edge of the detected lane on the own-vehicle side, the center position of the detected lane in the width direction, or the position of the edge of the detected lane opposite to the own vehicle.
  • In such a case, the map accuracy generation unit 35a may obtain the above difference using a predetermined lane width W.
  • Suppose, for example, that of the position of the lane marking 51 (the detected lane) in the detected lane shape information and the position of the lane marking 52 (the map lane) in the map lane shape information, one is an edge position and the other is the center position of the lane marking in the width direction.
  • In this case, the map accuracy generation unit 35a may use the average value and the maximum value of the values obtained by subtracting W/2 from the difference D between the shape of the lane marking 51 and the shape of the lane marking 52 as the average value Davg and the maximum value Dmax described in Embodiment 2, respectively.
  • Since the lane width W differs depending on the region or country, the lane width corresponding to the position of the own vehicle, among the lane widths indicated in the map data (map lane shape information), may be used in the above calculation. On an ordinary road in Japan, for example, the lane width W is about 15 cm. A sketch of this adjustment follows.
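  • A minimal sketch of this adjustment, assuming the raw per-point differences D (as computed in the earlier sketch) are available as a list; subtracting W/2 per point and clamping negative values to zero is an assumption about details the text leaves open.

    def adjusted_difference_stats(diffs: list[float], lane_width: float = 0.15) -> tuple[float, float]:
        """Subtract half the lane width W from each raw difference D to cancel the
        offset caused by differing representation formats, then return (Davg, Dmax)."""
        adjusted = [max(0.0, d - lane_width / 2.0) for d in diffs]
        return sum(adjusted) / len(adjusted), max(adjusted)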
  • According to such a configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, and therefore the accuracy indicated by the map accuracy information can be improved.
  • In addition, a portion of the image data that is not a lane marking may be erroneously detected as a lane marking because of a channelizing strip (zebra zone), a curb, or the like.
  • The above configuration can also be expected to reduce the influence of such false detections.
  • In the above description, the map accuracy generation unit 35a uses the lane width W to reduce the influence that the difference between the measurement specifications (representation format) of the lane shape measuring device 20 and the specifications of the map data has on the map accuracy information, but the method is not limited to this.
  • For example, the map accuracy generation unit 35a may obtain, as the above difference, the difference between the shape of the center line of the two lane markings 51 and the shape of the center line of the two lane markings 52.
  • That is, the map accuracy generation unit 35a treats the difference between a first lane center line calculated from the two lane markings 51 and a second lane center line calculated from the two lane markings 52 in the same way as the difference D between the lane marking 51 and the lane marking 52. In this case, it is preferable that the map accuracy generation unit 35a calculate the first lane center line from the midpoints of pairs of points at the shortest distance between the two lane markings 51, and the second lane center line from the midpoints of pairs of points at the shortest distance between the two lane markings 52.
  • Since the first lane center line and the second lane center line both lie midway between the left and right lane markings, their representation formats can be made the same even if the representation formats of the lane marking 51 and the lane marking 52 differ slightly. This reduces the influence that the difference between the measurement specifications of the lane shape measuring device 20 and the specifications of the map data has on the map accuracy information. Therefore, according to the above configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, and thus the accuracy indicated by the map accuracy information can be improved. A sketch of the center-line construction follows.
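  • A sketch of constructing a lane center line from two marking polylines by pairing each point of one marking with its nearest point on the other marking and taking the midpoint; the nearest-point pairing is an assumption consistent with the shortest-distance description above.

    import math

    def lane_center_line(left: list[tuple[float, float]],
                         right: list[tuple[float, float]]) -> list[tuple[float, float]]:
        """Midpoints of (left point, nearest right point) pairs, approximating the lane center line."""
        center = []
        for lx, ly in left:
            rx, ry = min(right, key=lambda p: math.hypot(p[0] - lx, p[1] - ly))
            center.append(((lx + rx) / 2.0, (ly + ry) / 2.0))
        return center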
  • The measurement time required for the lane shape measuring device 20 to measure the shape of the detected lane may be long.
  • In that case, the detected lane shape information used to calculate the difference may be delayed relative to the map lane shape information used to calculate the difference.
  • As a result, the lane marking 51, which is the detected lane indicated by the detected lane shape information, may be shifted in the traveling direction of the own vehicle (the plus direction of the Y axis in FIG. 4) with respect to the lane marking 52, which is the map lane indicated by the map lane shape information.
  • Therefore, the map accuracy generation unit 35a may obtain the above difference after shifting the shape of the lane marking 51 with respect to the shape of the lane marking 52 based on the generation time T of the detected lane shape information and the speed of the own vehicle. For example, the generation time T required for the lane shape measuring device 20 to generate the detected lane shape information is obtained in advance by experiment or the like and stored as a parameter in the map accuracy generation unit 35a. Then, in step S5 of FIG. 3, the map accuracy generation unit 35a multiplies the generation time T by the vehicle speed V [m/s] of the own vehicle at the current time to obtain the distance Ldelay [m].
  • After converting the lane marking 52 indicated by the map lane shape information into the same XY coordinate system as the lane marking 51 indicated by the detected lane shape information, the map accuracy generation unit 35a translates the lane marking 51 by the distance Ldelay in the minus direction of the Y axis.
  • The generation time T may also include the communication delay time from when the detected lane shape information is output from the lane shape measuring device 20 until it is received by the information output device 30.
  • With this configuration, the influence of the generation time T of the lane shape measuring device 20 on the map accuracy information can be reduced, so the accuracy indicated by the map accuracy information can be improved. A sketch of this delay compensation follows.
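  • A minimal sketch of this delay compensation, assuming the detected lane marking is already expressed in the vehicle XY frame; the function name is illustrative.

    def compensate_delay(detected: list[tuple[float, float]],
                         generation_time: float, speed: float) -> list[tuple[float, float]]:
        """Translate the detected lane marking by Ldelay = T * V in the minus Y
        direction to account for the measurement and communication delay."""
        l_delay = generation_time * speed      # [s] * [m/s] = [m]
        return [(x, y - l_delay) for (x, y) in detected]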
  • In the above description, the map accuracy generation unit 35a generates, from the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52, the map accuracy information indicating the accuracy of the map-related information by using the information shown in FIG. 5.
  • However, the map accuracy information is not limited to this.
  • For example, the map accuracy generation unit 35a may output the average value Davg and the maximum value Dmax themselves as the map accuracy information.
  • The first acquisition unit 31, the second acquisition unit 32, the first generation unit 34, the second generation unit 35, and the output unit 36 of FIG. 1 described above are hereinafter referred to as the "first acquisition unit 31 and the like".
  • The first acquisition unit 31 and the like are realized by a processing circuit 81 shown in the block diagram of the hardware configuration according to this modification. That is, the processing circuit 81 includes: the first acquisition unit 31 that acquires the first lane shape information, which is generated based on a result of detecting the surroundings of the vehicle and indicates the shape of the first lane around the vehicle; the second acquisition unit 32 that acquires the position of the vehicle; the first generation unit 34 that generates, based on the position of the vehicle acquired by the second acquisition unit 32 and the map data, the map vehicle position information indicating the position of the vehicle on the map data and the second lane shape information indicating the shape of the second lane around the vehicle among the lanes on the map data; the second generation unit 35 that generates the map accuracy information indicating the accuracy of the map-related information, which includes at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit 31 and the shape of the second lane indicated by the second lane shape information generated by the first generation unit 34; and the output unit 36 that outputs the map-related information and the map accuracy information.
  • Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied. Examples of the processor include a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
  • The processing circuit 81 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these.
  • The functions of the respective units such as the first acquisition unit 31 may each be realized by separate processing circuits, or the functions of the units may be collectively realized by a single processing circuit.
  • When the processing circuit 81 is a processor, the functions of the first acquisition unit 31 and the like are realized in combination with software and the like.
  • The software and the like correspond to, for example, software, firmware, or both software and firmware.
  • The software and the like are written as programs and stored in a memory.
  • The processor 82 applied to the processing circuit 81 realizes the functions of the respective units by reading and executing the programs stored in the memory 83. That is, the information output device 30 includes the memory 83 for storing programs that, when executed by the processing circuit 81, result in the execution of: a step of acquiring the first lane shape information, which is generated based on a result of detecting the surroundings of the vehicle and indicates the shape of the first lane around the vehicle; a step of acquiring the position of the vehicle; a step of generating, based on the acquired position of the vehicle and the map data, the map vehicle position information indicating the position of the vehicle on the map data and the second lane shape information indicating the shape of the second lane around the vehicle among the lanes on the map data; a step of generating the map accuracy information indicating the accuracy of the map-related information, which includes at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane and the shape of the second lane; and a step of outputting the map-related information and the map accuracy information.
  • The memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or a drive device therefor; or any storage medium to be used in the future.
  • The configuration in which each function of the first acquisition unit 31 and the like is realized by either hardware or software has been described above.
  • However, the present invention is not limited to this, and a configuration may be adopted in which a part of the first acquisition unit 31 and the like is realized by dedicated hardware and another part is realized by software or the like.
  • For example, the functions of the first acquisition unit 31 and the second acquisition unit 32 can be realized by the processing circuit 81 as dedicated hardware such as an interface or a receiver, and the other functions can be realized by the processing circuit 81 as the processor 82 reading and executing the programs stored in the memory 83.
  • As described above, the processing circuit 81 can realize each of the above-mentioned functions by hardware, software, or the like, or a combination thereof.
  • The information output device 30 described above can also be applied to an information output system constructed by appropriately combining a vehicle device such as a PND (Portable Navigation Device), a navigation device, or a DMS (Driver Monitoring System) device, a communication terminal including a mobile terminal such as a mobile phone, a smartphone, or a tablet, the functions of applications installed in at least one of the vehicle device and the communication terminal, and a server. In this case, each function or each component of the information output device 30 described above may be distributed among the devices constructing the system, or may be concentrated in any one of the devices.
  • FIG. 9 is a block diagram showing the configuration of the server 91 according to this modification.
  • The server 91 of FIG. 9 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with the vehicle device 93 of a vehicle 92.
  • The communication unit 91a, which serves as the first acquisition unit and the second acquisition unit, receives the first lane shape information acquired by the vehicle device 93 and the position of the vehicle 92 by performing wireless communication with the vehicle device 93.
  • The control unit 91b has functions equivalent to those of the first generation unit 34 and the second generation unit 35 of FIG. 1, realized by a processor (not shown) of the server 91 executing a program stored in a memory (not shown) of the server 91. That is, the control unit 91b generates the map vehicle position information and the second lane shape information based on the vehicle position received by the communication unit 91a and the map data. The control unit 91b then generates the map accuracy information indicating the accuracy of the map-related information based on the difference between the shape of the first lane indicated by the first lane shape information received by the communication unit 91a and the shape of the second lane indicated by the second lane shape information generated by the control unit 91b.
  • The communication unit 91a, which serves as the output unit, transmits the map-related information and the map accuracy information to the vehicle device 93. According to the server 91 configured in this way, the same effect as that of the information output device 30 described in Embodiment 1 can be obtained.
  • FIG. 10 is a block diagram showing the configuration of the communication terminal 96 according to this modification.
  • The communication terminal 96 of FIG. 10 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with the vehicle device 98 of a vehicle 97.
  • For the communication terminal 96, for example, a mobile terminal such as a mobile phone, a smartphone, or a tablet carried by the driver of the vehicle 97 is used.
  • According to the communication terminal 96 configured in this way, the same effect as that of the information output device 30 described in Embodiment 1 can be obtained.
  • The embodiments and the modifications can be freely combined, and each embodiment and each modification can be modified or omitted as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The objective of the present invention is to provide a technology that makes it possible to use map-related information appropriately. The information output device comprises: a second generation unit that generates map accuracy information indicating the accuracy of map-related information containing map vehicle position information and/or second lane shape information, based on the difference between the shape of a first lane indicated by first lane shape information acquired by a first acquisition unit and the shape of a second lane indicated by the second lane shape information generated by a first generation unit; and an output unit that outputs the map-related information and the map accuracy information.
PCT/JP2019/032816 2019-08-22 2019-08-22 Dispositif de sortie d'informations, dispositif de conduite automatisée et procédé d'émission en sortie d'informations Ceased WO2021033312A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/032816 WO2021033312A1 (fr) 2019-08-22 2019-08-22 Dispositif de sortie d'informations, dispositif de conduite automatisée et procédé d'émission en sortie d'informations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/032816 WO2021033312A1 (fr) 2019-08-22 2019-08-22 Dispositif de sortie d'informations, dispositif de conduite automatisée et procédé d'émission en sortie d'informations

Publications (1)

Publication Number Publication Date
WO2021033312A1 true WO2021033312A1 (fr) 2021-02-25

Family

ID=74660455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/032816 Ceased WO2021033312A1 (fr) 2019-08-22 2019-08-22 Dispositif de sortie d'informations, dispositif de conduite automatisée et procédé d'émission en sortie d'informations

Country Status (1)

Country Link
WO (1) WO2021033312A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023139737A (ja) * 2022-03-22 2023-10-04 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
JP2023143322A (ja) * 2022-03-25 2023-10-06 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11211492A (ja) * 1998-01-29 1999-08-06 Fuji Heavy Ind Ltd 道路情報認識装置
JP2001291197A (ja) * 2000-04-07 2001-10-19 Honda Motor Co Ltd 車両制御装置
JP2006290072A (ja) * 2005-04-07 2006-10-26 Toyota Motor Corp 車両制御装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11211492A (ja) * 1998-01-29 1999-08-06 Fuji Heavy Ind Ltd 道路情報認識装置
JP2001291197A (ja) * 2000-04-07 2001-10-19 Honda Motor Co Ltd 車両制御装置
JP2006290072A (ja) * 2005-04-07 2006-10-26 Toyota Motor Corp 車両制御装置

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023139737A (ja) * 2022-03-22 2023-10-04 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
JP7376634B2 (ja) 2022-03-22 2023-11-08 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
US12269492B2 (en) 2022-03-22 2025-04-08 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium
JP2023143322A (ja) * 2022-03-25 2023-10-06 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
JP7449971B2 (ja) 2022-03-25 2024-03-14 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
US12296862B2 (en) 2022-03-25 2025-05-13 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium

Similar Documents

Publication Publication Date Title
JP6260114B2 (ja) 走行路情報生成装置
JP4366664B2 (ja) 自車位置認識装置及び自車位置認識プログラム
US7463974B2 (en) Systems, methods, and programs for determining whether a vehicle is on-road or off-road
CN110249207B (zh) 用于更新数字地图的方法和设备
US8731825B2 (en) Trajectory display device
US10928819B2 (en) Method and apparatus for comparing relevant information between sensor measurements
WO2015122121A1 (fr) Dispositif de spécification de position de déplacement de véhicule hôte et produit de programme de spécification de position de déplacement de véhicule hôte
US11668583B2 (en) Method, apparatus, and computer program product for establishing reliability of crowd sourced data
WO2015131464A1 (fr) Procédé et dispositif de correction d'erreur de positionnement d'un véhicule
CN110375786B (zh) 一种传感器外参的标定方法、车载设备及存储介质
JP2010276545A (ja) 車両位置測定装置および車両位置測定方法
CN103843035A (zh) 用于几何校准借助车辆的传感器系统形成的传感器数据的设备和方法
JPWO2014132432A1 (ja) 車両位置表示制御装置および車両位置特定プログラム
CN110869864B (zh) 用于定位较高程度自动化的车辆的方法以及相应的驾驶员辅助系统和计算机程序
JP4936070B2 (ja) ナビゲーション装置及びナビゲーションプログラム
JP4953015B2 (ja) 自車位置認識装置と自車位置認識プログラム、及びこれを用いたナビゲーション装置
JP4596566B2 (ja) 自車情報認識装置及び自車情報認識方法
KR20200119092A (ko) 차량 및 차량의 위치 검출 방법
WO2021033312A1 (fr) Dispositif de sortie d'informations, dispositif de conduite automatisée et procédé d'émission en sortie d'informations
JP2017187989A (ja) 位置捕捉システム
US12416510B2 (en) Method and apparatus for providing an updated map model
JP2018091714A (ja) 車両位置測定システム
JP2018159752A (ja) 地図情報学習方法及び地図情報学習装置
US11680808B2 (en) Map selection device, storage medium storing computer program for map selection and map selection method
US11080311B2 (en) Method and apparatus for identifying critical parameters of a localization framework based on an input data source

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19942093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19942093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP