
WO2020021596A1 - Vehicle position estimation device and vehicle position estimation method - Google Patents

Vehicle position estimation device and vehicle position estimation method

Info

Publication number
WO2020021596A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
road feature
vehicle position
estimated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/027495
Other languages
French (fr)
Japanese (ja)
Inventor
雄治 五十嵐
優子 大田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2018/027495 priority Critical patent/WO2020021596A1/en
Publication of WO2020021596A1 publication Critical patent/WO2020021596A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G01C 21/28 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • The present invention relates to a vehicle position estimating device for estimating the current position of a vehicle.
  • For example, Patent Document 1 discloses a technique in which a road sign is detected from image data obtained by photographing the periphery of a vehicle, and the position of the vehicle on the road is estimated based on the relative position of the vehicle with respect to the detected road sign and the position of that road sign included in map information.
  • An imaging sensor such as a monocular camera has the advantages that it can recognize the type of an imaging target such as a road sign and that it is relatively inexpensive.
  • However, position detection using an imaging sensor has a detection error of several meters in the traveling direction of the vehicle, making high-accuracy position detection difficult. Position detection by an imaging sensor is therefore not suited to applications that must detect the position and azimuth of the vehicle with high accuracy, such as automatic driving and preventive safety technology.
  • Optical ranging sensors such as LiDAR (Light Detection and Ranging) and stereo cameras, on the other hand, have high position detection accuracy.
  • However, to detect the position of a road sign or the like, they must process a large amount of point cloud information, including the distance to each detection target and reflection intensity or luminance/color information, which requires a computing device with a high-spec CPU (Central Processing Unit).
  • In addition, the processing time required for position detection is on the order of several seconds, so such sensors are not suited to applications requiring immediate responsiveness, such as automatic driving and preventive safety technology.
  • The present invention has been made to solve the above problems, and an object of the present invention is to provide a vehicle position estimating device having both high-accuracy position detection and immediate responsiveness.
  • The vehicle position estimating device according to the present invention includes: a vehicle position estimating unit that calculates first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map, based on absolute positioning information of the own vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the own vehicle; and a road feature position/azimuth detecting unit that selects a road feature to be detected based on the first estimated own-vehicle position information and one or both of road feature information included in the map data and imaging sensor information obtained from an imaging sensor of the own vehicle, detects the relative position and relative azimuth of the selected road feature from the own vehicle from optical ranging information obtained from an optical ranging sensor of the own vehicle, and calculates second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map based on the detected relative position and relative azimuth of the road feature.
  • According to the present invention, when the second estimated own-vehicle position information is obtained, the range over which detection processing is performed on the optical ranging information is limited to a range that includes the position of the road feature estimated from the first estimated own-vehicle position information, so the time required for the detection processing can be reduced. As a result, a vehicle position estimating device having both high-accuracy position detection and immediate responsiveness can be realized.
  • FIG. 1 is a block diagram showing the configuration of a vehicle position estimation system according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the detectable range of the optical ranging sensor device. FIG. 3 is a diagram showing an example of deriving the detectable area of a road feature. FIG. 4 is a diagram showing an example of the target range of road feature detection processing. FIG. 5 is a flowchart showing the operation of the vehicle position estimation device according to the embodiment. FIGS. 6 and 7 are diagrams each showing an example of the hardware configuration of the vehicle position estimation device.
  • FIG. 1 is a block diagram showing a configuration of a vehicle position estimation system according to an embodiment of the present invention.
  • As shown in FIG. 1, the vehicle position estimation system includes a satellite positioning device 1, a vehicle sensor information output device 2, an imaging sensor device 3, an optical ranging sensor device 4, a high-accuracy map database 5, and a vehicle position estimation device 6.
  • Hereinafter, a vehicle equipped with the vehicle position estimation system is referred to as the “own vehicle”.
  • The satellite positioning device 1 calculates the absolute position (latitude, longitude, altitude) and absolute azimuth of the own vehicle by satellite positioning, based on positioning signals transmitted by GNSS (Global Navigation Satellite System) satellites such as GPS (Global Positioning System) satellites, and outputs the calculated absolute position and absolute azimuth of the own vehicle as “absolute positioning information”.
  • The vehicle sensor information output device 2 outputs “vehicle sensor information” obtained from vehicle sensors mounted on the own vehicle, such as a vehicle speed sensor, a gyro, a steering angle sensor, and an air pressure sensor. The vehicle sensor information includes at least one of travel speed, rotation angle, steering angle, and air pressure.
  • The imaging sensor device 3 includes an imaging device (camera sensor) such as a monocular camera, and detects the position, shape, type, and the like of features around the road where the own vehicle is located (hereinafter, “road features”) from images of the vicinity of the own vehicle. Road features include not only three-dimensional features such as traffic signs and traffic lights, but also planar features such as lane markings and stop lines drawn on the road surface. The position of a road feature detected by the imaging sensor device 3 is a relative position from the own vehicle. The imaging sensor device 3 outputs information on the detected position, shape, type, and the like of each road feature, together with information on its detection accuracy, as “imaging sensor information”.
  • The optical ranging sensor device 4 includes an optical ranging sensor such as a LiDAR or a stereo camera, and outputs “optical ranging information” that includes the distance to each obstacle existing around the own vehicle and reflection intensity or luminance/color information.
  • With LiDAR, for example, any object that reflects light is detected as an obstacle.
  • With a stereo camera, any object whose distance can be determined from parallax images (images captured from a plurality of different viewpoints) is detected as an obstacle.
  • The detectable range of the optical ranging sensor device 4 is preset in the vehicle position estimating device 6 as a parameter. Since this range varies with the performance of the optical ranging sensor device 4, it is preferably determined based on experiments or the like.
  • FIG. 2 shows an example of the detectable range of the optical ranging sensor device 4.
  • In FIG. 2, with the own vehicle position as the origin, the X axis is set in the traveling direction (heading direction) of the own vehicle, the Y axis in its lateral direction, and the Z axis in the height direction. The XY plane at Z = 0 corresponds to the road surface.
  • The own vehicle position taken as the origin corresponds to the position estimated by the vehicle position estimating device 6, and is, for example, the center-of-gravity position, center position, or head position of the own vehicle.
  • As shown in FIG. 2, the detectable range of the optical ranging sensor device 4 can be defined as the space bounded by a detectable distance L1, a detectable angle Θ1 on the XY plane, and a detectable angle Θ2 on the ZY plane.
  • For a typical optical ranging sensor the detectable distance can be considered the same on the XY plane and the ZX plane, but different detectable distances may be set for the two planes.
  • The high-precision map database 5 is a database in which high-precision map data is stored.
  • The high-precision map contains detailed information on road features, such as information indicating the lane shape of the road (the position, shape, and type of lane markings), the position of stop lines, and the position, shape, type, and orientation of traffic signs and traffic signals installed around the road.
  • The high-accuracy map database 5 need not be mounted on the own vehicle; it may be, for example, a server that distributes high-accuracy map data to the vehicle position estimating device 6 by communication.
  • Although the high-accuracy map database 5 is depicted in FIG. 1 as a block separate from the vehicle position estimating device 6, it may instead be provided inside the vehicle position estimating device 6. The high-accuracy map database 5 and the vehicle position estimating device 6 may also be provided in a navigation system of the own vehicle.
  • As shown in FIG. 1, the vehicle position estimating device 6 includes a vehicle position estimating unit 61 and a road feature position/azimuth detecting unit 62.
  • The vehicle position estimating unit 61 estimates the position and azimuth of the own vehicle on the high-accuracy map based on the absolute positioning information (the absolute position and absolute azimuth of the own vehicle) output by the satellite positioning device 1, the vehicle sensor information (one or more of the traveling speed, rotation angle, steering angle, and air pressure of the own vehicle) output by the vehicle sensor information output device 2, and the high-accuracy map data stored in the high-accuracy map database 5, and outputs the estimation result as “first estimated own-vehicle position information”. The absolute positioning information and the vehicle sensor information are input to the vehicle position estimating unit 61 at regular intervals, and the vehicle position estimating unit 61 outputs the first estimated own-vehicle position information at regular intervals.
  • The road feature position/azimuth detecting unit 62 selects a road feature to be detected from the optical ranging information output by the optical ranging sensor device 4, based on the first estimated own-vehicle position information output by the vehicle position estimating unit 61 (the position and azimuth of the own vehicle on the high-accuracy map estimated by the vehicle position estimating unit 61) and at least one of the high-accuracy map stored in the high-accuracy map database 5 and the imaging sensor information output by the imaging sensor device 3 (the position, shape, type, and the like of road features detected by the imaging sensor device 3).
  • The road feature position/azimuth detecting unit 62 then detects the relative position and relative azimuth of the selected road feature from the own vehicle, using the optical ranging information output by the optical ranging sensor device 4. Further, the road feature position/azimuth detecting unit 62 estimates the position and azimuth of the own vehicle on the high-precision map based on the detected relative position and relative azimuth of the road feature and the high-precision map data. The estimated position and azimuth are output from the road feature position/azimuth detecting unit 62 as “second estimated own-vehicle position information”.
  • As the road feature to be detected, the road feature position/azimuth detecting unit 62 selects one whose relative position and relative azimuth from the own vehicle can be detected at higher speed and with higher accuracy, that is, one for which the optical ranging information is denser and more accurate. Specifically, a road feature that is close to the own vehicle (short detection distance) and large as seen from the own vehicle (large detectable area) is selected as the detection target. For example, if the road feature detection processing based on the first estimated own-vehicle position information and the high-precision map or the imaging sensor information finds a cylindrical pole with a radius of 10 cm and a disk-shaped road sign with a radius of 30 cm at the same distance from the own vehicle, the road feature position/azimuth detecting unit 62 selects the road sign of the two as the detection target.
  • Since the size of a road feature as seen from the own vehicle (its detectable area) changes with the relationship between the direction of the own vehicle (the detection direction of the optical ranging sensor device 4) and the orientation of the road feature's surface, the road feature position/azimuth detecting unit 62 may calculate the size of the road feature based on the positional relationship between the own vehicle and the road feature. For example, as shown in FIG. 3, when the angle (detection angle) between the surface of a rectangular road feature having area S and the detection direction of the optical ranging sensor device 4 is Θa, the detectable area of the road feature is S·sinΘa.
  • When detecting the relative position and relative azimuth of the road feature selected as the detection target, the road feature position/azimuth detecting unit 62 performs the detection processing only on the portion of the optical ranging information (point cloud information) output by the optical ranging sensor device 4 that lies within the range where the road feature is estimated to be located. The position of the road feature is already known at the stage of selecting the detection target. Limiting the range of optical ranging information subjected to the road feature detection processing in this way reduces the computational load of the detection processing and allows the position and azimuth of the road feature to be detected at high speed.
  • The range of optical ranging information subjected to the road feature detection processing is the range corresponding to the estimated position and shape of the road feature, expanded at its outer edge by a width corresponding to a predetermined margin.
  • For example, when the road feature selected as the detection target is a rectangular road sign as in FIG. 4, the range of optical ranging information subjected to the detection processing is the circle of radius R enclosing the shape of the road sign, expanded at its outer edge by a region of width m, that is, the circle of radius R + m shown in FIG. 4.
  • The margin m is preferably a value equal to or larger than the estimation error e of the first estimated own-vehicle position information.
  • The reason is that the range where the road feature is estimated to be located is obtained as the position (relative position) of the road feature referenced to the own-vehicle position on the high-accuracy map indicated by the first estimated own-vehicle position information; any error in the first estimated own-vehicle position information therefore appears equally in the estimated position of the road feature.
  • The information on the estimation error e may be included in the first estimated own-vehicle position information, or may be a value obtained in advance from experiments or the like.
  • For the road feature detection processing using the optical ranging information, an optimal detection model is applied according to the shape information (circle, rectangle, cylinder, etc.) included in the road feature information of the high-precision map.
  • For example, when the shape information of a road feature indicates a cylinder, the cylinder may be detected using the Cylinder model of PCL (Point Cloud Library). The detection processing can then be specialized for extracting a specific shape, which reduces the processing load and allows the position and azimuth of the road feature to be detected at high speed.
  • The position of the own vehicle indicated by the second estimated own-vehicle position information is obtained from the position of a road feature detected from the optical ranging information, so its accuracy is high, and because the range of optical ranging information subjected to the detection processing is limited, the second estimated own-vehicle position information is obtained in a short time. The vehicle position estimating device 6 of the present embodiment therefore has both high-accuracy position detection and immediate responsiveness. Limiting the range of the optical ranging information subjected to the road feature detection processing also reduces the computational load of the detection processing, so the own-vehicle position can be estimated with high accuracy even with a computing device that has an inexpensive, low-spec CPU.
  • However, even if the range of optical ranging information subjected to the road feature detection processing is limited, the detection processing may still take a certain amount of time (200 to 300 milliseconds in some cases). Therefore, particularly while the own vehicle is traveling, a difference may arise between the position and azimuth of the own vehicle indicated by the second estimated own-vehicle position information and the current position and azimuth of the own vehicle.
  • To address this, the vehicle position estimating unit 61 obtains the error of the first estimated own-vehicle position information using the second estimated own-vehicle position information, and corrects the current (latest) first estimated own-vehicle position information based on that error to obtain a more accurate position and azimuth of the own vehicle. Specifically, the vehicle position estimating unit 61 acquires from the road feature position/azimuth detecting unit 62 the second estimated own-vehicle position information together with the acquisition time of the optical ranging information used to calculate it, obtains the difference between the first estimated own-vehicle position information at that same time and the acquired second estimated own-vehicle position information, and regards the difference as the error of the first estimated own-vehicle position information. The vehicle position estimating unit 61 then adds this error to the latest first estimated own-vehicle position information to calculate a more accurate position and azimuth of the own vehicle.
  • For the vehicle position estimating unit 61 to perform this correction, it must retain the first estimated own-vehicle position information, together with its calculation time, going back a certain period (200 to 300 milliseconds or more) from the present.
  • FIG. 5 is a flowchart showing the operation of the vehicle position estimation device 6.
  • Hereinafter, the process by which the vehicle position estimating device 6 estimates the own-vehicle position (own-vehicle position estimation processing) will be described with reference to FIG. 5. The flow of FIG. 5 is executed repeatedly after the vehicle position estimating device 6 starts up.
  • First, the vehicle position estimating unit 61 calculates first estimated own-vehicle position information, including the position and azimuth of the own vehicle on the high-precision map, based on the absolute positioning information output by the satellite positioning device 1 and the vehicle sensor information output by the vehicle sensor information output device 2 (step S100).
  • Next, the road feature position/azimuth detecting unit 62 acquires, from the high-accuracy map database 5, information on the road features that exist around the own-vehicle position indicated by the first estimated own-vehicle position information and that are within the detectable range of the optical ranging sensor device 4 (step S101).
  • Then, the road feature position/azimuth detecting unit 62 selects a road feature to be detected, based on the first estimated own-vehicle position information calculated in step S100 and one or both of the road feature information acquired in step S101 and the imaging sensor information output by the imaging sensor device 3 (step S102).
  • Subsequently, the road feature position/azimuth detecting unit 62 detects the relative position and relative azimuth, from the own vehicle, of the road feature selected in step S102 by performing detection processing on the optical ranging information (point cloud information) output by the optical ranging sensor device 4 (step S103). This detection processing is performed only on the optical ranging information within the range where the road feature is estimated to be located.
  • Further, the road feature position/azimuth detecting unit 62 calculates second estimated own-vehicle position information, including the position and azimuth of the own vehicle on the high-precision map, based on the road feature information acquired in step S101 and the relative position and relative azimuth of the road feature from the own vehicle detected in step S103 (step S104).
  • Finally, the vehicle position estimating unit 61 obtains the difference between the second estimated own-vehicle position information calculated in step S104 and the first estimated own-vehicle position information corresponding to the acquisition time of the optical ranging information used for that calculation, treats the difference as the error of the first estimated own-vehicle position information, and corrects the latest first estimated own-vehicle position information by adding the error to it (step S105). The current position and azimuth of the own vehicle are obtained from the corrected first estimated own-vehicle position information.
  • FIGS. 6 and 7 are diagrams illustrating examples of the hardware configuration of the vehicle position estimating device 6, respectively.
  • Each function of the components of the vehicle position estimating device 6 shown in FIG. 1 is realized by, for example, the processing circuit 70 shown in FIG. 6. That is, the vehicle position estimating device 6 includes a processing circuit 70 for: calculating first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map, based on the absolute positioning information of the own vehicle obtained by satellite positioning and the vehicle sensor information obtained from a vehicle sensor of the own vehicle; selecting a road feature to be detected, based on the first estimated own-vehicle position information and one or both of the road feature information included in the map data and the imaging sensor information obtained from an imaging sensor of the own vehicle; detecting the relative position and relative azimuth of the selected road feature from the optical ranging information obtained from an optical ranging sensor of the own vehicle; and calculating second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map, based on the detected relative position and relative azimuth of the road feature.
  • The processing circuit 70 may be dedicated hardware, or may be configured using a processor (also called a CPU (Central Processing Unit), processing device, arithmetic device, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in a memory.
  • Where the processing circuit 70 is dedicated hardware, the processing circuit 70 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • Each function of the components of the vehicle position estimating device 6 may be realized by an individual processing circuit, or the functions may be realized by one processing circuit.
  • FIG. 7 shows an example of a hardware configuration of the vehicle position estimating device 6 when the processing circuit 70 is configured using the processor 71 that executes a program.
  • In this case, the functions of the components of the vehicle position estimating device 6 are realized by software or the like (software, firmware, or a combination of software and firmware). The software and the like are written as programs and stored in the memory 72.
  • The processor 71 realizes the function of each unit by reading and executing the programs stored in the memory 72. That is, the vehicle position estimating device 6 includes the memory 72 for storing programs that, when executed by the processor 71, result in the execution of: a process of calculating first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map, based on the absolute positioning information of the own vehicle obtained by satellite positioning and the vehicle sensor information obtained from a vehicle sensor of the own vehicle; a process of selecting a road feature to be detected, based on the first estimated own-vehicle position information and one or both of the road feature information included in the map data and the imaging sensor information obtained from an imaging sensor of the own vehicle; a process of detecting the relative position and relative azimuth of the selected road feature from the optical ranging information obtained from an optical ranging sensor of the own vehicle; and a process of calculating second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map, based on the detected relative position and relative azimuth of the road feature. In other words, these programs cause a computer to execute the procedures and methods of operation of the components of the vehicle position estimating device 6.
  • The memory 72 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive); a magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc) and its drive device; or any storage medium to be used in the future.
  • However, the configuration is not limited to this; some components of the vehicle position estimating device 6 may be realized by dedicated hardware, and other components may be realized by software or the like.
  • For example, the functions of some components can be realized by a processing circuit 70 as dedicated hardware, while the functions of other components can be realized by the processing circuit 70 as a processor 71 reading and executing programs stored in a memory 72.
  • As described above, the vehicle position estimating device 6 can realize the above-described functions by hardware, software, or the like, or a combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

In this vehicle position estimation device 6, a vehicle position estimation unit 61 calculates first estimated vehicle position information indicating the position and orientation of a vehicle on a map on the basis of absolute positioning information for the vehicle obtained through satellite positioning and vehicle sensor information obtained from a vehicle sensor of the vehicle. A road feature position/orientation detection unit 62 selects a road feature to be detected on the basis of the first estimated vehicle position information and at least one from among road feature information included in map data and imaging sensor information obtained from an imaging sensor of the vehicle. Further, the road feature position/orientation detection unit 62 detects the relative position and relative orientation of the selected road feature in relation to the vehicle from optical distance measurement information obtained from an optical distance measurement sensor of the vehicle and calculates second estimated vehicle position information indicating the position and orientation of the vehicle on the map on the basis of the detected relative position and relative orientation of the road feature.

Description

Vehicle position estimating apparatus and vehicle position estimating method

The present invention relates to a vehicle position estimating device for estimating the current position of a vehicle.

For example, Patent Document 1 below discloses a technique in which a road sign is detected from image data obtained by photographing the periphery of a vehicle, and the position of the vehicle on the road is estimated based on the relative position of the vehicle with respect to the detected road sign and the position of that road sign included in map information.

JP 2016-176769 A

An imaging sensor such as a monocular camera has the advantages that it can recognize the type of an imaging target such as a road sign and that it is relatively inexpensive. However, position detection using an imaging sensor has a detection error of several meters in the traveling direction of the vehicle, making high-accuracy position detection difficult. Position detection by an imaging sensor is therefore not suited to applications that must detect the position and azimuth of the vehicle with high accuracy, such as automatic driving and preventive safety technology.

On the other hand, optical ranging sensors such as LiDAR (Light Detection and Ranging) and stereo cameras have high position detection accuracy. However, to detect the position of a road sign or the like, they must process a large amount of point cloud information, including the distance to each detection target and reflection intensity or luminance/color information, so a computing device with a high-spec CPU (Central Processing Unit) is required. In addition, the processing time required for position detection is on the order of several seconds, so such sensors are not suited to applications requiring immediate responsiveness, such as automatic driving and preventive safety technology.

The present invention has been made to solve the above problems, and an object of the present invention is to provide a vehicle position estimating device having both high-accuracy position detection and immediate responsiveness.

The vehicle position estimating device according to the present invention includes: a vehicle position estimating unit that calculates first estimated own-vehicle position information indicating the position and azimuth of the own vehicle on a map, based on absolute positioning information of the own vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the own vehicle; and a road feature position/azimuth detecting unit that selects a road feature to be detected based on the first estimated own-vehicle position information and one or both of road feature information included in the map data and imaging sensor information obtained from an imaging sensor of the own vehicle, detects the relative position and relative azimuth of the selected road feature from the own vehicle from optical ranging information obtained from an optical ranging sensor of the own vehicle, and calculates second estimated own-vehicle position information indicating the position and azimuth of the own vehicle on the map based on the detected relative position and relative azimuth of the road feature.

According to the present invention, when the second estimated own-vehicle position information is obtained, the range over which detection processing is performed on the optical ranging information is limited to a range that includes the position of the road feature estimated from the first estimated own-vehicle position information, so the time required for the detection processing can be reduced. As a result, a vehicle position estimating device having both high-accuracy position detection and immediate responsiveness can be realized.

The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.

FIG. 1 is a block diagram showing the configuration of a vehicle position estimation system according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of the detectable range of the optical ranging sensor device. FIG. 3 is a diagram showing an example of deriving the detectable area of a road feature. FIG. 4 is a diagram showing an example of the target range of road feature detection processing. FIG. 5 is a flowchart showing the operation of the vehicle position estimation device according to the embodiment of the present invention. FIGS. 6 and 7 are diagrams each showing an example of the hardware configuration of the vehicle position estimation device.

FIG. 1 is a block diagram showing the configuration of a vehicle position estimation system according to an embodiment of the present invention. As shown in FIG. 1, the vehicle position estimation system includes a satellite positioning device 1, a vehicle sensor information output device 2, an imaging sensor device 3, an optical ranging sensor device 4, a high-accuracy map database 5, and a vehicle position estimation device 6. Hereinafter, a vehicle equipped with the vehicle position estimation system is referred to as the “own vehicle”.

The satellite positioning device 1 calculates the absolute position (latitude, longitude, altitude) and absolute azimuth of the own vehicle by satellite positioning, based on positioning signals transmitted by GNSS (Global Navigation Satellite System) satellites such as GPS (Global Positioning System) satellites, and outputs the calculated absolute position and absolute azimuth of the own vehicle as “absolute positioning information”.

The vehicle sensor information output device 2 outputs “vehicle sensor information” obtained from vehicle sensors mounted on the own vehicle, such as a vehicle speed sensor, a gyro, a steering angle sensor, and an air pressure sensor. The vehicle sensor information includes at least one of travel speed, rotation angle, steering angle, and air pressure.

The imaging sensor device 3 includes an imaging device (camera sensor) such as a monocular camera, and detects the position, shape, type, and the like of features around the road where the own vehicle is located (hereinafter, “road features”) from images of the vicinity of the own vehicle. Road features include not only three-dimensional features such as traffic signs and traffic lights, but also planar features such as lane markings and stop lines drawn on the road surface. The position of a road feature detected by the imaging sensor device 3 is a relative position from the own vehicle. The imaging sensor device 3 outputs information on the detected position, shape, type, and the like of each road feature, together with information on its detection accuracy, as “imaging sensor information”.

The optical ranging sensor device 4 includes an optical ranging sensor such as a LiDAR or a stereo camera, and outputs “optical ranging information” that includes the distance to each obstacle existing around the own vehicle and reflection intensity or luminance/color information. With LiDAR, for example, any object that reflects light is detected as an obstacle. With a stereo camera, any object whose distance can be determined from parallax images (images captured from a plurality of different viewpoints) is detected as an obstacle.

Here, the detectable range of the optical ranging sensor device 4 is assumed to be preset in the vehicle position estimating device 6 as a parameter. Since this range varies with the performance of the optical ranging sensor device 4, it is preferably determined based on experiments or the like.

FIG. 2 shows an example of the detectable range of the optical ranging sensor device 4. In FIG. 2, with the own vehicle position as the origin, the X axis is set in the traveling direction (heading direction) of the own vehicle, the Y axis in its lateral direction, and the Z axis in the height direction. The XY plane at Z = 0 corresponds to the road surface. The own vehicle position taken as the origin corresponds to the position estimated by the vehicle position estimating device 6, and is, for example, the center-of-gravity position, center position, or head position of the own vehicle. As shown in FIG. 2, the detectable range of the optical ranging sensor device 4 can be defined as the space bounded by a detectable distance L1, a detectable angle Θ1 on the XY plane, and a detectable angle Θ2 on the ZY plane. For a typical optical ranging sensor the detectable distance can be considered the same on the XY plane and the ZX plane, but different detectable distances may be set for the two planes.
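
The geometry of this detectable range can be made concrete with a short sketch. The following C++ snippet (illustrative only; the struct, field names, and the symmetric field-of-view assumption are not from the patent) tests whether a point at vehicle-relative coordinates (x, y, z) falls within the space bounded by L1, Θ1, and Θ2:

```cpp
#include <cmath>

// Detectable range of the optical ranging sensor device (cf. FIG. 2).
// Names and the exact form of the test are assumptions for illustration.
struct DetectableRange {
    double maxDistanceL1;  // L1 [m]
    double horizAngleT1;   // Θ1 [rad], total angular opening on the XY plane
    double vertAngleT2;    // Θ2 [rad], total angular opening out of the XY plane
};

// Point in the vehicle frame of FIG. 2: X forward (heading), Y lateral, Z up.
bool isDetectable(const DetectableRange& r, double x, double y, double z) {
    const double dist = std::sqrt(x * x + y * y + z * z);
    if (dist > r.maxDistanceL1) return false;
    const double azimuth   = std::atan2(y, x);                        // on the XY plane
    const double elevation = std::atan2(z, std::sqrt(x * x + y * y)); // out of the XY plane
    return std::fabs(azimuth)   <= r.horizAngleT1 / 2.0 &&
           std::fabs(elevation) <= r.vertAngleT2  / 2.0;
}
```

A test of this kind is what step S101 of the flowchart described later implies when it restricts the map query to road features inside the detectable range.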

Returning to FIG. 1, the high-precision map database 5 is a database in which high-precision map data is stored. The high-precision map contains detailed information on road features, such as information indicating the lane shape of the road (the position, shape, and type of lane markings), the position of stop lines, and the position, shape, type, and orientation of traffic signs and traffic signals installed around the road.

The high-accuracy map database 5 need not be mounted on the own vehicle; it may be, for example, a server that distributes high-accuracy map data to the vehicle position estimating device 6 by communication. Although the high-accuracy map database 5 is depicted in FIG. 1 as a block separate from the vehicle position estimating device 6, it may instead be provided inside the vehicle position estimating device 6. The high-accuracy map database 5 and the vehicle position estimating device 6 may also be provided in a navigation system of the own vehicle.

As shown in FIG. 1, the vehicle position estimating device 6 includes a vehicle position estimating unit 61 and a road feature position/azimuth detecting unit 62.

The vehicle position estimating unit 61 estimates the position and azimuth of the own vehicle on the high-accuracy map based on the absolute positioning information (the absolute position and absolute azimuth of the own vehicle) output by the satellite positioning device 1, the vehicle sensor information (one or more of the traveling speed, rotation angle, steering angle, and air pressure of the own vehicle) output by the vehicle sensor information output device 2, and the high-accuracy map data stored in the high-accuracy map database 5, and outputs the estimation result as “first estimated own-vehicle position information”. The absolute positioning information and the vehicle sensor information are input to the vehicle position estimating unit 61 at regular intervals, and the vehicle position estimating unit 61 outputs the first estimated own-vehicle position information at regular intervals.
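
The patent does not specify how the absolute positioning information, the vehicle sensor information, and the map are combined inside the vehicle position estimating unit 61. Purely as an illustrative sketch, one minimal scheme is dead reckoning from speed and yaw rate between satellite fixes, nudged toward each new absolute fix; all names and the blending rule below are assumptions, not the patent's method:

```cpp
#include <cmath>

// A first estimated own-vehicle pose on the map: position, azimuth, timestamp.
struct Pose2D { double x, y, heading, time; };

// Dead reckoning between periodic inputs: advance the pose using the travel
// speed [m/s] and rotation rate [rad/s] from the vehicle sensor information.
Pose2D propagate(const Pose2D& prev, double speed, double yawRate, double dt) {
    Pose2D next = prev;
    next.heading += yawRate * dt;
    next.x += speed * dt * std::cos(next.heading);
    next.y += speed * dt * std::sin(next.heading);
    next.time += dt;
    return next;
}

// Naive blend toward a new absolute fix from satellite positioning. A real
// implementation would weight by the error statistics of both sources.
Pose2D fuseAbsoluteFix(const Pose2D& deadReckoned, const Pose2D& fix, double gain) {
    Pose2D out = deadReckoned;
    out.x += gain * (fix.x - deadReckoned.x);
    out.y += gain * (fix.y - deadReckoned.y);
    out.heading += gain * (fix.heading - deadReckoned.heading);
    return out;
}
```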

The road feature position/azimuth detecting unit 62 selects a road feature to be detected from the optical ranging information output by the optical ranging sensor device 4, based on the first estimated own-vehicle position information output by the vehicle position estimating unit 61 (the position and azimuth of the own vehicle on the high-accuracy map estimated by the vehicle position estimating unit 61) and at least one of the high-accuracy map stored in the high-accuracy map database 5 and the imaging sensor information output by the imaging sensor device 3 (the position, shape, type, and the like of road features detected by the imaging sensor device 3). The road feature position/azimuth detecting unit 62 also detects the relative position and relative azimuth of the selected road feature from the own vehicle, using the optical ranging information output by the optical ranging sensor device 4. Further, the road feature position/azimuth detecting unit 62 estimates the position and azimuth of the own vehicle on the high-precision map based on the detected relative position and relative azimuth of the road feature and the high-precision map data. The estimated position and azimuth are output from the road feature position/azimuth detecting unit 62 as “second estimated own-vehicle position information”.

Here, as the road feature to be detected, the road feature position/azimuth detecting unit 62 selects one whose relative position and relative azimuth from the own vehicle can be detected at higher speed and with higher accuracy, that is, one for which the optical ranging information is denser and more accurate. Specifically, a road feature that is close to the own vehicle (short detection distance) and large as seen from the own vehicle (large detectable area) is selected as the detection target. For example, if the road feature detection processing based on the first estimated own-vehicle position information and the high-precision map or the imaging sensor information finds a cylindrical pole with a radius of 10 cm and a disk-shaped road sign with a radius of 30 cm at the same distance from the own vehicle, the road feature position/azimuth detecting unit 62 selects the road sign of the two as the detection target.

Since the size of a road feature as seen from the own vehicle (its detectable area) changes with the relationship between the direction of the own vehicle (the detection direction of the optical ranging sensor device 4) and the orientation of the road feature's surface, the road feature position/azimuth detecting unit 62 may calculate the size of the road feature based on the positional relationship between the own vehicle and the road feature. For example, as shown in FIG. 3, when the angle (detection angle) between the surface of a rectangular road feature having area S and the detection direction of the optical ranging sensor device 4 is Θa, the detectable area of the road feature is S·sinΘa.
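
As a numeric illustration of these two selection criteria (short detection distance, large detectable area), the sketch below scores candidate features; the patent states the criteria but no formula, so the score function and all names are assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Candidate {
    double faceAreaS;       // S: area of the feature's face [m^2], from the map
    double detectionAngle;  // Θa: angle between the face and the detection direction [rad]
    double distance;        // detection distance from the own vehicle [m]
};

// Detectable area as derived in FIG. 3: S * sin(Θa).
double detectableArea(const Candidate& c) {
    return c.faceAreaS * std::sin(c.detectionAngle);
}

// Illustrative score favoring near features with a large apparent area.
double score(const Candidate& c) {
    return detectableArea(c) / std::max(c.distance, 1e-6);
}

// Pick the candidate expected to yield the densest, most accurate point cloud.
const Candidate* selectTarget(const std::vector<Candidate>& cs) {
    auto it = std::max_element(cs.begin(), cs.end(),
        [](const Candidate& a, const Candidate& b) { return score(a) < score(b); });
    return it == cs.end() ? nullptr : &*it;
}
```

With the 10 cm pole and the 30 cm disk-shaped sign of the earlier example at the same distance, the sign's much larger detectable area gives it the higher score, matching the selection described above.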

When detecting the relative position and relative azimuth of the road feature selected as the detection target, the road feature position/azimuth detecting unit 62 performs the detection processing only on the portion of the optical ranging information (point cloud information) output by the optical ranging sensor device 4 that lies within the range where the road feature is estimated to be located. The position of the road feature is already known at the stage of selecting the detection target. Limiting the range of optical ranging information subjected to the road feature detection processing in this way reduces the computational load of the detection processing and allows the position and azimuth of the road feature to be detected at high speed.

The range of optical ranging information subjected to the road feature detection processing is the range corresponding to the estimated position and shape of the road feature, expanded at its outer edge by a region whose width corresponds to a predetermined margin. For example, when the road feature selected as the detection target is a rectangular road sign as shown in FIG. 4, the range of optical ranging information subjected to the detection processing is defined as the circle of radius R enclosing the shape of the road sign, expanded at its outer edge by a region of width m, that is, the circle of radius R + m shown in FIG. 4.

The margin m is preferably a value equal to or larger than the estimation error e of the first estimated own-vehicle position information. The reason is that the range where the road feature is estimated to be located is obtained as the position (relative position) of the road feature referenced to the own-vehicle position on the high-accuracy map indicated by the first estimated own-vehicle position information; any error in the first estimated own-vehicle position information therefore appears equally in the estimated position of the road feature. The information on the estimation error e may be included in the first estimated own-vehicle position information, or may be a value obtained in advance from experiments or the like.
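
A minimal sketch of this range limitation (cf. FIG. 4): keep only the point cloud inside the circle of radius R + m around the estimated feature position, with the margin never smaller than the estimation error e, as recommended above. The container and names are illustrative assumptions:

```cpp
#include <algorithm>
#include <vector>

struct Point3 { double x, y, z; };  // one optical-ranging return, vehicle frame

// Returns the subset of the point cloud inside radius R + m of the estimated
// feature position; only this subset is handed to the shape detection step.
std::vector<Point3> cropToFeature(const std::vector<Point3>& cloud,
                                  const Point3& estimatedCenter,
                                  double enclosingRadiusR,   // R: circle enclosing the feature
                                  double marginM,            // m: preset margin
                                  double estimationErrorE) { // e: first-estimate error
    const double radius = enclosingRadiusR + std::max(marginM, estimationErrorE); // m >= e
    const double r2 = radius * radius;
    std::vector<Point3> kept;
    for (const Point3& p : cloud) {
        const double dx = p.x - estimatedCenter.x;
        const double dy = p.y - estimatedCenter.y;
        const double dz = p.z - estimatedCenter.z;
        if (dx * dx + dy * dy + dz * dz <= r2) kept.push_back(p);
    }
    return kept;
}
```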

For the road feature detection processing using the optical ranging information, an optimal detection model is applied according to the shape information (circle, rectangle, cylinder, etc.) included in the road feature information of the high-precision map. For example, when the shape information of a road feature indicates a cylinder, the cylinder may be detected using the Cylinder model of PCL (Point Cloud Library). The detection processing can then be specialized for extracting a specific shape, which reduces the processing load and allows the position and azimuth of the road feature to be detected at high speed.
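
Since the text names PCL's Cylinder model, the following is a sketch of the standard PCL cylinder-segmentation pipeline (normal estimation, then RANSAC fitting with SACMODEL_CYLINDER). The pipeline structure follows PCL's documented usage, but every parameter value here is a placeholder, not a value from the patent:

```cpp
#include <pcl/ModelCoefficients.h>
#include <pcl/features/normal_3d.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/sac_segmentation.h>

// Fit a cylinder to the cropped point cloud around the selected road feature.
pcl::ModelCoefficients fitCylinder(const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud) {
    // The cylinder model requires per-point surface normals.
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>());
    pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>());
    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setSearchMethod(tree);
    ne.setInputCloud(cloud);
    ne.setKSearch(50);                   // placeholder neighborhood size
    ne.compute(*normals);

    // RANSAC segmentation with the cylinder model.
    pcl::SACSegmentationFromNormals<pcl::PointXYZ, pcl::Normal> seg;
    seg.setOptimizeCoefficients(true);
    seg.setModelType(pcl::SACMODEL_CYLINDER);
    seg.setMethodType(pcl::SAC_RANSAC);
    seg.setNormalDistanceWeight(0.1);    // placeholder
    seg.setMaxIterations(1000);          // placeholder
    seg.setDistanceThreshold(0.05);      // placeholder, [m]
    seg.setRadiusLimits(0.05, 0.15);     // placeholder, e.g. a pole of ~10 cm radius
    seg.setInputCloud(cloud);
    seg.setInputNormals(normals);

    pcl::PointIndices inliers;
    pcl::ModelCoefficients coeffs;       // axis point, axis direction, radius
    seg.segment(inliers, coeffs);
    return coeffs;
}
```

The axis position returned in the coefficients gives the relative position of the cylindrical feature; choosing the model from the map's shape information is what keeps the search this narrow.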

The position of the own vehicle indicated by the second estimated own-vehicle position information calculated by the road feature position/azimuth detecting unit 62 is obtained from the position of a road feature detected from the optical ranging information, so its accuracy is high. In addition, because the road feature position/azimuth detecting unit 62 limits the range of optical ranging information subjected to the road feature detection processing when obtaining the second estimated own-vehicle position information, the second estimated own-vehicle position information can be obtained in a short time. The vehicle position estimating device 6 of the present embodiment therefore has both high-accuracy position detection and immediate responsiveness. Limiting the range of the optical ranging information subjected to the road feature detection processing also reduces the computational load of the detection processing, so the own-vehicle position can be estimated with high accuracy even with a computing device that has an inexpensive, low-spec CPU.

However, even if the range of optical ranging information subjected to the road feature detection processing is limited, the detection processing may still take a certain amount of time (200 to 300 milliseconds in some cases). Therefore, particularly while the own vehicle is traveling, a difference may arise between the position and azimuth of the own vehicle indicated by the second estimated own-vehicle position information and the current position and azimuth of the own vehicle.

To address this, in the present embodiment the vehicle position estimating unit 61 obtains the error of the first estimated own-vehicle position information using the second estimated own-vehicle position information, and corrects the current (latest) first estimated own-vehicle position information based on that error to obtain a more accurate position and azimuth of the own vehicle. Specifically, the vehicle position estimating unit 61 acquires from the road feature position/azimuth detecting unit 62 the second estimated own-vehicle position information together with the acquisition time of the optical ranging information used to calculate it, obtains the difference between the first estimated own-vehicle position information at that same time and the acquired second estimated own-vehicle position information, and regards the difference as the error of the first estimated own-vehicle position information. The vehicle position estimating unit 61 then adds this error to the latest first estimated own-vehicle position information to calculate a more accurate position and azimuth of the own vehicle.
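
A minimal sketch of this correction, assuming first estimates are buffered with their calculation times as the next paragraph requires; the buffer search and types are illustrative:

```cpp
#include <cmath>
#include <deque>

struct Pose2D { double x, y, heading, time; };

// First estimated poses retained for at least the detection latency
// (200-300 ms or more), each with its calculation time.
std::deque<Pose2D> firstEstimateHistory;

// Buffered first estimate closest to the acquisition time of the optical
// ranging information that produced the second estimate.
const Pose2D* firstEstimateAt(double t) {
    const Pose2D* best = nullptr;
    for (const Pose2D& p : firstEstimateHistory)
        if (!best || std::fabs(p.time - t) < std::fabs(best->time - t)) best = &p;
    return best;
}

// Treat (second estimate - first estimate at the same time) as the error of
// the first estimate and add it to the latest first estimate.
Pose2D correct(const Pose2D& second, const Pose2D& latestFirst) {
    const Pose2D* past = firstEstimateAt(second.time);
    if (!past) return latestFirst;  // no matching history: leave uncorrected
    Pose2D out = latestFirst;
    out.x += second.x - past->x;
    out.y += second.y - past->y;
    out.heading += second.heading - past->heading;
    return out;
}
```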

For the vehicle position estimation unit 61 to perform this correction, it must retain the first estimated vehicle position information, together with its calculation times, over a period reaching back a certain time (200 to 300 milliseconds or more) from the present.
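A minimal sketch of this bookkeeping is shown below, assuming poses are (x, y, heading) tuples and times are in seconds; the class and function names are hypothetical, and a production implementation would interpolate between buffered estimates rather than pick the nearest one.

```python
import bisect
import math

def wrap(angle):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

class FirstEstimateBuffer:
    """Timestamped first estimates, retained long enough (>= 0.3 s here)
    to cover the latency of the point-cloud detection processing."""

    def __init__(self, horizon_s=0.5):
        self.horizon_s = horizon_s
        self.stamps = []   # ascending times
        self.poses = []    # matching (x, y, heading) tuples

    def push(self, t, pose):
        self.stamps.append(t)
        self.poses.append(pose)
        while self.stamps and self.stamps[0] < t - self.horizon_s:
            self.stamps.pop(0)
            self.poses.pop(0)

    def at(self, t):
        """First estimate nearest to time t (assumes at least one push)."""
        i = bisect.bisect_left(self.stamps, t)
        if i == 0:
            return self.poses[0]
        if i == len(self.stamps):
            return self.poses[-1]
        if t - self.stamps[i - 1] <= self.stamps[i] - t:
            return self.poses[i - 1]
        return self.poses[i]

def corrected_pose(buffer, latest_first, second_est, t_scan):
    """Shift the latest first estimate by the error observed at scan time:
    error = second_est(t_scan) - first_est(t_scan)."""
    fx, fy, fh = buffer.at(t_scan)
    sx, sy, sh = second_est
    ex, ey, eh = sx - fx, sy - fy, wrap(sh - fh)
    lx, ly, lh = latest_first
    return (lx + ex, ly + ey, wrap(lh + eh))
```

Adding the scan-time error to the latest estimate assumes that the dead-reckoning drift accumulated over the detection latency is small, which matches the 200 to 300 millisecond figures above.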

FIG. 5 is a flowchart showing the operation of the vehicle position estimation device 6. The processing by which the vehicle position estimation device 6 estimates the vehicle position (vehicle position estimation processing) is described below with reference to FIG. 5. The flow of FIG. 5 is executed repeatedly after the vehicle position estimation device 6 starts up.

First, the vehicle position estimation unit 61 calculates the first estimated vehicle position information, which includes the position and azimuth of the vehicle on the high-accuracy map, based on the absolute positioning information output by the satellite positioning device 1 and the vehicle sensor information output by the vehicle sensor information output device 2 (step S100).

Next, the road feature position/azimuth detection unit 62 acquires, from the high-accuracy map database 5, information on those road features around the vehicle position indicated by the first estimated vehicle position information that lie within the detectable range of the optical ranging sensor device 4 (step S101).

The road feature position/azimuth detection unit 62 then selects the road feature to be detected, based on the first estimated vehicle position information calculated in step S100 and on one or both of the road feature information acquired in step S101 and the imaging sensor information output by the imaging sensor device 3 (step S102).

Subsequently, the road feature position/azimuth detection unit 62 performs detection processing on the optical ranging information (point cloud information) output by the optical ranging sensor device 4, thereby detecting the relative position and relative azimuth, with respect to the vehicle, of the road feature selected in step S102 (step S103). This detection processing is applied only to the portion of the optical ranging information within the range where the road feature is estimated to be located.

Further, the road feature position/azimuth detection unit 62 calculates the second estimated vehicle position information, which includes the position and azimuth of the vehicle on the high-accuracy map, based on the road feature information acquired in step S101 and the relative position and relative azimuth of the road feature detected in step S103 (step S104).
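Step S104 amounts to composing two rigid-body transforms in the plane: if T_map_feature is the feature pose stored in the map and T_vehicle_feature is the relative pose measured in step S103, then T_map_vehicle = T_map_feature * inv(T_vehicle_feature). A minimal sketch, assuming poses are (x, y, heading) triples and ignoring the vertical axis; the function names are illustrative, not taken from the embodiment.

```python
import math

def wrap(angle):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def second_estimate(feature_map_pose, feature_rel_pose):
    """Vehicle pose on the map from a feature's map pose and its measured
    pose relative to the vehicle (SE(2) composition).

    feature_map_pose: (x, y, heading) of the road feature, map frame.
    feature_rel_pose: (x, y, heading) of the same feature, vehicle frame.
    Returns (x, y, heading) of the vehicle in the map frame.
    """
    mx, my, mh = feature_map_pose
    rx, ry, rh = feature_rel_pose
    vh = wrap(mh - rh)                       # vehicle heading on the map
    vx = mx - (math.cos(vh) * rx - math.sin(vh) * ry)
    vy = my - (math.sin(vh) * rx + math.cos(vh) * ry)
    return (vx, vy, vh)

# Worked example: a feature at map pose (10, 5, 0) seen 4 m ahead and 1 m
# to the left of the vehicle, with no relative rotation, places the vehicle
# at map pose (6, 4, 0).
assert second_estimate((10.0, 5.0, 0.0), (4.0, 1.0, 0.0)) == (6.0, 4.0, 0.0)
```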

Finally, the vehicle position estimation unit 61 obtains, as the error of the first estimated vehicle position information, the difference between the second estimated vehicle position information calculated in step S104 and the first estimated vehicle position information corresponding to the acquisition time of the optical ranging information used in that calculation, and corrects the latest first estimated vehicle position information by adding the error to it (step S105). The corrected first estimated vehicle position information yields the current position and azimuth of the vehicle.
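Putting steps S100 through S105 together, one iteration of the FIG. 5 flow could be organized as below. This sketch reuses the helpers from the preceding examples (crop_point_cloud, FirstEstimateBuffer, corrected_pose, second_estimate); the sensors and map_db interfaces and the select_target and detect_in_cloud helpers are hypothetical placeholders for the devices 1 to 5 and the detection processing, not APIs defined by the embodiment.

```python
def estimate_position_once(sensors, map_db, buffer):
    """One pass of the FIG. 5 flow, steps S100 to S105 (sketch)."""
    # S100: fuse satellite positioning with vehicle sensor dead reckoning.
    t_now, first_est = sensors.fused_first_estimate()
    buffer.push(t_now, first_est)

    # S101: road features within the optical ranging sensor's range.
    features = map_db.features_near(first_est, sensors.ranging_radius())

    # S102: pick the detection target from map and/or camera information.
    target = select_target(first_est, features, sensors.camera_frame())

    # S103: detect the target only inside its predicted region of the scan.
    t_scan, cloud = sensors.ranging_scan()
    region = crop_point_cloud(cloud, target.predicted_center,
                              target.search_half_extent)
    rel_pose = detect_in_cloud(region, target)

    # S104: second estimate from the feature's pose on the map.
    second_est = second_estimate(target.map_pose, rel_pose)

    # S105: time-aligned correction of the latest first estimate.
    return corrected_pose(buffer, first_est, second_est, t_scan)
```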

FIGS. 6 and 7 are diagrams each showing an example of the hardware configuration of the vehicle position estimation device 6. The functions of the components of the vehicle position estimation device 6 shown in FIG. 1 are realized by, for example, the processing circuit 70 shown in FIG. 6. That is, the vehicle position estimation device 6 includes the processing circuit 70 for: calculating first estimated vehicle position information indicating the position and azimuth of the vehicle on a map, based on absolute positioning information of the vehicle obtained by satellite positioning and vehicle sensor information obtained from vehicle sensors of the vehicle; selecting a road feature to be detected, based on the first estimated vehicle position information and on one or both of road feature information included in the map data and imaging sensor information obtained from an imaging sensor of the vehicle; detecting the relative position and relative azimuth of the selected road feature with respect to the vehicle from optical ranging information obtained from an optical ranging sensor of the vehicle; and calculating second estimated vehicle position information indicating the position and azimuth of the vehicle on the map, based on the relative position and relative azimuth of the road feature detected from the optical ranging information. The processing circuit 70 may be dedicated hardware, or may be configured using a processor that executes a program stored in a memory (a processor is also called a central processing unit (CPU), a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor)).

When the processing circuit 70 is dedicated hardware, the processing circuit 70 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof. The functions of the components of the vehicle position estimation device 6 may each be realized by an individual processing circuit, or they may be realized collectively by a single processing circuit.

FIG. 7 shows an example of the hardware configuration of the vehicle position estimation device 6 in the case where the processing circuit 70 is configured using a processor 71 that executes a program. In this case, the functions of the components of the vehicle position estimation device 6 are realized by software or the like (software, firmware, or a combination of software and firmware). The software or the like is written as a program and stored in the memory 72, and the processor 71 realizes the function of each unit by reading and executing the program stored in the memory 72. That is, the vehicle position estimation device 6 includes the memory 72 for storing a program that, when executed by the processor 71, results in the execution of: processing of calculating first estimated vehicle position information indicating the position and azimuth of the vehicle on a map, based on absolute positioning information of the vehicle obtained by satellite positioning and vehicle sensor information obtained from vehicle sensors of the vehicle; processing of selecting a road feature to be detected, based on the first estimated vehicle position information and on one or both of road feature information included in the map data and imaging sensor information obtained from an imaging sensor of the vehicle; processing of detecting the relative position and relative azimuth of the selected road feature with respect to the vehicle from optical ranging information obtained from an optical ranging sensor of the vehicle; and processing of calculating second estimated vehicle position information indicating the position and azimuth of the vehicle on the map, based on the relative position and relative azimuth of the road feature detected from the optical ranging information. In other words, this program can be said to cause a computer to execute the procedures and methods of operation of the components of the vehicle position estimation device 6.

Here, the memory 72 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a MiniDisc, a DVD (Digital Versatile Disc), or a drive device therefor; or any storage medium to be used in the future.

The foregoing describes configurations in which the functions of the components of the vehicle position estimation device 6 are realized by either hardware or software or the like. However, the configuration is not limited to this; some components of the vehicle position estimation device 6 may be realized by dedicated hardware while other components are realized by software or the like. For example, the functions of some components can be realized by the processing circuit 70 as dedicated hardware, while the functions of other components can be realized by the processing circuit 70 as the processor 71 reading and executing a program stored in the memory 72.

As described above, the vehicle position estimation device 6 can realize each of the functions described above by hardware, software or the like, or a combination thereof.

Within the scope of the present invention, the embodiments may be modified or omitted as appropriate.

Although the present invention has been described in detail, the above description is illustrative in all aspects, and the present invention is not limited thereto. It is understood that innumerable modifications not illustrated here can be envisaged without departing from the scope of the present invention.

1 satellite positioning device, 2 vehicle sensor information output device, 3 imaging sensor device, 4 optical ranging sensor device, 5 high-accuracy map database, 6 vehicle position estimation device, 61 vehicle position estimation unit, 62 road feature position/azimuth detection unit, 70 processing circuit, 71 processor, 72 memory.

Claims (4)

1. A vehicle position estimation device comprising:
a vehicle position estimation unit that calculates first estimated vehicle position information indicating a position and an azimuth of a vehicle on a map, based on absolute positioning information of the vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the vehicle; and
a road feature position/azimuth detection unit that selects a road feature to be detected, based on the first estimated vehicle position information and on one or both of road feature information included in data of the map and imaging sensor information obtained from an imaging sensor of the vehicle, detects a relative position and a relative azimuth of the selected road feature with respect to the vehicle from optical ranging information obtained from an optical ranging sensor of the vehicle, and calculates second estimated vehicle position information indicating the position and the azimuth of the vehicle on the map, based on the relative position and the relative azimuth of the road feature detected from the optical ranging information.

2. The vehicle position estimation device according to claim 1, wherein the road feature position/azimuth detection unit restricts the range of the optical ranging information subjected to detection processing of the selected road feature to a range where the road feature is estimated to be located, based on the first estimated vehicle position information and on one or both of the road feature information and the imaging sensor information.

3. The vehicle position estimation device according to claim 1, wherein the vehicle position estimation unit corrects the latest first estimated vehicle position information based on a difference between the second estimated vehicle position information and the first estimated vehicle position information corresponding to an acquisition time of the optical ranging information used to calculate the second estimated vehicle position information.

4. A vehicle position estimation method comprising:
calculating, by a vehicle position estimation unit of a vehicle position estimation device, first estimated vehicle position information indicating a position and an azimuth of a vehicle on a map, based on absolute positioning information of the vehicle obtained by satellite positioning and vehicle sensor information obtained from a vehicle sensor of the vehicle;
selecting, by a road feature position/azimuth detection unit of the vehicle position estimation device, a road feature to be detected, based on the first estimated vehicle position information and on one or both of road feature information included in data of the map and imaging sensor information obtained from an imaging sensor of the vehicle;
detecting, by the road feature position/azimuth detection unit, a relative position and a relative azimuth of the selected road feature with respect to the vehicle from optical ranging information obtained from an optical ranging sensor of the vehicle; and
calculating, by the road feature position/azimuth detection unit, second estimated vehicle position information indicating the position and the azimuth of the vehicle on the map, based on the relative position and the relative azimuth of the road feature detected from the optical ranging information.
PCT/JP2018/027495 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method Ceased WO2020021596A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/027495 WO2020021596A1 (en) 2018-07-23 2018-07-23 Vehicle position estimation device and vehicle position estimation method


Publications (1)

Publication Number Publication Date
WO2020021596A1 (en)

Family

ID=69181450




Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008129867A (en) * 2006-11-21 2008-06-05 Toyota Motor Corp Driving assistance device
JP2016176769A (en) * 2015-03-19 2016-10-06 クラリオン株式会社 Information processing device and vehicle position detection method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115132061A (en) * 2021-03-25 2022-09-30 本田技研工业株式会社 Map generation device, map generation system, map generation method, and storage medium


Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18927800; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 18927800; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)