
EP3980724A1 - Method for creating a universally usable feature map - Google Patents

Method for creating a universally usable feature map

Info

Publication number
EP3980724A1
Authority
EP
European Patent Office
Prior art keywords
measurement data
map
features
received
stored
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP20724501.0A
Other languages
German (de)
English (en)
Inventor
Sebastian Scherer
Peter Biber
Hanno Homann
Marco Lampacrescia
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3980724A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3602Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3807Creation or updating of map data characterised by the type of data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • G01C21/3867Geometry of map features, e.g. shape points, polygons or for simplified maps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • G01C21/387Organisation of map data, e.g. version management or database structures
    • G01C21/3878Hierarchical structures, e.g. layering
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • the invention relates to a method for creating digital maps and a method for performing localization.
  • the invention also relates to a control device, a computer program and a machine-readable storage medium.
  • For automated vehicles and robots, localization is an essential part of their function.
  • The exact position of the vehicle or the robot within a map or an environment can be determined by the localization. Based on the determined position, control commands can be generated.
  • SLAM method is used for simultaneous localization and mapping, especially in applications without access to GNSS data.
  • measurement data from, for example, LIDAR sensors are collected and evaluated to generate a map.
  • a position within the map can be determined.
  • A problem with the SLAM method is its use in dynamic or semi-static environments, since such an environment can change over time.
  • The regularly updated map must be made available to all users, which requires an infrastructure for providing high data volumes.
  • the object on which the invention is based can be seen in proposing a method for creating a universally usable digital map with reduced data consumption.
  • a method for creating digital maps by a control device is provided.
  • measurement data of an environment are received during a measurement drive.
  • the measurement drive can be any drive.
  • Measurement data can preferably also be collected by at least one sensor while standing or parking.
  • the corresponding measurement data can then be received and processed by the control unit.
  • a SLAM method is carried out to determine a trajectory of the measurement drive.
  • self-localization is carried out based on a series of measurement data, with the respective positions forming a trajectory during the measurement run.
  • The received measurement data are transformed into a coordinate system of the trajectory.
  • The received measurement data can, for example, contain positions and/or distances relative to a sensor. These relative coordinates can then be transformed into an absolute coordinate system of the trajectory.
  • Such a coordinate system can, for example, be a Cartesian coordinate system.
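As a minimal sketch of this transformation step, assuming 2D poses (x, y, heading angle): the helper below, `to_map_frame`, is a hypothetical illustration and not part of the patent.

```python
import math

def to_map_frame(points_rel, pose):
    """Transform sensor-relative 2D points into the absolute
    coordinate system of the trajectory.

    points_rel: list of (x, y) offsets relative to the sensor.
    pose: (x, y, theta) pose of the sensor on the trajectory.
    """
    px, py, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    # rotate each point by the sensor heading, then translate by the pose
    return [(px + c * x - s * y, py + s * x + c * y) for x, y in points_rel]

# A point 2 m straight ahead of a sensor at (10, 5) heading +90 degrees
pts = to_map_frame([(2.0, 0.0)], (10.0, 5.0, math.pi / 2))
```

With the sensor facing the +y direction, the point lands at roughly (10, 7) in the trajectory frame.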
  • The transformed measurement data are used to create an intensity map. For example, an intensity of reflected beams from one or more LIDAR sensors or radar sensors can be determined and stored in the form of a map of received radiation intensity.
  • the features are then extracted from the intensity map and stored in a feature map.
  • The features can preferably be detected in the intensity map. This process can take place, for example, by an algorithm for pattern recognition.
  • the pattern recognition can also be carried out by a neural network which has been appropriately trained beforehand.
  • the pattern recognition can for example be carried out manually by an operator or in an automated manner.
  • An automated pattern recognition can be released or confirmed by an operator.
  • The feature map can preferably be used universally.
  • The feature map can be used independently of sensors or across sensors, so that features can be extracted from differently determined measurement data and used for localization using the feature map.
  • According to a further aspect, a method for performing a localization by a control device is provided.
  • measurement data of an environment and a feature map are received.
  • the measurement data can be determined by one or more sensors.
  • a sensor can be, for example, a camera sensor, a LIDAR sensor, a radar sensor, an ultrasonic sensor and the like.
  • the sensor can differ from a sensor that was used to create the feature map.
  • Features in the received measurement data are recognized and extracted. At least one extracted feature is then compared with the features stored in the feature map in order to determine a position. If at least one feature has been successfully matched, the position of the sensor, or of a vehicle that carries out the measurement with the aid of the sensor, can be determined.
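As an illustration of this matching step, the sketch below pairs each observed feature with its nearest map feature and returns the mean offset as a position correction. The function name, the nearest-neighbor pairing, and the pure-translation model are all simplifying assumptions for illustration, not the patent's method.

```python
import math

def match_and_localize(observed, feature_map, max_dist=2.0):
    """Match features observed in the vehicle frame (already roughly
    positioned, e.g. via odometry) against map features and return the
    mean (dx, dy) offset as a position correction."""
    dx_sum = dy_sum = 0.0
    n = 0
    for ox, oy in observed:
        best, best_d = None, max_dist
        for mx, my in feature_map:
            d = math.hypot(mx - ox, my - oy)
            if d < best_d:
                best, best_d = (mx, my), d
        if best is not None:          # only count successful matches
            dx_sum += best[0] - ox
            dy_sum += best[1] - oy
            n += 1
    if n == 0:
        return None                   # no feature matched
    return dx_sum / n, dy_sum / n

# Map features vs. observations shifted by (-0.5, -0.2)
correction = match_and_localize([(0.5, 0.8), (4.5, 0.8)],
                                [(1.0, 1.0), (5.0, 1.0)])
```

Here both observations match their map counterparts, so the returned correction is the common shift (0.5, 0.2).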
  • a control device is provided, the control device being set up to carry out the method.
  • The control device can be a vehicle-internal control device, which is integrated in a vehicle controller for performing automated driving functions or can be connected to the vehicle controller.
  • the control device can be configured as a control device external to the vehicle, such as a server unit or cloud technology.
  • Furthermore, a computer program is provided which comprises instructions that, when the computer program is executed by a computer or a control device, cause the latter to carry out the method according to the invention.
  • a machine-readable storage medium is provided on which the computer program according to the invention is stored.
  • the control unit can be installed in a vehicle.
  • at least one measurement run can take place in a vehicle with the control device.
  • the vehicle can be assisted, partially automated, highly automated and / or fully automated or driverless in accordance with the BASt standard.
  • the vehicle can be a robot, a drone, a watercraft and the like.
  • the method can be used on roads, such as, for example, motorways, country roads, urban areas, as well as away from roads or in off-road areas.
  • the method can be used in buildings or halls, in underground rooms, multi-storey car parks and parking garages, tunnels and the like.
  • the at least one sensor for determining measurement data can be part of an environment sensor system or at least one sensor of the vehicle.
  • The at least one sensor can be a LIDAR sensor, a radar sensor, an ultrasonic sensor, a camera sensor, an odometer, and the like.
  • The sensors can be used alone or in combination with one another.
  • Sensors such as ultrasonic distance sensors, static pressure sensors, cameras, and the like can be used to perform an odometric procedure.
  • Characteristics of an environment are determined and extracted. Such features can be, for example, lane markings, geometric shapes of buildings, curbs, streets, arrangement and position of traffic lights, delineator posts, lane boundaries, buildings, containers and the like. Such features can be detected by different sensors and used for localization. For example, features extracted from measurement data of a LIDAR sensor can also be recorded by camera sensors and compared with one another for the purpose of localization. A universally applicable feature map can thus be created which can be used by different vehicles and machines. For example, such a feature map can be used by people carriers, transport units, manipulators and the like for precise localization.
  • The feature map can preferably be created in a first step and then used for localization tasks.
  • The feature map can be used for localization and control tasks of automatically operated vehicles or robots.
  • The features of the feature map can be in the form of geometric figures, lines or points; the features can be stored with a minimal data size in the form of coordinates or vectors. As a result, the required data volume can be reduced when the feature map is provided to vehicles or robots.
  • the feature map is stored as the digital map or as a map layer of the digital map. This allows the feature map to be used particularly flexibly.
  • an existing card can be upgraded by the feature card or designed as a digital card with minimal storage requirements.
  • the received measurement data is available as a point cloud and is assigned to a grid made up of a large number of cells.
  • mean values of the measurement data of each cell are formed to create the intensity map.
  • the cells of the digital map can be pixels, pixel groups or polygons, for example.
  • the measured values can in particular be formed by reflected or backscattered and subsequently detected beams of a radar sensor and / or a LIDAR sensor.
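A minimal sketch of this cell-averaging step, assuming 2D points that each carry a scalar intensity and a square grid; the cell size and the data layout are illustrative assumptions, not values from the patent.

```python
from collections import defaultdict

def intensity_map(points, cell_size=0.5):
    """Assign (x, y, intensity) returns to a grid of square cells and
    average the intensities per cell."""
    sums = defaultdict(lambda: [0.0, 0])       # cell -> [sum, count]
    for x, y, intensity in points:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell][0] += intensity
        sums[cell][1] += 1
    # mean intensity per occupied cell
    return {cell: total / n for cell, (total, n) in sums.items()}

scan = [(0.1, 0.2, 0.8), (0.3, 0.1, 0.6), (1.2, 0.0, 0.4)]
grid = intensity_map(scan)
```

The first two returns fall into cell (0, 0) and are averaged to 0.7; the third forms its own cell with intensity 0.4.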
  • a height map is created from the received measurement data, a weighted mean value being formed from the measurement data of each cell and the neighboring cells to create the height map. In this way, additional information can be extracted from the determined measurement data and used when creating the feature map.
  • information is received from the created height map in order to determine a height of the extracted features and is stored in the feature map.
  • the height map can be overlaid with the feature map and the corresponding attributes or information from the height map can be transferred to the feature map.
  • The height or gradient of the intensities at the positions of the features can be adopted for the respective features. This process can preferably be carried out in an automated manner, each cell of the height map being compared with each cell of the feature map.
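The weighted neighborhood averaging described above might be sketched as follows. The 3×3 neighborhood, the weight values, and the one-height-sample-per-cell simplification are assumptions for illustration only.

```python
def height_map(cells, weights=(1.0, 0.5)):
    """For each grid cell, form a weighted mean of its own height value
    and those of the eight neighboring cells.

    cells: dict mapping (cx, cy) -> height sample.
    weights: (weight of the cell itself, weight of each neighbor).
    """
    out = {}
    w_self, w_nb = weights
    for (cx, cy) in cells:
        total, wsum = 0.0, 0.0
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nb = cells.get((cx + dx, cy + dy))
                if nb is None:
                    continue              # empty neighbor cells are skipped
                w = w_self if (dx, dy) == (0, 0) else w_nb
                total += w * nb
                wsum += w
        out[(cx, cy)] = total / wsum
    return out

heights = height_map({(0, 0): 2.0, (1, 0): 4.0})
```

Each cell's smoothed height blends its own sample (full weight) with adjacent samples (half weight), which suppresses single-cell outliers.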
  • the extracted features are stored in the feature map as universally ascertainable features.
  • the features are extracted and stored as geometric shapes, lines, points and / or point clouds and the like. Objects, markings and characteristic or distinctive shapes can thus be extracted from the measurement data of the environment and used to carry out localizations. In particular, this allows a large number of static features to be determined even in dynamic environments and used for the precise determination of a position.
  • the received measurement data are designed as position data and are stored in a position diagram.
  • When features are successfully matched, the determined position is preferably stored as a new measured value in the position diagram.
  • The feature map can be used to determine a position. For example, the current position is determined along a route at defined time intervals and stored in a position diagram or a position map.
  • A covered distance or trajectory can be displayed using the position diagram. If a feature is found in the feature map, the vehicle or the sensor that determined the measurement data can be assigned a position within the feature map. This position is then stored as a separate measurement in the position diagram.
  • the measurement data are determined by at least one sensor which differs from at least one sensor for creating the feature map.
  • the extracted features can preferably be present in an abstracted form and can thus be universally readable or comparable.
  • Such a form of the features can be present, for example, as coordinates in text form.
  • the features within the coordinates can have a starting point, an end point, intermediate points, directions, lengths, heights and the like. This information can be stored with a particularly low memory requirement and used to carry out comparisons.
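To illustrate how such features could be stored compactly as coordinates in text form, here is a hypothetical encoding; the record layout (`kind|points|h=…`) is invented for illustration and is not specified by the patent.

```python
def encode_feature(kind, points, height=None):
    """Serialize a map feature (e.g. a lane-marking line) as a compact,
    sensor-independent text record."""
    coords = ";".join(f"{x:.2f},{y:.2f}" for x, y in points)
    rec = f"{kind}|{coords}"
    if height is not None:
        rec += f"|h={height:.2f}"        # optional height attribute
    return rec

def decode_feature(rec):
    """Parse a record produced by encode_feature back into its parts."""
    parts = rec.split("|")
    kind = parts[0]
    points = [tuple(map(float, p.split(","))) for p in parts[1].split(";")]
    height = float(parts[2][2:]) if len(parts) > 2 else None
    return kind, points, height

rec = encode_feature("line", [(1.0, 2.0), (3.5, 2.0)], height=0.1)
```

A two-point line with a height attribute fits in a few dozen bytes, which is the kind of footprint the text makes possible when distributing the feature map.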
  • FIG. 1 shows a schematic representation of an arrangement for illustrating a method according to the invention
  • FIG. 2 shows a schematic diagram to illustrate the method for creating digital maps according to an exemplary embodiment
  • FIG. 3 shows a schematic diagram to illustrate the method for performing a localization according to an embodiment,
  • FIG. 4 shows a schematic intensity map,
  • FIG. 5 shows a schematic height map, and
  • FIG. 6 shows a perspective illustration of a feature map.
  • FIG. 1 shows a schematic representation of an arrangement 1 for illustrating the methods according to the invention.
  • The arrangement 1 has two vehicles 6, 8. Alternatively or additionally, the arrangement 1 can have robots and/or further vehicles. According to the exemplary embodiment shown, a first vehicle 6 is used to carry out the method 2 for creating digital maps, in particular feature maps.
  • the second vehicle 8 is illustrated schematically in order to illustrate a method 4 for performing a localization within the digital map.
  • the first vehicle 6 has a control device 10, which is connected to a machine-readable memory 12 and a sensor 14 for data transmission purposes.
  • the sensor 14 can be a LIDAR sensor 14, for example.
  • the first vehicle 6 can use the LIDAR sensor 14 to scan an environment U and generate measurement data.
  • the measurement data determined can then be received by the control device 10 and evaluated.
  • A feature map created by the control device 10 can be stored in the machine-readable storage medium 12 and made available via a communication link 16.
  • the second vehicle 8 also has a control device 11.
  • The control device 11 is connected in a data-transmitting manner to a machine-readable storage medium and a sensor 15.
  • the sensor 15 is a camera sensor 15 and can likewise determine measurement data of the surroundings U and transmit them to the control device 11.
  • The control device 11 can extract features from the measurement data of the environment U and compare them with features from the feature map, which are based on the measurement data of the LIDAR sensor 14.
  • FIG. 2 shows a schematic diagram to illustrate the method 2 for creating digital maps according to an exemplary embodiment.
  • In a first step 18, measurement data of the environment U are determined during a measurement drive of the first vehicle 6 and received by the control device 10.
  • the environment U is scanned with a LIDAR sensor 14.
  • a SLAM method is carried out on the basis of the measurement data received during the measurement drive.
  • a trajectory of the first vehicle 6 is determined by the SLAM method.
  • In a step 20, the received measurement data are transformed into a coordinate system of the trajectory.
  • the trajectory can be transformed into a coordinate system of the measurement data.
  • The coordinate system can be a Cartesian coordinate system.
  • An intensity map 30 is created on the basis of the transformed measurement data in a step 21.
  • Such an intensity map 30 is illustrated in FIG. 4.
  • the measurement data can be present as a grid map with a large number of cells 31, 32.
  • The cells 31, 32 can be designed, for example, as pixels or as polygons.
  • Each cell 31, 32 can have a local assignment, such as GPS coordinates, in accordance with the coordinate system.
  • An intensity is then calculated for each cell 31, 32.
  • a mean value is calculated for all measured values within the respective cell 31, 32.
  • An intensity map 30 is thus formed from the calculated mean values in step 21.
  • In a further step 22, a height map 40 is created.
  • The height map 40 is created from weighted mean values and is shown in FIG. 5.
  • The weighted mean values are calculated from the measured values within each cell 31 and the measurement data in the corresponding neighboring cells 32.
  • Features are extracted from the intensity map 30. This can be done, for example, by an automated pattern-recognition algorithm or manually by an employee. For example, transitions between light and dark areas in the intensity map 30 can be taken into account as possible patterns.
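A toy stand-in for this search for light/dark transitions on a cell grid: the threshold value and the two-neighbor comparison are arbitrary illustrative choices, not the patent's pattern-recognition algorithm.

```python
def edge_cells(grid, threshold=0.3):
    """Flag cells of an intensity grid whose value jumps relative to the
    right or upper neighbor, i.e. candidate light/dark transitions.

    grid: dict mapping (x, y) cell index -> mean intensity.
    """
    edges = set()
    for (x, y), v in grid.items():
        for nb in ((x + 1, y), (x, y + 1)):   # compare rightward and upward
            if nb in grid and abs(grid[nb] - v) > threshold:
                edges.add((x, y))
                edges.add(nb)
    return edges

grid = {(0, 0): 0.9, (1, 0): 0.9, (2, 0): 0.1}
marks = edge_cells(grid)
```

Only the bright-to-dark boundary between cells (1, 0) and (2, 0) exceeds the threshold, so exactly those two cells are flagged.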
  • a profile can be assigned to each feature using the height map 40.
  • In a step 24, the determined features are stored according to their position within the intensity map 30 in a feature map 60. The feature map 60 is illustrated schematically in FIG. 6.
  • an exemplary LIDAR scan with a large number of features 62, 64, 66 is superimposed.
  • The features 62, 64, 66 are designed, by way of example, as geometric shapes such as lines or points.
  • the feature map 60 can, for example, be stored in the machine-readable storage medium 12 and made available via the communication link 16.
  • FIG. 3 shows a schematic diagram to illustrate the method 4 for performing a localization according to an embodiment. The method 4 is carried out, for example, by the control device 11 of the second vehicle 8.
  • In a step 25, measurement data of the environment U are determined by the sensor 15 and received by the control device 11.
  • the measurement data can be determined continuously or at defined time intervals and received by the control device 11. Furthermore, odometric measurement data can be received by the control device 11.
  • features 62, 64, 66 are extracted from the received measurement data.
  • In a further step 27, the features 62, 64, 66 are compared with the received feature map 60. During the comparison, an attempt is made to find the features 62, 64, 66 detected on the vehicle side in the feature map 60.
  • The measurement data determined by odometry can be used here.
  • Since the feature map 60 has abstracted features 62, 64, 66 that can be used universally, measurement data determined by the camera sensor 15 can also be used for localization.
  • The position of the vehicle 8 can then be corrected or updated in a step 28.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Databases & Information Systems (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)
  • Instructional Devices (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Processing Or Creating Images (AREA)
  • Computer Networks & Wireless Communication (AREA)

Abstract

The invention relates to a method for creating digital maps using a control device. Measurement data of an environment are received during a measurement drive; a SLAM method is carried out on the basis of the received measurement data in order to determine a trajectory of the measurement drive; the received measurement data are transformed into a coordinate system of the trajectory; the transformed measurement data are used to create an intensity map; features are extracted from the intensity map and stored in a feature map. The invention also relates to a method for performing a localization, a control device, a computer program, and a machine-readable storage medium.
EP20724501.0A 2019-06-07 2020-05-07 Procédé de création d'une carte de caractéristiques à usage universel Ceased EP3980724A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019208384.6A DE102019208384A1 (de) 2019-06-07 2019-06-07 Verfahren zum Erstellen einer universell einsetzbaren Merkmalskarte
PCT/EP2020/062702 WO2020244881A1 (fr) 2019-06-07 2020-05-07 Procédé de création d'une carte de caractéristiques à usage universel

Publications (1)

Publication Number Publication Date
EP3980724A1 true EP3980724A1 (fr) 2022-04-13

Family

ID=70613795

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20724501.0A Ceased EP3980724A1 (fr) 2019-06-07 2020-05-07 Procédé de création d'une carte de caractéristiques à usage universel

Country Status (6)

Country Link
US (1) US20220236073A1 (fr)
EP (1) EP3980724A1 (fr)
JP (2) JP7329079B2 (fr)
CN (1) CN113994172B (fr)
DE (1) DE102019208384A1 (fr)
WO (1) WO2020244881A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7718687B2 (ja) * 2021-09-01 2025-08-05 国立大学法人金沢大学 3次元マップ生成方法及び3次元マップ生成装置
DE102023207262A1 (de) * 2023-07-28 2025-01-30 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Ermittlung einer Trajektorienverlängerung von einer Trajektorie für ein Objekt, Computerprogramm, maschinenlesbares Speichermedium, elektronische Steuereinheit sowie eine Rückfahrkameravorrichtung für ein Fahrzeug
DE102023210175A1 (de) * 2023-10-18 2025-04-24 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Unterstützung bei einer Erstellung einer Höhenkarte
DE102024201359A1 (de) * 2024-02-14 2025-08-14 BSH Hausgeräte GmbH Verfahren zum Erstellen einer Umgebungskarte

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3292679B2 (ja) * 1997-06-03 2002-06-17 三菱電機株式会社 レーダ装置
JP2005009926A (ja) * 2003-06-17 2005-01-13 Nec Corp 目標中心位置算出方法、装置及びプログラム
JP5249577B2 (ja) 2007-12-20 2013-07-31 三菱重工業株式会社 追尾システム及びその方法並びに車両
US8699755B2 (en) * 2009-02-20 2014-04-15 Navteq B.V. Determining travel path features based on retroreflectivity
US8473187B2 (en) * 2009-06-01 2013-06-25 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US20130202197A1 (en) * 2010-06-11 2013-08-08 Edmund Cochrane Reeler System and Method for Manipulating Data Having Spatial Co-ordinates
JP5500388B2 (ja) 2011-02-16 2014-05-21 アイシン・エィ・ダブリュ株式会社 撮影位置特定システム、撮影位置特定プログラム、及び撮影位置特定方法
US8798840B2 (en) 2011-09-30 2014-08-05 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US20140267282A1 (en) * 2013-03-14 2014-09-18 Robert Bosch Gmbh System And Method For Context Dependent Level Of Detail Adjustment For Navigation Maps And Systems
US9037403B2 (en) 2013-03-26 2015-05-19 Toyota Motor Engineering & Manufacturing North America, Inc. Intensity map-based localization with adaptive thresholding
DE102013208521B4 (de) * 2013-05-08 2022-10-13 Bayerische Motoren Werke Aktiengesellschaft Kollektives Erlernen eines hochgenauen Straßenmodells
GB201409625D0 (en) * 2014-05-30 2014-07-16 Isis Innovation Vehicle localisation
DE102014223363B4 (de) * 2014-11-17 2021-04-29 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zur Lokalisation eines Kraftfahrzeugs in einer ortsfesten Referenzkarte
AU2015395741B2 (en) * 2015-05-20 2019-06-27 Mitsubishi Electric Corporation Point-cloud-image generation device and display system
EP3332219B1 (fr) * 2015-08-03 2021-11-03 TomTom Global Content B.V. Procédés et systèmes de génération et d'utilisation de données de référence de localisation
CN106022259B (zh) * 2016-05-20 2019-04-12 江苏得得空间信息科技有限公司 一种基于激光点云三维特征描述模型的山区道路提取方法
US11085775B2 (en) * 2016-09-28 2021-08-10 Tomtom Global Content B.V. Methods and systems for generating and using localisation reference data
DE102016220521A1 (de) * 2016-10-19 2018-04-19 Volkswagen Aktiengesellschaft Verfahren zum Ermitteln einer Position eines Kraftfahrzeugs sowie Fahrerassistenzsystem für ein Kraftfahrzeug
KR102096875B1 (ko) * 2016-10-21 2020-05-28 네이버랩스 주식회사 자율 주행 기술을 응용한 3차원 실내 정밀 지도 자동 생성 로봇 및 로봇의 제어 방법
US10528055B2 (en) * 2016-11-03 2020-01-07 Ford Global Technologies, Llc Road sign recognition
DE102017103986A1 (de) 2017-02-27 2018-08-30 Vorwerk & Co. Interholding Gmbh Verfahren zum Betrieb eines sich selbsttätig fortbewegenden Roboters
US10430968B2 (en) * 2017-03-14 2019-10-01 Ford Global Technologies, Llc Vehicle localization using cameras
CN106896353A (zh) * 2017-03-21 2017-06-27 同济大学 一种基于三维激光雷达的无人车路口检测方法
US10565457B2 (en) * 2017-08-23 2020-02-18 Tusimple, Inc. Feature matching and correspondence refinement and 3D submap position refinement system and method for centimeter precision localization using camera-based submap and LiDAR-based global map
DE102017220242A1 (de) * 2017-11-14 2019-05-16 Robert Bosch Gmbh Verfahren und Vorrichtung zum Erstellen und Bereitstellen einer Karte
DE102017221691A1 (de) * 2017-12-01 2019-06-06 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zur Eigenlokalisierung eines Fahrzeugs
DE102017222810A1 (de) * 2017-12-14 2019-06-19 Robert Bosch Gmbh Verfahren zum Erstellen einer merkmalbasierten Lokalisierungskarte für ein Fahrzeug unter Berücksichtigung charakteristischer Strukturen von Objekten
US11237269B2 (en) * 2018-04-26 2022-02-01 Ford Global Technologies, Llc Localization technique
US11035933B2 (en) * 2018-05-04 2021-06-15 Honda Motor Co., Ltd. Transition map between lidar and high-definition map
JP7167876B2 (ja) * 2018-08-31 2022-11-09 株式会社デンソー 地図生成システム、サーバ、方法

Also Published As

Publication number Publication date
JP7329079B2 (ja) 2023-08-17
CN113994172A (zh) 2022-01-28
JP2022535568A (ja) 2022-08-09
US20220236073A1 (en) 2022-07-28
DE102019208384A1 (de) 2020-12-10
WO2020244881A1 (fr) 2020-12-10
JP2023126893A (ja) 2023-09-12
CN113994172B (zh) 2024-12-24
JP7609929B2 (ja) 2025-01-07

Similar Documents

Publication Publication Date Title
DE102015203016B4 (de) Method and device for optical self-localization of a motor vehicle in an environment
EP3980724A1 (fr) Method for creating a universally usable feature map
EP3824247A1 (fr) Method and system for determining a position of a vehicle
DE102019119204A1 (de) Assistance control system
WO2018197122A1 (fr) Method for automatically creating and updating a data set for an autonomous vehicle
DE102020118629B4 (de) Computer-implemented method for determining the validity of an estimated position of a vehicle
EP3688487A1 (fr) Method and system for mapping and localizing a vehicle on the basis of radar measurements
EP3577419A1 (fr) Method for localizing a highly automated vehicle, e.g. a fully automated vehicle (HAF), in a digital localization map
DE102019126446A1 (de) Vehicle control device
DE102016003261A1 (de) Method for self-localization of a vehicle in a vehicle environment
DE102018007658A1 (de) Method for providing extended environment models for at least partially autonomous vehicles, control unit for carrying out such a method, and vehicle with such a control unit
DE102017201664A1 (de) Method for localizing a highly automated vehicle in a digital map
DE102017122440A1 (de) Method for localizing and refining a digital map by a motor vehicle; localization device
DE102018220782A1 (de) Localization of a vehicle using dynamic objects
EP3721371B1 (fr) Method for determining the position of a vehicle, control unit and vehicle
EP2737282A1 (fr) Method and device for determining vehicle-specific actual orientation data for a vehicle
DE102019128253B4 (de) Method for navigating an industrial truck
DE102019203623A1 (de) Method for providing a map
WO2022002564A1 (fr) Determining a starting position of a vehicle for localizing it
DE102020206356A1 (de) Method for determining an initial pose of a vehicle
WO2020160801A1 (fr) Method, device, computer program and computer program product for operating a vehicle
EP3969846B1 (fr) Method for validating that a map is up to date
DE102018210712A1 (de) System and method for simultaneous localization and mapping
DE102020206239B4 (de) Method and system for determining the ego-position of a motor vehicle, and a motor vehicle equipped with such a system
DE102022202400A1 (de) Method for localizing a vehicle and sensor system for a vehicle for localizing the vehicle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220107

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230207

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20231113