EP3869484B1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- EP3869484B1 (application EP19874002.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- candidate
- information
- location information
- point
- trajectory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3837—Data obtained from a single source
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/3815—Road data
- G01C21/3819—Road shape data, e.g. outline of a route
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to an information processing device that generates boundary location information indicating locations of boundaries of a road such as demarcation lines and road shoulder edges to be used as map information.
- Automated driving by which a vehicle such as a car drives by automated control requires not only various sensors such as a camera and a laser radar that are attached to an automated driving vehicle and detect situations inside and outside the vehicle, but also map information called a dynamic map that highly accurately represents a road on which the vehicle is traveling. It is desirable that this highly accurate map information be created based on information resulting from actually measuring the vicinity of the real road.
- a dynamic map can be created by collecting three-dimensional point cloud information indicating coordinates and luminance levels of points on the surfaces of features, such as the surface of a road for which map information is to be generated and facilities on the side of the road, or image information of the road, and analyzing the collected information.
- a surveying vehicle which is a vehicle in which a mobile mapping system is installed, is made to actually travel and three-dimensional point cloud information and the like of the road surface, facilities on the side of the road, and the like are collected.
- This mobile mapping system is a vehicle-mounted measurement device that includes a positioning device such as a Global Navigation Satellite System (GNSS) receiver, an inertial measurement unit, and an odometer, and measurement equipment such as a camera and a laser scanner, which performs scanning with laser light and measures the locations and reflection luminance levels of target objects at which the laser light is reflected. As a result of measurement by the positioning device and the measurement equipment, the system collects location information of the traveling vehicle, three-dimensional point cloud information indicating coordinates and luminance levels of points on the surfaces of features in the vicinity of the traveled road, image information of the road, and the like.
- Information obtained by the mobile mapping system is point cloud information and image information which are not map information, so that a plotting process is required to generate map information from the point cloud information.
- This plotting process is, for example, a process to generate lines indicating locations of boundaries necessary as map information, such as lines representing demarcation lines on the road and road shoulder edges of the road, based on the three-dimensional point cloud information. If the plotting process is performed manually and map information in a wide range is prepared, the amount of information of point clouds to be processed becomes enormous, resulting in enormous work costs. In particular, work costs for plotting demarcation lines and road shoulder edges, which are features that always exist along the road and are map information required to realize automated driving, are high. Therefore, techniques for performing this plotting process by software have been developed.
- Patent Literature 1 discloses a device that plots a demarcation line and a road shoulder edge by setting traverse planes perpendicular to a traveling direction of a vehicle at predetermined intervals in three-dimensional measurement data of a road, extracting candidate points for the demarcation line and the road shoulder edge on each of the traverse planes that have been set, and connecting the candidate points extracted on the respective traverse planes.
- the candidate points for the demarcation line are extracted using reflection luminance level information included in the three-dimensional point cloud information obtained by a laser scanner, and the candidate points for the road shoulder edge are extracted using location information in the three-dimensional point cloud information.
- JP 2017-223511 A discloses a road structuring device that accurately detects a traveling lane zone of a road even when a blurred zone line or a zebra zone exists and part of the point cloud is missing due to occlusion by a parallel-traveling vehicle.
- the road structuring device comprises: a ground-surface/wall-surface detection part which calculates the flatness and a normal direction of each three-dimensional point included in a three-dimensional point cloud, performs clustering by using the similarity of the normal directions of the three-dimensional points, detects a ground surface and a wall surface on the basis of the normal direction and the flatness of each three-dimensional point cluster, and detects boundary information of the ground surface and the wall surface; a traveling region detection part which determines whether or not a point of interest is a boundary point by calculating a separation degree based on a temporary boundary line direction and a separation degree based on a temporary boundary line curve, and detects a traveling region regulation line on the basis of boundary points connected according to the similarity of their tangent directions and the similarity of the magnitudes of their curvatures; and a lane structuring processing part which detects a traveling lane zone of the road on the basis of the traveling region regulation line, and detects a branch point or a cross point.
- US 2017/116477 A1 discloses a method and apparatus for lane detection using overhead images and positional data.
- a server receives positional data from a vehicle and computes a continuous trajectory.
- the server receives an overhead image of a road section.
- the server crops and processes the overhead image to remove unwanted portions.
- the server identifies edge features using the continuous trajectory and steerable filters.
- the server identifies lanes in the overhead image using a maximization algorithm, the edge features, and the continuous trajectory.
- Patent Literature 1 JP 2017-78904 A
- One problem is that when a deceleration line exists next to a demarcation line, candidate points for the demarcation line and candidate points for the deceleration line will be connected, and a demarcation line whose shape is different from the shape of the real demarcation line will be generated.
- another problem is that when a road shoulder edge is to be plotted as a boundary, if location information of the road surface has not been obtained accurately in point cloud information due to disturbance factors such as weeds and another vehicle traveling alongside, locations of the roots of the weeds and locations of the tires of the vehicle traveling alongside will be extracted as candidate points and they will be connected, and as a result, a road shoulder edge whose shape is different from the shape of the real road shoulder edge will be generated.
- the present invention has been made to solve the problems as described above, and aims to obtain an information processing device that generates boundary location information of a road in which inaccuracies due to disturbance factors are reduced.
- the present invention provides an information processing device, in accordance with claim 1.
- the present invention also provides an information processing method, in accordance with claim 8.
- the present invention also provides a program, in accordance with claim 9.
- An information processing device includes a candidate location information generation unit to generate, based on feature measurement information indicating locations of features including a road having a boundary, candidate location information indicating locations of candidate elements for the boundary; a selection unit to select candidate elements from among the candidate elements indicated by the candidate location information generated by the candidate location information generation unit, based on a trajectory along the road; and a boundary location information generation unit to generate boundary location information indicating determined locations of the boundary, using the candidate elements selected by the selection unit.
- candidate elements excluding candidate elements that do not correspond to the trajectory along the road and are thus considered to be disturbance are selected in the selection unit, and based on the selected candidate elements, the boundary location information generation unit generates boundary location information, so that the boundary location information of the road in which inaccuracies due to disturbance factors are reduced can be generated.
- An information processing device generates, based on feature measurement information indicating locations of features including a road having a boundary, candidate location information indicating locations of candidate elements for the boundary, and selects candidate elements corresponding to a trajectory along the road from among the candidate elements indicated by the generated candidate location information, thereby excluding candidate elements that do not correspond to the trajectory along the road and are thus considered to be disturbance, and based on the selected candidate elements, generates location information of the boundary of the road, thereby determining locations of the boundary of the road by reducing inaccuracies due to disturbance factors.
- Fig. 1 is a configuration diagram illustrating a configuration of an information processing device 1 in a first embodiment to implement the present invention.
- an information acquisition unit 2 acquires feature measurement information indicating locations of features including a road having a boundary and trajectory information indicating a trajectory along the road.
- the features refer to elements that may be used as map information, which are elements such as demarcation lines and road markings depicted on the road, installed objects such as guard rails and road signs installed in the vicinity of the road, and road shoulder edges, which are edges of road shoulders. This embodiment will be described hereinafter using demarcation lines and road shoulder edges as the features.
- a candidate location information generation unit 3 generates candidate location information indicating locations of candidate elements for the boundary, based on the feature measurement information and the trajectory information acquired by the information acquisition unit 2.
- a selection unit 4 selects candidate elements from among the candidate elements indicated by the candidate location information generated by the candidate location information generation unit 3, based on the trajectory information.
- a boundary location information generation unit 5 generates boundary location information indicating determined locations of the boundary, using the candidate elements selected by the selection unit 4.
- the information processing device 1 is configured as described above, and each unit will be described in detail below.
- the information acquisition unit 2 acquires feature measurement information indicating locations of features including a road having a boundary and trajectory information indicating a trajectory along the road.
- the feature measurement information and the trajectory information are acquired by measurement using a mobile mapping system.
- the mobile mapping system is installed in a mobile object such as, for example, a vehicle, an aircraft, or a drone.
- a vehicle in which the mobile mapping system is installed is referred to as a surveying vehicle.
- the mobile mapping system includes positioning devices such as a GNSS receiver, an inertial navigator, and an odometer, and measurement equipment such as a laser scanner and a camera.
- feature measurement information is measured by the laser scanner included in the mobile mapping system.
- the laser scanner performs scanning with laser light and receives reflected light of the emitted laser light.
- the laser scanner is installed on the roof of the surveying vehicle and scans the surface of the road and features in the vicinity of the road with laser light such that the laser light traverses the road. The feature measurement information measured by the laser scanner provided in the mobile mapping system includes the location of a point at which the emitted laser light is reflected on the surface of a feature and its reflected light is received, a reflection luminance level indicating the intensity of the reflected light at that point, and the time point at which the location and the reflection luminance level are measured.
- This feature measurement information is measured for each of different points on the surface of the feature, and a collection of feature measurement information at a plurality of points is referred to as point cloud information.
- trajectory information is traveling trajectory information indicating a traveling trajectory, at the time of measuring point cloud information, of the surveying vehicle in which the mobile mapping system is installed.
- the traveling trajectory of the surveying vehicle is represented by a sequence of points, each indicating the location of the vehicle at a time point measured by the GNSS receiver provided in the mobile mapping system. For example, the sequence of points is obtained by setting the update rate of the GNSS receiver to 100 Hz, that is, by measuring location information of the surveying vehicle every 0.01 seconds.
- the update rate of the GNSS receiver is not limited to 100 Hz, and may be changed depending on the granularity of the trajectory to be set.
- point cloud information acquired by the mobile mapping system is used as feature measurement information.
- this is not limiting, and provided that locations of features are indicated, information on locations of features generated by image processing from image data captured by the camera installed in the mobile mapping system, for example, may be used.
- the method for acquiring feature measurement information is not limited to measurement by the mobile mapping system, and feature measurement information may be acquired by measurement by measurement equipment installed in an aircraft or a drone, for example.
- trajectory information is not limited to that described above, provided that a trajectory along the road corresponding to feature measurement information is indicated.
- flight trajectory information indicating a flight trajectory of an aircraft or a drone, instead of the traveling trajectory of the surveying vehicle, may be used.
- trajectory information is assumed to be a sequence of points consisting of results of measurement by the GNSS receiver at each time point.
- a curve generated based on the sequence of points may be used.
- a trajectory is not limited to a movement trajectory of a mobile object that moves along the road, such as a traveling trajectory or a flight trajectory, and may be created manually by visualizing point cloud information and using a paint tool or the like, for example.
- feature measurement information and trajectory information are not limited to those acquired at the same time, and feature measurement information may be acquired by an aircraft and trajectory information may be acquired by a vehicle, for example, provided that both are information concerning the same road.
- the candidate location information generation unit 3 generates candidate location information indicating locations of candidate elements for the boundary of the road, based on the feature measurement information acquired by the information acquisition unit 2.
- the candidate location information generation unit 3 includes a division unit 6 to set a plurality of areas and divide the feature measurement information into sets respectively belonging to the plurality of areas that have been set, and a divided-area candidate location information generation unit 7 to generate candidate location information, based on the feature measurement information in each area after being divided by the division unit 6.
- the division unit 6 sets a plurality of areas, and divides the feature measurement information acquired by the information acquisition unit 2 into sets respectively belonging to the plurality of areas that have been set. That is, the division unit 6 divides the feature measurement information acquired by the information acquisition unit 2 into sets each belonging to one of the areas that have been set.
- a plurality of areas are a plurality of planes perpendicular to the traveling trajectory, indicated by the traveling trajectory information, of the surveying vehicle in which the mobile mapping system is installed.
- a plurality of areas are assumed to be planes perpendicular to the traveling trajectory.
- however, they may be curved surfaces or cuboids instead of planes, and may intersect the trajectory obliquely instead of perpendicularly. That is, the plurality of areas are not limited to perpendicular planes, and may be any plurality of spaces intersecting the trajectory indicated by the trajectory information.
- the feature measurement information may be divided at predetermined time intervals, based on measurement time points indicated by measurement time point information included in point cloud information. For example, when the mobile mapping system measures features, if the laser scanner performs measurement while rotating in a direction perpendicular to the traveling direction of the vehicle, it may be desirable to divide feature measurement information into pieces of data each corresponding to one rotation of the laser scanner. In such a case, a space corresponding to time required for one rotation may be set as each area.
- each set may be formed by selecting some points instead of all points in each area. This can reduce the amount of information on which data processing is performed and prevent a delay in data processing when the amount of information in the point cloud information is larger than necessary.
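- As a minimal sketch of the division described above (all names and data layouts below are hypothetical, not the claimed implementation), the following code assigns each point of the point cloud to the nearest trajectory sample, which approximates dividing the feature measurement information by planes perpendicular to the trajectory when the trajectory is densely sampled, and optionally thins the points assigned to each area.

```python
from collections import defaultdict

def divide_by_trajectory(points, trajectory, subsample_every=1):
    """Assign each measured point (x, y, z, luminance, time) to the index of
    the nearest trajectory sample (x, y). With a densely sampled trajectory
    this approximates dividing the point cloud by planes perpendicular to the
    trajectory, as done by the division unit 6. Optionally keep only every
    n-th point per area to limit the amount of data to be processed."""
    areas = defaultdict(list)
    for p in points:
        px, py = p[0], p[1]
        # the nearest trajectory sample decides which slab the point belongs to
        idx = min(range(len(trajectory)),
                  key=lambda i: (trajectory[i][0] - px) ** 2 + (trajectory[i][1] - py) ** 2)
        areas[idx].append(p)
    if subsample_every > 1:
        return {i: pts[::subsample_every] for i, pts in areas.items()}
    return dict(areas)

# usage with made-up data: a straight trajectory sampled every 0.5 m and two points
trajectory = [(0.5 * i, 0.0) for i in range(20)]
points = [(1.2, 3.0, 0.1, 40.0, 0.01), (5.1, -2.0, 0.0, 180.0, 0.42)]
print(divide_by_trajectory(points, trajectory))
```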
- the divided-area candidate location information generation unit 7 generates candidate location information indicating locations of candidate elements for a boundary of a feature, based on the feature measurement information in each area after being divided by the division unit 6.
- the candidate elements refer to elements that may form a boundary of a feature, and are points, line segments, faces, and the like. In this embodiment, a case in which points are treated as candidate elements will be described hereinafter.
- boundaries are demarcation lines and road shoulder edges.
- To generate candidate location information of a demarcation line, information on both locations and luminance levels of features included in the feature measurement information is used.
- To generate candidate location information of a road shoulder edge, only information on locations of features is used. Therefore, although point cloud information is assumed to include information on locations, luminance levels, and time points in this embodiment, if only road shoulder edges are to be plotted, the feature measurement information may include only information on locations of features.
- a method for generating candidate location information of a demarcation line as a boundary in the divided-area candidate location information generation unit 7 will now be described.
- Fig. 2 is a conceptual diagram representing a method for generating candidate location information of a demarcation line based on point cloud information, on a certain plane set as an area by the division unit 6.
- the range of the demarcation line is recognized by utilizing the fact that there is a large difference between the luminance level of the road surface and the luminance level of the demarcation line, and based on the range, candidate locations for the demarcation line are generated.
- In Fig. 2, the vertical axis indicates the luminance level of features, and the horizontal axis indicates the right direction of the road on a certain plane set by the division unit 6.
- the right direction of the road is a direction that is perpendicular to both a traveling direction of the vehicle and a vertically upward direction and such that the right side facing the traveling direction of the vehicle is positive.
- the coordinate origin may be set in any way.
- In this embodiment, the origin position of the vertical axis is 0 [W/sr/m²], and the origin position of the horizontal axis is the location of the point with the smallest coordinate value in the right direction of the road among the points included in the point cloud on the plane.
- the divided-area candidate location information generation unit 7 checks the luminance levels of points included in point cloud information sequentially in order of coordinate values in the right direction of the road, starting with a point with the smallest coordinate value, and detects a point with a sharp increase in the luminance level, that is, a point 8 whose luminance level differs from that of the immediately preceding point by more than a threshold value (for example, a set value of displacement in reflection intensity of reflected light of laser light), and detects a point with a sharp decrease in the luminance level, that is, a point 9 whose luminance level differs from that of the next point by more than the threshold value.
- a candidate point 10 is generated by treating the location of the median point among points with high luminance levels, that is, points between the point 8 and the point 9, as a candidate element for the demarcation line, and candidate location information indicating the location of the candidate point 10 is generated.
- the candidate point 10 may be generated by treating the location of the center between the point 8 and the point 9 as a candidate element for the demarcation line, and candidate location information indicating the location of the candidate point 10 may be generated.
- alternatively, candidate location information may be generated by setting a threshold value 11 for the luminance level in advance, as illustrated in Fig. 3, extracting a point 12, a point 13, a point 14, and a point 15 included in the point cloud information whose luminance levels are equal to or above the threshold value, treating the extracted points as the range of the demarcation line, and, if the width of this range is within specified upper and lower limits, treating the location of the median point of these points as a candidate point 16 for the demarcation line.
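- As a rough sketch of the luminance-based extraction above (function names, data layout, and thresholds are illustrative assumptions, not values from the claims), the following code scans the luminance profile of one plane in the right direction of the road, finds a sharp rise and the following sharp fall, and returns the centre of that range as the candidate, corresponding to the variant that uses the centre between the point 8 and the point 9.

```python
def demarcation_candidate(profile, jump_threshold):
    """profile: list of (lateral_coordinate, luminance) for the points of one
    plane, sorted by the coordinate in the right direction of the road.
    Returns the lateral coordinate of a candidate point for the demarcation
    line, or None. A sharp luminance increase marks one edge of the painted
    line and a sharp decrease marks the other; the candidate is taken at the
    centre of that range."""
    rise = fall = None
    for i in range(1, len(profile)):
        if rise is None and profile[i][1] - profile[i - 1][1] > jump_threshold:
            rise = i                       # point 8: sharp increase in luminance
        elif rise is not None and profile[i - 1][1] - profile[i][1] > jump_threshold:
            fall = i - 1                   # point 9: sharp decrease in luminance
            break
    if rise is None or fall is None or fall < rise:
        return None
    return 0.5 * (profile[rise][0] + profile[fall][0])

# toy luminance profile: road surface around 30, white paint around 200
profile = [(0.0, 28), (0.2, 31), (0.4, 210), (0.6, 205), (0.8, 208), (1.0, 30)]
print(demarcation_candidate(profile, jump_threshold=100))   # ~0.6
```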
- Fig. 4 is a conceptual diagram representing a method for generating candidate location information of a road shoulder edge based on point cloud information, on a certain plane set by the division unit 6.
- the vertical axis indicates a height direction, that is, a vertically upward direction on a certain plane set by the division unit 6, and the horizontal axis indicates the right direction of the road on this plane.
- the coordinate origin may be set in any way. In this embodiment, it is assumed that the origin position of the vertical axis is the location of a point at the lowest position among points included in a point cloud on the plane, and the origin position of the horizontal axis is the location of a point with the smallest coordinate value in the right direction of the road among the points included in the point cloud on the plane.
- the divided-area candidate location information generation unit 7 classifies the point cloud information into points on the road and points of features other than the road, such as a curb and an installed object. This classification is performed by setting a threshold value 17 for the height direction in advance, and treating a point whose height is less than the threshold value 17 as the road and a point whose height is equal to or above the threshold value 17 as a point other than the road. Then, a point closest to the traveling trajectory indicated by the trajectory information obtained from the information acquisition unit 2 is extracted from among the points classified as points other than the road. In the example in Fig. 4 , although the traveling trajectory is not illustrated, it is positioned to the left of the origin point, so that a point 18 is extracted as the point closest to the traveling trajectory.
- the location of the foot of a perpendicular line drawn from the location of the extracted point 18 to a road surface 19 is treated as a candidate point 20 for the road shoulder edge, and candidate location information indicating the location of the candidate point 20 is generated.
- the road surface 19 may be set in advance as with the threshold value 17, or an approximate plane may be generated based on the points determined as points on the road because of a height less than the threshold value 17, and the generated plane may be treated as the road surface.
- a road shoulder edge is a boundary indicating an edge of the road.
- since only one road shoulder edge is set on each of the right and left sides of the traveling direction of the vehicle, only one candidate point for the road shoulder edge is generated on each of the right and left sides of the traveling trajectory on one plane.
- the coordinate values, in the height direction, of points included in point cloud information may be sequentially checked, starting with a point with the smallest coordinate value in the right direction of the road, to extract a point with a sharp decrease in the coordinate value and a point with a sharp increase in the coordinate value, and these points may be treated as candidate points for the road shoulder edge.
- an approximate straight line 21 may be generated based on a plurality of points with sharp changes in the height, and the intersection between this straight line and the road surface may be treated as a candidate point 22 for the road shoulder edge.
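- The height-based extraction of a road shoulder edge candidate can be sketched as follows; the code assumes a flat road surface and two-dimensional (lateral coordinate, height) points on one plane, and all names are hypothetical.

```python
def shoulder_candidate(points, height_threshold, trajectory_lateral, road_height=0.0):
    """points: list of (lateral_coordinate, height) on one plane.
    Points below height_threshold are treated as the road surface; among the
    remaining points (curbs, installed objects) the one laterally closest to
    the trajectory is picked, and the foot of the perpendicular dropped from
    it onto the road surface is returned as the road shoulder edge candidate.
    road_height stands in for the road surface 19 (assumed flat here)."""
    non_road = [p for p in points if p[1] >= height_threshold]
    if not non_road:
        return None
    nearest = min(non_road, key=lambda p: abs(p[0] - trajectory_lateral))  # point 18
    return (nearest[0], road_height)                                        # candidate point 20

# toy cross-section: flat road up to 3.5 m, then a 15 cm curb
section = [(0.0, 0.01), (1.0, 0.00), (2.0, 0.02), (3.0, 0.01), (3.5, 0.15), (3.7, 0.16)]
print(shoulder_candidate(section, height_threshold=0.10, trajectory_lateral=1.0))  # (3.5, 0.0)
```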
- the divided-area candidate location information generation unit 7 obtains a candidate point for a boundary of a feature, that is, a candidate point for a demarcation line or a road shoulder edge, for each area set by the division unit 6, and generates candidate location information indicating the location of each candidate point, as described above.
- the selection unit 4 selects candidate elements for a boundary from among the candidate elements generated by the candidate location information generation unit 3, based on the trajectory information acquired by the information acquisition unit 2. Specifically, the selection unit 4 of this embodiment selects candidate points for the road shoulder edge or the demarcation line corresponding to the traveling trajectory, based on the distance between the candidate point (the candidate point 10 or the candidate point 16 for the demarcation line, the candidate point 20 or the candidate point 22 for the road shoulder edge) in each area obtained by the operation of the candidate location information generation unit 3 and the traveling trajectory. Furthermore, in this embodiment, the selection unit 4 selects candidate points for the road shoulder edge or the demarcation line corresponding to the traveling trajectory, based on the distance between candidate elements, that is, the distance between candidate points.
- Fig. 6 is a conceptual diagram representing a method for grouping of candidate points for the demarcation line, and illustrates a demarcation line 48 and a deceleration line 49 that are depicted on the road and a traveling trajectory 26 projected on the road surface. Fig. 6 also illustrates candidate points 23, 24, and 25, which are among the candidate points obtained in the divided-area candidate location information generation unit 7.
- the selection unit 4 calculates, for each pair of candidate points for the demarcation line generated on the respective planes by the divided-area candidate location information generation unit 7, the distance between each candidate point and the traveling trajectory, and the distance between the two candidate points. If the two candidate points have almost the same distance to the traveling trajectory and are close to each other, they are grouped into the same group.
- the selection unit 4 performs this process for all combinations of candidate points, and thereby performs grouping for all the candidate points.
- although the above process is performed on all combinations of candidate points in this embodiment, it may instead be performed only on combinations of points whose measurement time points are close to each other, based on the measurement time point information included in the point cloud information.
- the demarcation line is assumed to be approximately parallel to the traveling trajectory. Thus, if candidate points generated by the divided-area candidate location information generation unit 7 exist on the demarcation line, these candidate points will be grouped into the same group. However, if a point on the road surface other than the demarcation line is extracted as a candidate point because its luminance level is high due to the influence of a line different from the demarcation line, such as the deceleration line 49 illustrated in Fig. 6 , or a smudge, for example, the distance between this point and the traveling trajectory is different when compared with points on the demarcation line, so that this point is not grouped into the same group as the candidate points on the demarcation line. Therefore, it is possible to form a group of points on the demarcation line not including disturbance factors such as a smudge on the road surface or a line other than the demarcation line.
- the selection unit 4 performs integration of groups. That is, if a candidate point included in a certain group is also included in another different group, all the candidate points included in these two groups are integrated into one group. This will create a group including many candidate points corresponding to the traveling trajectory 26.
- although the traveling trajectory is depicted as an arrow in Fig. 6, in this embodiment it is a sequence of points indicating the location of the vehicle at each time point measured by the GNSS receiver. Therefore, in actuality, the distance between each candidate point for the demarcation line and each point included in the sequence of points of the traveling trajectory is calculated, and the distance to the closest point of the traveling trajectory is treated as the distance between the candidate point and the traveling trajectory.
- alternatively, when a curve generated from the sequence of points is used as the trajectory, a perpendicular line may be drawn from each candidate point to the curve and the length of the perpendicular line may be used as this distance.
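- A minimal sketch of the grouping and integration described above is given below, assuming two-dimensional candidate points and a trajectory given as a point sequence; the pairwise criterion (similar distance to the trajectory and a small gap between the two candidate points) and the thresholds are illustrative, and the integration of groups sharing a member is realized here with a simple union-find structure.

```python
def dist_to_trajectory(point, trajectory):
    """Distance from a candidate point to the trajectory, taken as the
    distance to the nearest point of the GNSS point sequence."""
    return min(((point[0] - t[0]) ** 2 + (point[1] - t[1]) ** 2) ** 0.5 for t in trajectory)

def group_candidates(candidates, trajectory, dist_tol, gap_tol):
    """Group candidate points whose distances to the trajectory differ by less
    than dist_tol and that lie within gap_tol of each other, then merge groups
    sharing a member (the integration step). Returns a list of groups, each a
    list of candidate indices. Thresholds are illustrative only."""
    d = [dist_to_trajectory(c, trajectory) for c in candidates]
    parent = list(range(len(candidates)))          # simple union-find
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(len(candidates)):
        for j in range(i + 1, len(candidates)):
            gap = ((candidates[i][0] - candidates[j][0]) ** 2 +
                   (candidates[i][1] - candidates[j][1]) ** 2) ** 0.5
            if abs(d[i] - d[j]) < dist_tol and gap < gap_tol:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(candidates)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# usage: three points along the demarcation line and one on a deceleration line
trajectory = [(0.5 * i, 0.0) for i in range(40)]
candidates = [(1.0, 1.8), (2.0, 1.8), (3.0, 1.8), (2.5, 0.4)]
print(group_candidates(candidates, trajectory, dist_tol=0.3, gap_tol=1.5))  # [[0, 1, 2], [3]]
```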
- a group in which the number of candidate points included in the group is equal to or smaller than a predetermined number is extracted, and if this group is located between groups whose numbers of candidate points are larger than that of this group and the distance to each of the larger groups is shorter than the size of a preset road width, this group is deleted. This can prevent points on a road marking and the like from being used for generating location information of the demarcation line.
- Fig. 7 is a conceptual diagram representing a method for excluding a group other than those of the demarcation line from a plurality of groups.
- a group 29 and a group 30, each formed by candidate points on the demarcation line, are large groups, and a group 27 on a deceleration line and a group 28 on a road marking that are between the large groups are excluded.
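- The exclusion of a small group lying between larger nearby groups could be sketched as follows, assuming the groups are already ordered along the road; the gap measure and thresholds are illustrative assumptions rather than values from the embodiment.

```python
def exclude_small_groups(groups, max_points, road_width, group_gap):
    """groups: list of groups of candidate points ordered along the road,
    each group a list of (x, y). A group with at most max_points members is
    dropped when both the preceding and following groups are larger and lie
    closer than road_width (group_gap computes the gap between two groups).
    This removes deceleration lines, road markings, and similar disturbances."""
    kept = []
    for i, g in enumerate(groups):
        small = len(g) <= max_points
        between_large = (
            0 < i < len(groups) - 1
            and len(groups[i - 1]) > len(g) and len(groups[i + 1]) > len(g)
            and group_gap(groups[i - 1], g) < road_width
            and group_gap(g, groups[i + 1]) < road_width
        )
        if not (small and between_large):
            kept.append(g)
    return kept

def nearest_gap(g1, g2):
    """Hypothetical gap measure: the smallest point-to-point distance."""
    return min(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 for a in g1 for b in g2)

groups = [
    [(x, 1.8) for x in (0.0, 1.0, 2.0, 3.0, 4.0)],    # demarcation line
    [(5.5, 1.1), (6.0, 1.1)],                          # deceleration line fragment
    [(x, 1.8) for x in (7.0, 8.0, 9.0, 10.0, 11.0)],   # demarcation line again
]
print(len(exclude_small_groups(groups, max_points=3, road_width=3.5, group_gap=nearest_gap)))  # 2
```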
- the method for selecting candidate points for the demarcation line is not limited to the method described above. Instead of comparing the distances to the traveling trajectory between two candidate points, candidate points whose distances to the traveling trajectory have absolute values within a predetermined range may be grouped together.
- the selection unit 4 groups the candidate points for the road shoulder edge that are located on the left side of the traveling trajectory on the respective planes into one group, and groups candidate points for the road shoulder edge that are located on the right side of the traveling trajectory on the respective planes into one group different from the group for the road shoulder edge on the left side.
- in some cases, point cloud information cannot be measured accurately due to the influence of plants or another vehicle traveling alongside. If grouping is performed by the above method using candidate points obtained from such inaccurately measured point cloud information, since each candidate point is assigned to one of the two groups solely based on whether it is located on the right or left side of the traveling trajectory, a road shoulder edge that is not smooth and has sharp sideways irregularities, that is, a road shoulder edge deviating from the shape of the real road shoulder edge, may be generated.
- a section including large irregularities sideways in the positions of candidate points is determined as erroneous detection, and the boundary location information generation unit 5 excludes the candidate points included in the section determined as erroneous detection from the candidate points to be used for generating location information of the road shoulder edge.
- the selection unit 4 compares the shape of each group of candidate points with the shape of the traveling trajectory. Specifically, the distance between each candidate point in the group and the traveling trajectory is calculated, and a point at which this distance changes from that of an adjacent candidate point by a predetermined threshold value or more is extracted.
- candidate points corresponding to the traveling trajectory can be selected by detecting a portion where a point cloud is not acquired accurately due to the influence of plants or another vehicle traveling alongside and there are candidate points that might generate a road shoulder edge deviated from the shape of the real road shoulder edge, and excluding the detected candidate points.
- the candidate points to be used for generating a boundary can be selected by excluding disturbance factors such as smudges on the road surface.
- Fig. 8 is a conceptual diagram representing a method for selecting candidate points for a road shoulder edge.
- the distance between each candidate point and the traveling trajectory is checked sequentially, starting with a point with the earliest time point and ending with a point with the latest time point. Then, points with a sharp change in the distance to the traveling trajectory, that is, a candidate point 32 whose distance to the traveling trajectory indicates a sharp decrease from that of a next candidate point 34 and a candidate point 33 whose distance to the traveling trajectory indicates a sharp increase from that of the preceding point 35 are detected.
- the candidate point 32 and the candidate point 33 are ends of the candidate points that are detected accurately, and the candidate point 34, the candidate point 35, and candidate points 36 between the candidate point 34 and the candidate point 35 are detected erroneously, so that these candidate points are excluded from the candidate points to be used for generating location information of the road shoulder edge.
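- A sketch of this exclusion-section detection is shown below, assuming the road shoulder edge candidates are ordered by measurement time and represented only by their distance to the traveling trajectory; the jump threshold is an illustrative value.

```python
def exclusion_sections(distances, jump_threshold):
    """distances: distance to the traveling trajectory for each road shoulder
    edge candidate, ordered by measurement time. Returns (start, end) index
    pairs of sections whose candidates should be excluded: a section opens at
    a sharp decrease of the distance and closes at the matching sharp
    increase, as with the candidate points 32 and 33 in Fig. 8."""
    sections, start = [], None
    for i in range(1, len(distances)):
        step = distances[i] - distances[i - 1]
        if start is None and step < -jump_threshold:
            start = i                     # first erroneously detected candidate
        elif start is not None and step > jump_threshold:
            sections.append((start, i - 1))
            start = None
    return sections

# distances to the trajectory for nine consecutive candidates; the middle
# three were pulled inwards by a vehicle traveling alongside
d = [3.0, 3.0, 3.1, 1.2, 1.1, 1.2, 3.0, 3.1, 3.0]
print(exclusion_sections(d, jump_threshold=1.0))   # [(3, 5)]
```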
- the method for comparing the shape of a group of candidate points with the shape of the traveling trajectory is not limited to the above method.
- an approximate curve represented by a cubic function may be generated from a plurality of candidate points, for each fixed distance interval including a plurality of candidate points, and whether each candidate point should be used for generating location information of the road shoulder edge may be determined based on the distance between the approximate curve and the candidate point.
- a normal smooth road shoulder edge can be represented by a cubic function, so the distance between the approximate curve and each candidate point is small. If sideways irregularities are caused by plants or the like, the distance between the approximate curve and a candidate point increases, so that portion can be determined as erroneous detection.
- the boundary location information generation unit 5 generates boundary location information indicating determined locations of a boundary, using the candidate elements selected by the selection unit 4.
- the boundary location information generation unit 5 generates location information of a demarcation line or location information of a road shoulder edge, based on the traveling trajectory indicated by the traveling trajectory information and using the candidate points selected by the selection unit 4.
- the boundary location information generation unit 5 connects groups grouped as the candidates for the demarcation line by the selection unit 4.
- in some sections, the candidate location information generation unit 3 may not be able to properly generate candidate elements, and the groups of the demarcation line generated by the selection unit 4 may therefore be scattered at locations separated from each other. Therefore, the boundary location information generation unit 5 connects groups of the demarcation line existing at separate locations, based on the traveling trajectory of the surveying vehicle.
- connecting groups means generating candidate points for the demarcation line between two groups at predetermined distance intervals.
- the traveling trajectory of the surveying vehicle is used to connect the groups.
- candidate points are generated such that an approximate curve formed by the two groups and the candidate points generated between the two groups is shaped similarly to the nearby traveling trajectory.
- the location of the median point of the candidate points included in each group is calculated first, and then the distance between the median point of one of the groups and each candidate point included in the other group is calculated.
- Some candidate points are extracted sequentially in order of this distance, starting with a point with the shortest distance.
- some candidate points are also extracted from the other group.
- the location of the median point of the extracted candidate points is calculated in each of the groups, and the distance between the median points of the two groups is calculated. If this distance between the median points is within a predetermined range, the two groups are connected.
- Fig. 9 is a conceptual diagram representing a method for connecting two groups of a demarcation line.
- a group 38 and a group 39 of the demarcation line are connected. If the group 38 and the group 39 are within a predetermined distance range, candidate points 40 are generated so that these two groups are joined. These candidate points 40 are generated such that an approximate curve formed by the group 38, the group 39, and the candidate points 40 is shaped similarly to the traveling trajectory 37.
- although the traveling trajectory is depicted as an arrow in Fig. 9, it is a sequence of points indicating the location of the vehicle at each time point measured by the GNSS receiver. Therefore, in this embodiment, an approximate curve is first generated from the sequence of points measured by the GNSS receiver, and the two groups are connected such that this approximate curve is shaped similarly to the approximate curve formed by the two groups and the newly generated candidate points.
- the method for connecting two groups is not limited to the above method, and groups may be connected using only the positional relationship with a sequence of points measured by the GNSS receiver without generating an approximate curve. For example, two groups may be connected by generating new candidate points such that the angles formed by connecting the points measured by the GNSS receiver with straight lines are the same as the angles formed by connecting the candidate points newly generated between the two groups with straight lines.
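- The connection of two groups described above (generating candidate points between the facing ends of the groups so that the joined curve follows the nearby traveling trajectory) could be sketched as follows; this is only one possible realization, assuming two-dimensional points, a road running roughly along the x axis, and a roughly constant lateral offset between the demarcation line and the trajectory, with hypothetical names throughout.

```python
def connect_groups(end_a, end_b, trajectory, spacing):
    """Generate candidate points between end_a and end_b (the facing end
    points of two demarcation line groups) so that the joined curve follows
    the nearby trajectory: the trajectory samples lying between the two ends
    are shifted by the average lateral offset of end_a and end_b, and points
    are emitted at roughly the requested spacing."""
    lo, hi = sorted([end_a[0], end_b[0]])
    segment = [t for t in trajectory if lo < t[0] < hi]   # samples between the ends (by x)
    offset = 0.5 * ((end_a[1] - nearest_y(trajectory, end_a[0])) +
                    (end_b[1] - nearest_y(trajectory, end_b[0])))
    new_points, last_x = [], lo
    for t in segment:
        if t[0] - last_x >= spacing:
            new_points.append((t[0], t[1] + offset))       # follow the trajectory shape
            last_x = t[0]
    return new_points

def nearest_y(trajectory, x):
    """y of the trajectory sample whose x is closest to the given x."""
    return min(trajectory, key=lambda t: abs(t[0] - x))[1]

# usage: a gently curving trajectory and two group ends offset by about 1.8 m
trajectory = [(0.5 * i, 0.1 * i) for i in range(30)]
print(connect_groups((2.0, 2.2), (10.0, 3.8), trajectory, spacing=1.0))
```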
- location information of the finally obtained candidate points for the demarcation line is generated as location information of the demarcation line indicating determined locations of the demarcation line.
- the boundary location information generation unit 5 deletes the candidate point 34, the candidate point 35, and the candidate points 36 included in the section determined as erroneously detected as the candidates for the road shoulder edge by the selection unit 4.
- candidate points 41 are generated utilizing the shape of the traveling trajectory 31.
- a specific method is the same as the method used to connect groups in generating location information of the demarcation line.
- the candidate points 41 are generated such that an approximate curve formed by candidate points other than the candidate point 34, the candidate point 35, and the candidate points 36 that have been excluded is shaped similarly to the nearby traveling trajectory 31.
- the demarcation line may be used instead of the traveling trajectory.
- the road shoulder edge may be used instead of the traveling trajectory.
- location information may be generated by using only points that are locally near the trajectory of the vehicle itself and connecting these points, as illustrated in Fig. 11 .
- location information of a road shoulder edge 47 can be generated by using a candidate point 43, a candidate point 44, a candidate point 45, a candidate point 46, and a traveling trajectory 42 and generating new candidate points (not illustrated) on a line connecting the candidate point 43, the candidate point 44, the candidate point 45, and the candidate point 46.
- the boundary location information generation unit 5 generates location information of the candidate points for the road shoulder edge thus obtained, as location information of the road shoulder edge indicating determined locations of the road shoulder edge.
- the information processing device 1 in the first embodiment is configured as described above.
- the functions of the information processing device 1 are realized by hardware illustrated in Fig. 12 .
- the hardware illustrated in Fig. 12 includes a processing device 50 such as a central processing unit (CPU), a storage device 51 such as a read only memory (ROM) or a hard disk, and a communication device 52.
- the information acquisition unit 2 illustrated in Fig. 1 is realized by the communication device 52, and the candidate location information generation unit 3, the selection unit 4, and the boundary location information generation unit 5 are realized by execution of programs stored in the storage device 51 by the processing device 50.
- the information acquisition unit 2 may be configured such that the processing device 50 acquires feature measurement information and trajectory information that are stored in advance in the storage device 51, instead of using the communication device 52.
- the method for realizing the functions of the information processing device 1 is not limited to a combination of hardware and software as described above, and the functions may be realized solely by hardware such as a large-scale integrated circuit (LSI), which is a processing device in which programs are implemented, or some of the functions may be realized by dedicated hardware and some of the functions may be realized by a combination of a processing device and programs.
- the information processing device 1 in this embodiment is configured as described above.
- In step S1, the information acquisition unit 2 acquires, from an external server or the like, point cloud information measured by the mobile mapping system as feature measurement information, and traveling trajectory information indicating the traveling trajectory of the surveying vehicle in which the mobile mapping system is installed at the time the point cloud information was measured, as trajectory information.
- In step S2, the division unit 6 sets a plurality of planes perpendicular to the traveling trajectory of the surveying vehicle, based on the traveling trajectory information acquired by the information acquisition unit 2 in step S1, and divides the point cloud information acquired by the information acquisition unit 2 into sets respectively belonging to the planes that have been set.
- In step S3, the divided-area candidate location information generation unit 7 generates candidate location information indicating locations of candidate points for the demarcation line, based on the luminance level information included in the point cloud information on each plane set by the division unit 6 in step S2.
- In step S4, the selection unit 4 performs the grouping process and the group integration process described above on the candidate points indicated by the candidate location information generated by the divided-area candidate location information generation unit 7 in step S3.
- Next, the selection unit 4 performs a process of selecting groups of candidate points to be used for generating a boundary and excluding groups caused by disturbance factors such as a smudge on the road surface, from among the groups generated in step S4.
- In step S5, the number of candidate points included in each group and the distance between each pair of groups are calculated.
- In steps S6 to S8, it is determined whether each group meets three conditions for being excluded from the candidates to be used for generating location information of the demarcation line.
- a group that meets all three conditions is excluded.
- a group that fails to meet at least one of the conditions is selected as a candidate to be used for generating location information.
- In step S6, it is determined whether the target group includes no more than the predetermined number of candidate points, that is, whether it is a group of a predetermined size or smaller.
- In step S7, it is determined whether there is a larger group near the target group.
- In step S8, it is determined whether the small target group is located between larger groups.
- In step S9, a group that meets all the conditions of steps S6 to S8 is excluded from the candidates to be used for generating location information.
- In step S10, a group that fails to meet at least one of the conditions is selected as a candidate to be used for generating location information.
- In step S11, it is determined whether the process of steps S6 to S10 has been performed on all the groups. If the process has not been performed on all the groups, processing returns to step S6 and the process of steps S6 to S10 is performed on the next group.
- If it is determined in step S11 that the above process has been performed on all the groups, processing proceeds to step S12.
- In step S12, the boundary location information generation unit 5 determines whether there are two or more groups to be candidates for the demarcation line. If there are two or more groups, processing proceeds to step S13. If there is only one group, processing proceeds to step S14.
- If the boundary location information generation unit 5 determines in step S12 that there are two or more groups to be candidates for the demarcation line, it connects the groups in step S13. Specifically, the boundary location information generation unit 5 generates new candidate points for the demarcation line between two groups, based on the traveling trajectory. The new candidate points are generated such that an approximate curve formed by the candidate points is shaped similarly to the traveling trajectory.
- In step S14, the boundary location information generation unit 5 generates location information of the finally obtained candidate points for the demarcation line, as location information of the demarcation line indicating determined locations of the demarcation line. That is, in step S14, the boundary location information generation unit 5 generates boundary location information indicating determined locations of the boundary, as the process of determining the boundary location information to be finally output.
- the demarcation line indicated by the location information generated by the operation as described above is generated by excluding disturbance factors such as smudges on the road, and thus is a smooth line corresponding to the traveling trajectory along the road, so that when it is used for map information, location information of the demarcation line that properly reflects the real demarcation line can be obtained.
- The processing in steps S21 and S22 is substantially the same as that in steps S1 and S2, respectively, in plotting a demarcation line.
- In step S21, the information acquisition unit 2 acquires point cloud information as feature measurement information and traveling trajectory information as trajectory information.
- In step S22, the division unit 6 sets a plurality of planes perpendicular to the traveling trajectory of the surveying vehicle, based on the traveling trajectory information acquired by the information acquisition unit 2, and divides the point cloud information acquired by the information acquisition unit 2 into sets respectively belonging to the planes that have been set.
- In step S23, the divided-area candidate location information generation unit 7 generates candidate location information indicating locations of candidate points for the road shoulder edge, based on the height information included in the point cloud information on each plane set by the division unit 6 in step S22.
- In step S24, the selection unit 4 performs the grouping process described above on the candidate points indicated by the candidate location information generated by the divided-area candidate location information generation unit 7 in step S23. Since only one road shoulder edge is to be set on each of the right and left sides of the traveling trajectory, the selection unit 4 groups the candidate points selected on the respective planes such that candidate points existing on the left side of the traveling trajectory are grouped into one group and candidate points existing on the right side of the traveling trajectory are grouped into another group different from the group of the left side.
- the selection unit 4 excludes candidate points that are likely to cause irregularities in the road shoulder edge to be generated, in each group grouped in step S24. To do so, the selection unit 4 checks the distance between each candidate point and the traveling trajectory sequentially in order of the measurement time points included in the point cloud information, starting with a candidate point with the earliest time point and ending with a point with the latest time point, extracts candidate points with a sharp change in the distance to the traveling trajectory, treats the section between the extracted candidate points as an exclusion section, and excludes the candidate points in the exclusion section from the candidate points to be used for generating location information of the road shoulder edge. Specifically, the selection unit 4 performs the following steps S25 to S26.
- step S25 the distance between each candidate point in the group and the traveling trajectory is calculated.
- step S26 an exclusion section is detected by comparing the distances calculated for each candidate point in step S25 between candidate points that are placed adjacent to each other when the candidate points are arranged in order of the measurement time points from the earliest to the latest. This detection of an exclusion section is performed as described below, for example.
- this candidate point With regard to a certain candidate point and two candidate points adjacent to and sandwiching this candidate point, it is determined whether a difference in the distance when compared with one of the adjacent candidate points and a difference in the distance when compared with the other adjacent candidate point are both equal to or greater than a predetermined threshold value. If both are not equal to or greater than the threshold value, this candidate point can be presumed to be a candidate point for a smooth road shoulder edge, so that it is kept as a candidate point in the group. If the difference in the distance is equal to or greater than the threshold value with regard to at least one of the two adjacent candidate points, this candidate point can be presumed to be the start point or end point of an exclusion section, so that it is treated as the start point or end point of the exclusion section.
- For example, the exclusion section corresponding to the candidate points 34 to 36 of Fig. 8 can be detected based on the candidate points treated as the start point and end point of the exclusion section, the time series of measurement time points of the candidate points, the number of consecutive candidate points whose differences in the distance are equal to or less than the threshold value, or the like.
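- As a rough illustration of steps S25 to S27, the sketch below walks the candidate points in measurement-time order, computes each point's distance to the traveling trajectory, and flags every run of points bounded by two jumps that reach a threshold. The function names and the 1.0 m threshold are assumptions made for the example; the embodiment does not prescribe concrete values.

```python
import numpy as np

def distance_to_trajectory(point, trajectory):
    """Distance from one candidate point to the nearest trajectory point."""
    return float(np.min(np.linalg.norm(trajectory - point, axis=1)))

def exclusion_mask(candidates, trajectory, jump_threshold=1.0):
    """candidates: (N, 2) array ordered by measurement time.
    Returns a boolean mask in which True marks points inside an exclusion
    section, i.e. points bounded by two sharp changes in the distance."""
    d = np.array([distance_to_trajectory(p, trajectory) for p in candidates])
    jumps = np.abs(np.diff(d)) >= jump_threshold       # step S26: sharp changes
    mask = np.zeros(len(candidates), dtype=bool)
    inside = False
    for i, is_jump in enumerate(jumps):
        if is_jump:
            inside = not inside                        # enter or leave a section
        if inside:
            mask[i + 1] = True                         # candidate after the jump
    return mask

# Step S27 then simply drops the flagged points:
# kept = candidates[~exclusion_mask(candidates, trajectory)]
```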
- In step S27, the selection unit 4 excludes the candidate points included in the exclusion section detected in step S26 from the candidates to be used for generating location information of the road shoulder edge.
- In step S28, the boundary location information generation unit 5 generates candidate points to interpolate the exclusion section excluded by the selection unit 4 in step S27.
- The specific method is substantially the same as the method by which two groups are connected when a demarcation line is plotted: new candidate points are generated such that the approximate curve formed by the remaining candidate points, other than the excluded candidate points, and the newly generated candidate points is shaped similarly to the nearby traveling trajectory.
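- One way to picture this interpolation is to copy the lateral offsets of the two candidate points that survive at either end of the exclusion section and blend them along the trajectory between those ends, so the filled-in points follow the trajectory's shape. The sketch below is only an illustration under that assumption; the patent leaves the exact curve construction open.

```python
import numpy as np

def interpolate_exclusion(start_pt, end_pt, trajectory, n_new=5):
    """Generate n_new candidate points between the last kept point before an
    exclusion section (start_pt) and the first kept point after it (end_pt),
    shaped like the nearby traveling trajectory."""
    i0 = int(np.argmin(np.linalg.norm(trajectory - start_pt, axis=1)))
    i1 = int(np.argmin(np.linalg.norm(trajectory - end_pt, axis=1)))
    off0 = start_pt - trajectory[i0]        # lateral offset at the section start
    off1 = end_pt - trajectory[i1]          # lateral offset at the section end
    new_points = []
    for k in range(1, n_new + 1):
        t = k / (n_new + 1)
        idx = int(round(i0 + t * (i1 - i0)))            # walk along the trajectory
        new_points.append(trajectory[idx] + (1 - t) * off0 + t * off1)
    return np.array(new_points)
```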
- In step S29, the boundary location information generation unit 5 generates location information of the finally obtained candidate points for the road shoulder edge, as location information of the road shoulder edge indicating determined locations of the road shoulder edge. That is, in step S29, the boundary location information generation unit 5 generates boundary location information indicating determined locations of the boundary, as a process of determining the boundary location information to be finally output.
- The road shoulder edge indicated by the location information generated by the operation described above is generated while eliminating disturbance factors such as smudges on the road surface, and is therefore a smooth line corresponding to the traveling trajectory along the road. Consequently, when it is used for map information, location information of the road shoulder edge that properly reflects the real road shoulder edge can be obtained.
- By the operation of the information processing device 1 as described above, when boundaries of a road such as a demarcation line and a road shoulder edge are to be plotted, candidate points corresponding to the traveling trajectory of the surveying vehicle traveling along the road are selected in the selection unit, and based on the selected candidate points, the boundary location information generation unit generates boundary location information. As a result, it is possible to generate boundary location information of the road in which inaccuracies due to disturbance factors are reduced.
- An information processing device is applicable to plotting of a dynamic map.
Description
- The present invention relates to an information processing device that generates boundary location information indicating locations of boundaries of a road such as demarcation lines and road shoulder edges to be used as map information.
- Automated driving by which a vehicle such as a car drives by automated control requires not only various sensors such as a camera and a laser radar that are attached to an automated driving vehicle and detect situations inside and outside the vehicle, but also map information called a dynamic map that highly accurately represents a road on which the vehicle is traveling. It is desirable that this highly accurate map information be created based on information resulting from actually measuring the vicinity of the real road. For example, a dynamic map can be created by collecting three-dimensional point cloud information indicating coordinates and luminance levels of points on the surfaces of features, such as the surface of a road for which map information is to be generated and facilities on the side of the road, or image information of the road, and analyzing the collected information.
- As a method for collecting information resulting from actually measuring the vicinity of the road, there is a method in which a surveying vehicle, which is a vehicle in which a mobile mapping system is installed, is made to actually travel, and three-dimensional point cloud information and the like of the road surface, facilities on the side of the road, and so on are collected. The mobile mapping system is a measurement device of a type installed in a vehicle. It includes a positioning device, such as a Global Navigation Satellite System (GNSS) receiver, an inertial measurement unit, and an odometer, and measurement equipment, such as a laser scanner, which performs scanning with laser light and measures locations and reflection luminance levels of target objects at which the laser light is reflected, and a camera. As a result of measurement by the positioning device and the measurement equipment, the mobile mapping system collects location information of the traveling vehicle, three-dimensional point cloud information indicating coordinates and luminance levels of points on the surfaces of features in the vicinity of the traveled road, image information of the road, and the like.
- Information obtained by the mobile mapping system is point cloud information and image information which are not map information, so that a plotting process is required to generate map information from the point cloud information. This plotting process is, for example, a process to generate lines indicating locations of boundaries necessary as map information, such as lines representing demarcation lines on the road and road shoulder edges of the road, based on the three-dimensional point cloud information. If the plotting process is performed manually and map information in a wide range is prepared, the amount of information of point clouds to be processed becomes enormous, resulting in enormous work costs. In particular, work costs for plotting demarcation lines and road shoulder edges, which are features that always exist along the road and are map information required to realize automated driving, are high. Therefore, techniques for performing this plotting process by software have been developed.
- One of the techniques for performing this plotting process is a technique described in Patent Literature 1, for example. Patent Literature 1 discloses a device that plots a demarcation line and a road shoulder edge by setting traverse planes perpendicular to a traveling direction of a vehicle at predetermined intervals in three-dimensional measurement data of a road, extracting candidate points for the demarcation line and the road shoulder edge on each of the traverse planes that have been set, and connecting the candidate points extracted on the respective traverse planes. The candidate points for the demarcation line are extracted using reflection luminance level information included in the three-dimensional point cloud information obtained by a laser scanner, and the candidate points for the road shoulder edge are extracted using location information in the three-dimensional point cloud information.
- JP 2017-223511 A discloses a road structuring device to accurately detect a traveling lane zone of a road even in a situation in which a blurred zone line and a zebra zone exist and a deficit of a point group occurs because of occlusion caused by a parallel-traveling vehicle. The road structuring device comprises: a ground-surface/wall-surface detection part which calculates the flatness and a normal direction of a three-dimensional point included in a three-dimensional point group, performs clustering by using the similarity of the three-dimensional points in the normal direction, detects a ground surface and a wall surface on the basis of the normal direction and the flatness of the three-dimensional point group cluster, and detects the boundary information of the ground surface and the wall surface; a traveling region detection part which determines whether or not an attention point is a boundary point by calculating a separation degree by a temporary boundary line direction and a separation degree by a temporary boundary line curve, and detects a traveling region regulation line on the basis of the boundary points which are connected on the basis of the similarity of a tangent direction and the similarity of a magnitude of a curvature; and a lane structuring processing part which detects a traveling lane zone of a road on the basis of the traveling region regulation line, and detects a branch point or a cross point on the basis of the detected traveling lane zone.
- US 2017/116477 A1 discloses a method and apparatus for lane detection using overhead images and positional data. A server receives positional data from a vehicle and computes a continuous trajectory. The server receives an overhead image of a road section. The server crops and processes the overhead image to remove unwanted portions. The server identifies edge features using the continuous trajectory and steerable filters. The server identifies lanes in the overhead image using a maximization algorithm, the edge filters, and the continuous trajectory.
- Patent Literature 1: JP 2017-78904 A
- In the plotting process disclosed in Patent Literature 1, the candidate points extracted on the respective traverse planes are simply connected with lines. For this reason, there is a problem: although a demarcation line similar to the real demarcation line drawn on the road needs to be generated as map information, if there are a plurality of points with high reflection luminance levels on some of the traverse planes due to disturbance factors such as smudges on the road, a demarcation line whose shape deviates from the shape of the real demarcation line will be generated. In addition, if a deceleration line exists next to a demarcation line, candidate points for the demarcation line and candidate points for the deceleration line will be connected, and a demarcation line whose shape is different from the shape of the real demarcation line will be generated. Another problem is that, when a road shoulder edge is to be plotted as a boundary, if location information of the road surface has not been obtained accurately in the point cloud information due to disturbance factors such as weeds or another vehicle traveling alongside, the locations of the roots of the weeds and the locations of the tires of the vehicle traveling alongside will be extracted as candidate points and connected, and as a result, a road shoulder edge whose shape is different from the shape of the real road shoulder edge will be generated.
- The present invention has been made to solve the problems described above, and aims to obtain an information processing device that generates boundary location information of a road in which inaccuracies due to disturbance factors are reduced.
- The present invention provides an information processing device in accordance with claim 1.
- The present invention also provides an information processing method in accordance with claim 8.
- The present invention also provides a program in accordance with claim 9.
- An information processing device according to the present invention includes a candidate location information generation unit to generate, based on feature measurement information indicating locations of features including a road having a boundary, candidate location information indicating locations of candidate elements for the boundary; a selection unit to select candidate elements from among the candidate elements indicated by the candidate location information generated by the candidate location information generation unit, based on a trajectory along the road; and a boundary location information generation unit to generate boundary location information indicating determined locations of the boundary, using the candidate elements selected by the selection unit. Therefore, candidate elements excluding candidate elements that do not correspond to the trajectory along the road and are thus considered to be disturbance are selected in the selection unit, and based on the selected candidate elements, the boundary location information generation unit generates boundary location information, so that boundary location information of the road in which inaccuracies due to disturbance factors are reduced can be generated.
- Fig. 1 is a configuration diagram illustrating a configuration of an information processing device in a first embodiment;
- Fig. 2 is a conceptual diagram representing a method for generating candidate location information of a demarcation line by a divided-area candidate location information generation unit in the first embodiment;
- Fig. 3 is a conceptual diagram representing a method for generating candidate location information of a demarcation line by the divided-area candidate location information generation unit in the first embodiment;
- Fig. 4 is a conceptual diagram representing a method for generating candidate location information of a road shoulder edge by the divided-area candidate location information generation unit in the first embodiment;
- Fig. 5 is a conceptual diagram representing a method for generating candidate location information of a road shoulder edge by the divided-area candidate location information generation unit in the first embodiment;
- Fig. 6 is a conceptual diagram representing a method for grouping candidate points for a demarcation line by a selection unit in the first embodiment;
- Fig. 7 is a conceptual diagram representing a method for excluding a group other than that of a demarcation line from a plurality of groups by the selection unit in the first embodiment;
- Fig. 8 is a conceptual diagram representing a method for selecting candidate points for a road shoulder edge by the selection unit in the first embodiment;
- Fig. 9 is a conceptual diagram representing a method for connecting two groups of a demarcation line by a boundary location information generation unit in the first embodiment;
- Fig. 10 is a conceptual diagram representing a method for generating location information of a road shoulder edge by the boundary location information generation unit in the first embodiment;
- Fig. 11 is a conceptual diagram representing a method for generating location information of a road shoulder edge by the boundary location information generation unit in the first embodiment;
- Fig. 12 is a configuration diagram illustrating a hardware configuration of the information processing device in the first embodiment;
- Fig. 13 is a flowchart illustrating operation relating to generation of location information of a demarcation line by the information processing device in the first embodiment; and
- Fig. 14 is a flowchart illustrating operation relating to generation of location information of a road shoulder edge by the information processing device in the first embodiment.
- An information processing device according to the present invention generates, based on feature measurement information indicating locations of features including a road having a boundary, candidate location information indicating locations of candidate elements for the boundary, and selects candidate elements corresponding to a trajectory along the road from among the candidate elements indicated by the generated candidate location information, thereby excluding candidate elements that do not correspond to the trajectory along the road and are thus considered to be disturbance, and based on the selected candidate elements, generates location information of the boundary of the road, thereby determining locations of the boundary of the road by reducing inaccuracies due to disturbance factors.
-
- Fig. 1 is a configuration diagram illustrating a configuration of an information processing device 1 in a first embodiment to implement the present invention.
- In the information processing device 1 illustrated in Fig. 1, an information acquisition unit 2 acquires feature measurement information indicating locations of features including a road having a boundary and trajectory information indicating a trajectory along the road. The features refer to elements that may be used as map information, such as demarcation lines and road markings depicted on the road, installed objects such as guard rails and road signs installed in the vicinity of the road, and road shoulder edges, which are edges of road shoulders. This embodiment will be described hereinafter using demarcation lines and road shoulder edges as the features.
- A candidate location information generation unit 3 generates candidate location information indicating locations of candidate elements for the boundary, based on the feature measurement information and the trajectory information acquired by the information acquisition unit 2. A selection unit 4 selects candidate elements from among the candidate elements indicated by the candidate location information generated by the candidate location information generation unit 3, based on the trajectory information. A boundary location information generation unit 5 generates boundary location information indicating determined locations of the boundary, using the candidate elements selected by the selection unit 4.
- The information processing device 1 is configured as described above, and each unit will be described in detail below.
- The
information acquisition unit 2 acquires feature measurement information indicating locations of features including a road having a boundary and trajectory information indicating a trajectory along the road. The feature measurement information and the trajectory information are acquired by measurement using a mobile mapping system. The mobile mapping system is installed in a mobile object such as, for example, a vehicle, an aircraft, or a drone. A vehicle in which the mobile mapping system is installed is referred to as a surveying vehicle. For example, the mobile mapping system includes a positioning device, such as a GNSS receiver, an inertial navigator, and an odometer, and measurement equipment, such as measurement devices like a laser scanner, a camera, and the like. - In this embodiment, feature measurement information is measured by the laser scanner included in the mobile mapping system. The laser scanner performs scanning with laser light and receives reflected light of the emitted laser light. The laser scanner is installed on the roof of the surveying vehicle, and scans the surface of the road and features in the vicinity of the road with laser light such that the laser light traverses the road. It is information including a location of a point at which reflected light of the laser light emitted from the laser scanner and reflected on the surface of a feature is received, a reflection luminance level indicating the intensity of reflected light when the laser light is reflected at the feature at the point of this location, and a time point when the location and the reflection luminance level are measured, and is measured by the laser scanner provided in the mobile mapping system. This feature measurement information is measured for each of different points on the surface of the feature, and a collection of feature measurement information at a plurality of points is referred to as point cloud information.
- In this embodiment, trajectory information is traveling trajectory information indicating a traveling trajectory, at the time of measuring point cloud information, of the surveying vehicle in which the mobile mapping system is installed. In this embodiment, the traveling trajectory of the surveying vehicle is represented by a sequence of points collected from information on the location of the vehicle at each time point when measured by the GNSS receiver provided in the mobile mapping system, and is represented, for example, by a sequence of points obtained by measurement by setting the update rate of the GNSS receiver to 100 Hz, that is, by measuring location information of the surveying vehicle at every 0.01 seconds. The update rate of the GNSS receiver is not limited to 100 Hz, and may be changed depending on the granularity of the trajectory to be set.
- In this embodiment, point cloud information acquired by the mobile mapping system is used as feature measurement information. However, this is not limiting, and provided that locations of features are indicated, information on locations of features generated by image processing from image data captured by the camera installed in the mobile mapping system, for example, may be used. The method for acquiring feature measurement information is not limited to measurement by the mobile mapping system, and feature measurement information may be acquired by measurement by measurement equipment installed in an aircraft or a drone, for example.
- Similarly, trajectory information is not limited to that described above, provided that a trajectory along the road corresponding to feature measurement information is indicated. For example, flight trajectory information indicating a flight trajectory of an aircraft or a drone, instead of the traveling trajectory of the surveying vehicle, may be used. In this embodiment, trajectory information is assumed to be a sequence of points consisting of results of measurement by the GNSS receiver at each time point. However, instead of a sequence of points, a curve generated based on the sequence of points may be used. A trajectory is not limited to a movement trajectory of a mobile object that moves along the road, such as a traveling trajectory or a flight trajectory, and may be created manually by visualizing point cloud information and using a paint tool or the like, for example.
- Furthermore, feature measurement information and trajectory information are not limited to those acquired at the same time, and feature measurement information may be acquired by an aircraft and trajectory information may be acquired by a vehicle, for example, provided that both are information concerning the same road.
- The candidate location information generation unit 3 generates candidate location information indicating locations of candidate elements for the boundary of the road, based on the feature measurement information acquired by the information acquisition unit 2. In this embodiment, the candidate location information generation unit 3 includes a division unit 6 to set a plurality of areas and divide the feature measurement information into sets respectively belonging to the plurality of areas that have been set, and a divided-area candidate location information generation unit 7 to generate candidate location information, based on the feature measurement information in each area after being divided by the division unit 6.
- The
division unit 6 sets a plurality of areas, and divides the feature measurement information acquired by theinformation acquisition unit 2 into sets respectively belonging to the plurality of areas that have been set. That is, thedivision unit 6 divides the feature measurement information acquired by theinformation acquisition unit 2 into sets each belonging to one of the areas that have been set. In this embodiment, a plurality of areas are a plurality of planes perpendicular to the traveling trajectory, indicated by the traveling trajectory information, of the surveying vehicle in which the mobile mapping system is installed. - In this embodiment, a plurality of areas are assumed to be planes perpendicular to the traveling trajectory. However, this is not limiting, and they may be curved planes or cuboids instead of planes, and may be intersecting diagonally instead of being perpendicular. That is, a plurality of areas are not limited to perpendicular planes, and may be a plurality of spaces intersecting the trajectory indicated by the trajectory information.
- Instead of being divided based on spatial positions, the feature measurement information may be divided at predetermined time intervals, based on measurement time points indicated by measurement time point information included in point cloud information. For example, when the mobile mapping system measures features, if the laser scanner performs measurement while rotating in a direction perpendicular to the traveling direction of the vehicle, it may be desirable to divide feature measurement information into pieces of data each corresponding to one rotation of the laser scanner. In such a case, a space corresponding to time required for one rotation may be set as each area.
- When point cloud information is divided into sets respectively belonging to a plurality of areas, each set may be formed by selecting some points instead of all points in each area. This can reduce the amount of information on which data processing is performed and prevent a delay in data processing when the amount of information in the point cloud information is larger than necessary.
- The divided-area candidate location
information generation unit 7 generates candidate location information indicating locations of candidate elements for a boundary of a feature, based on the feature measurement information in each area after being divided by thedivision unit 6. The candidate elements refer to elements that may form a boundary of a feature, and are points, line segments, faces, and the like. In this embodiment, a case in which points are treated as candidate elements will be described hereinafter. - In this embodiment, boundaries are demarcation lines and road shoulder edges. To generate candidate location information of a demarcation line, information on locations and luminance levels of features included in feature measurement information is used. To generate candidate location information of a road shoulder edge, information on only locations of features is used. Therefore, although point cloud information is assumed to include information on locations, luminance levels, and time points in this embodiment, if only road shoulder edges are to be plotted, feature measurement information may include only information on locations of features.
- A method for generating candidate location information of a demarcation line as a boundary in the divided-area candidate location
information generation unit 7 will now be described. -
Fig. 2 is a conceptual diagram representing a method for generating candidate location information of a demarcation line based on point cloud information, on a certain plane set as an area by thedivision unit 6. The range of the demarcation line is recognized by utilizing the fact that there is a large difference between the luminance level of the road surface and the luminance level of the demarcation line, and based on the range, candidate locations for the demarcation line are generated. - In the graph illustrated in
Fig. 2 , the vertical axis indicates the luminance level of features, and the horizontal axis indicates a right direction of the road on a certain plane set by thedivision unit 6. The right direction of the road is a direction that is perpendicular to both a traveling direction of the vehicle and a vertically upward direction and such that the right side facing the traveling direction of the vehicle is positive. - The coordinate origin may be set in any way. In this embodiment, it is assumed that the origin position of the vertical axis is 0 [W/sr/m2], and the origin position of the horizontal axis is the location of a point with the smallest coordinate value in the right direction of the road among points included in a point cloud on the plane.
- The divided-area candidate location
information generation unit 7 checks the luminance levels of points included in point cloud information sequentially in order of coordinate values in the right direction of the road, starting with a point with the smallest coordinate value, and detects a point with a sharp increase in the luminance level, that is, apoint 8 whose luminance level differs from that of the immediately preceding point by more than a threshold value (for example, a set value of displacement in reflection intensity of reflected light of laser light), and detects a point with a sharp decrease in the luminance level, that is, apoint 9 whose luminance level differs from that of the next point by more than the threshold value. Then, if the interpoint width between thepoint 8 and thepoint 9 is within a specified width between the upper and lower limits, acandidate point 10 is generated by treating the location of the median point among points with high luminance levels, that is, points between thepoint 8 and thepoint 9, as a candidate element for the demarcation line, and candidate location information indicating the location of thecandidate point 10 is generated. Alternatively, if the interpoint width between thepoint 8 and thepoint 9 is within the specified width between the upper and lower limits, thecandidate point 10 may be generated by treating the location of the center between thepoint 8 and thepoint 9 as a candidate element for the demarcation line, and candidate location information indicating the location of thecandidate point 10 may be generated. - This will generate a sequence of points of the candidate points 10 for the demarcation line that line up in the traveling direction of the vehicle over the demarcation line.
- The method for generating candidate location information indicating locations of candidate elements for the demarcation line is not limited to the above method. For example, instead of using a difference in the luminance level from an adjacent point as described above, candidate location information may be generated by setting a
threshold value 11 for the luminance value in advance, as illustrated inFig. 3 , extracting apoint 12, apoint 13, apoint 14, and apoint 15, included in point cloud information, whose luminance levels are equal to or above the threshold value, treating the extracted points as the range of the demarcation line, and if the width of the range of the demarcation line is within the specified width between the upper and lower limits, treating the location of the median point of these points as acandidate point 16 for the demarcation line. - Next, a method for generating candidate location information of a road shoulder edge will be described.
-
Fig. 4 is a conceptual diagram representing a method for generating candidate location information of a road shoulder edge based on point cloud information, on a certain plane set by thedivision unit 6. By utilizing the fact that there is a difference between the height of a road surface and the height of a road shoulder, the location of the road shoulder is recognized, and based on the location of the road shoulder, candidate locations for the road shoulder edge are generated. - In the graph illustrated in
Fig. 4 , the vertical axis indicates a height direction, that is, a vertically upward direction on a certain plane set by thedivision unit 6, and the horizontal axis indicates the right direction of the road on this plane. The coordinate origin may be set in any way. In this embodiment, it is assumed that the origin position of the vertical axis is the location of a point at the lowest position among points included in a point cloud on the plane, and the origin position of the horizontal axis is the location of a point with the smallest coordinate value in the right direction of the road among the points included in the point cloud on the plane. - First, the divided-area candidate location
information generation unit 7 classifies the point cloud information into points on the road and points of features other than the road, such as a curb and an installed object. This classification is performed by setting athreshold value 17 for the height direction in advance, and treating a point whose height is less than thethreshold value 17 as the road and a point whose height is equal to or above thethreshold value 17 as a point other than the road. Then, a point closest to the traveling trajectory indicated by the trajectory information obtained from theinformation acquisition unit 2 is extracted from among the points classified as points other than the road. In the example inFig. 4 , although the traveling trajectory is not illustrated, it is positioned to the left of the origin point, so that apoint 18 is extracted as the point closest to the traveling trajectory. Then, the location of the foot of a perpendicular line drawn from the location of the extractedpoint 18 to aroad surface 19 is treated as a candidate point 20 for the road shoulder edge, and candidate location information indicating the location of the candidate point 20 is generated. Theroad surface 19 may be set in advance as with thethreshold value 17, or an approximate plane may be generated based on the points determined as points on the road because of a height less than thethreshold value 17, and the generated plane may be treated as the road surface. - A road shoulder edge is a boundary indicating an edge of the road. To be used as map information, only one road shoulder edge is set on each of the right and left sides of the traveling direction of the vehicle, so that it is arranged that only one candidate point for the road shoulder edge be generated on each of the right and left sides of the traveling trajectory on one plane.
- This will generate a sequence of points of the candidate points 20 for the road shoulder edge that are lined up in the traveling direction of the vehicle.
- How to obtain candidate points for a road shoulder edge is not limited to the method described above. For example, the coordinate values, in the height direction, of points included in point cloud information may be sequentially checked, starting with a point with the smallest coordinate value in the right direction of the road, to extract a point with a sharp decrease in the coordinate value and a point with a sharp increase in the coordinate value, and these points may be treated as candidate points for the road shoulder edge. As illustrated in
Fig. 5 , if the outer side of the shoulder of the road is a slope such as a bank, and moreover, a point cloud is sparse, an approximatestraight line 21 may be generated based on a plurality of points with sharp changes in the height, and the intersection between this straight line and the road surface may be treated as acandidate point 22 for the road shoulder edge. - The divided-area candidate location
information generation unit 7 obtains a candidate point for a boundary of a feature, that is, a candidate point for a demarcation line or a road shoulder edge, for each area set by thedivision unit 6, and generates candidate location information indicating the location of each candidate point, as described above. - The selection unit 4 selects candidate elements for a boundary from among the candidate elements generated by the candidate location
information generation unit 3, based on the trajectory information acquired by theinformation acquisition unit 2. Specifically, the selection unit 4 of this embodiment selects candidate points for the road shoulder edge or the demarcation line corresponding to the traveling trajectory, based on the distance between the candidate point (thecandidate point 10 or thecandidate point 16 for the demarcation line, the candidate point 20 or thecandidate point 22 for the road shoulder edge) in each area obtained by the operation of the candidate locationinformation generation unit 3 and the traveling trajectory. Furthermore, in this embodiment, the selection unit 4 selects candidate points for the road shoulder edge or the demarcation line corresponding to the traveling trajectory, based on the distance between candidate elements, that is, the distance between candidate points. - First, a method for selecting candidate points for a demarcation line by the selection unit 4 will be described.
- To select candidate points for the demarcation line, grouping of candidate points is performed first.
Fig. 6 is a conceptual diagram representing a method for grouping of candidate points for the demarcation line, and illustrates ademarcation line 48 and adeceleration line 49 that are depicted on the road and a travelingtrajectory 26 projected on the road surface.Fig. 6 also illustrates candidate points 23, 24, and 25, which are among the candidate points obtained in the divided-area candidate locationinformation generation unit 7. - The selection unit 4 calculates the distance between the candidate point for the demarcation line generated on each plane by the divided-area candidate location
information generation unit 7 and the traveling trajectory, and the distance between the candidate points. - Then, with regard to certain two candidate points, if the distance between the two candidate points is within a predetermined range and a difference between the two candidate points concerning the distance to the traveling trajectory is within a predetermined range, these two candidate points are grouped into the same group. In
Fig. 6 , a difference between the distance between thecandidate point 23 and the travelingtrajectory 26 and the distance between thecandidate point 24 and the travelingtrajectory 26 is small, so that thecandidate point 23 and thecandidate point 24 are grouped into the same group. With regard to thecandidate point 23 and thecandidate point 25, a difference between these points concerning the distance to the travelingtrajectory 26 is large, so that these candidate points are not grouped into the same group. - The selection unit 4 performs this process for all combinations of candidate points, and performs grouping for all the candidate points. In this embodiment, the above process is performed on all combinations of candidate points. However, the above process may be performed only on combinations of points whose measurement time points are close to each other, based on measurement time point information included in point cloud information.
- The demarcation line is assumed to be approximately parallel to the traveling trajectory. Thus, if candidate points generated by the divided-area candidate location
information generation unit 7 exist on the demarcation line, these candidate points will be grouped into the same group. However, if a point on the road surface other than the demarcation line is extracted as a candidate point because its luminance level is high due to the influence of a line different from the demarcation line, such as thedeceleration line 49 illustrated inFig. 6 , or a smudge, for example, the distance between this point and the traveling trajectory is different when compared with points on the demarcation line, so that this point is not grouped into the same group as the candidate points on the demarcation line. Therefore, it is possible to form a group of points on the demarcation line not including disturbance factors such as a smudge on the road surface or a line other than the demarcation line. - Furthermore, the selection unit 4 performs integration of groups. That is, if a candidate point included in a certain group is also included in another different group, all the candidate points included in these two groups are integrated into one group. This will create a group including many candidate points corresponding to the traveling
trajectory 26. - In this way, it is possible to separately form a group of candidate points to be used for generating a boundary and a group due to a disturbance factor such as a smudge on the road surface.
- Although the traveling trajectory is depicted as an arrow in
Fig. 6 , the traveling trajectory is a sequence of points indicating the location of the vehicle at each time point measured by the GNSS receiver in this embodiment. Therefore, in this embodiment in actuality, the distance between each candidate point for the demarcation line and each point included in the sequence of points of the traveling trajectory is calculated, and the distance between each candidate point and a point of the traveling trajectory located at the shortest distance is treated as the distance between each candidate point and the traveling trajectory. When the traveling trajectory is treated as a curve instead of sequence of points, a perpendicular line may be drawn downward from each candidate point to the curve and the length of the perpendicular line may be calculated. - In the method of grouping candidate points described above, if a difference between candidate points concerning the distance to the traveling trajectory is within the predetermined range, these points are grouped together. Thus, also in the case of points other than those on the demarcation line, if a difference between points on a deceleration line or a road marking, for example, concerning the distance to the traveling trajectory is within the predetermined range, these points may be grouped together. However, such a group is not a group of points on the demarcation line and thus needs to be excluded from candidates for the boundary. To do this, a group in which the number of candidate points included in the group is equal to or smaller than a predetermined number is extracted, and if this group is located between groups whose numbers of candidate points are larger than that of this group and the distance to each of the larger groups is shorter than the size of a preset road width, this group is deleted. This can prevent points on a road marking and the like from being used for generating location information of the demarcation line.
- The above process will be described using
Fig. 7. Fig. 7 is a conceptual diagram representing a method for excluding a group other than those of the demarcation line from a plurality of groups. InFig. 7 , agroup 29 and agroup 30, each formed by candidate points on the demarcation line, are large groups, and agroup 27 on a deceleration line and agroup 28 on a road marking that are between the large groups are excluded. - The method for selecting candidate points for the demarcation line is not limited to the method described above. Instead of calculating the distance between each candidate point and the traveling trajectory with regard to two candidate points, candidate points whose absolute values of the distance to the traveling trajectory are within a predetermined range may be grouped together.
- Next, a method for selecting candidate elements for the road shoulder edge by the selection unit 4 will be described.
- On each plane, there is only one candidate point for the road shoulder edge generated by the divided-area candidate location
information generation unit 7 on each of the right and left sides of the traveling trajectory of the vehicle. For this reason, the selection unit 4 groups the candidate points for the road shoulder edge that are located on the left side of the traveling trajectory on the respective planes into one group, and groups candidate points for the road shoulder edge that are located on the right side of the traveling trajectory on the respective planes into one group different from the group for the road shoulder edge on the left side. - In measurement by the laser scanner installed in the mobile mapping system, point cloud information may not be able to be measured accurately due to the influence of plants or another vehicle travelling alongside. If grouping is performed by the above method using candidate points obtained from such point cloud information not measured accurately, since each candidate point is grouped into either of two groups by being distinguished based on whether it is located on the right or left side of the traveling trajectory, a road shoulder edge that is not smooth, having sharp irregularities sideways, that is, a road shoulder edge deviated from the shape of the real road shoulder edge may be generated. Therefore, a section including large irregularities sideways in the positions of candidate points is determined as erroneous detection, and the boundary location
information generation unit 5 excludes the candidate points included in the section determined as erroneous detection from the candidate points to be used for generating location information of the road shoulder edge. - In order to select appropriate candidate points, the selection unit 4 compares the shape of each group of candidate points with the shape of the traveling trajectory. Specifically, the distance between each candidate point in the group and the traveling trajectory is calculated, and a point at which this distance changes from that of an adjacent candidate point by a predetermined threshold value or more is extracted. By this, candidate points corresponding to the traveling trajectory can be selected by detecting a portion where a point cloud is not acquired accurately due to the influence of plants or another vehicle traveling alongside and there are candidate points that might generate a road shoulder edge deviated from the shape of the real road shoulder edge, and excluding the detected candidate points.
- In this way, the candidate points to be used for generating a boundary can be selected by excluding disturbance factors such as smudges on the road surface.
- The above method will be specifically described using
Fig. 8. Fig. 8 is a conceptual diagram representing a method for selecting candidate points for a road shoulder edge. - Since the points included in point cloud information have information on measurement time points, the distance between each candidate point and the traveling trajectory is checked sequentially, starting with a point with the earliest time point and ending with a point with the latest time point. Then, points with a sharp change in the distance to the traveling trajectory, that is, a
candidate point 32 whose distance to the traveling trajectory indicates a sharp decrease from that of anext candidate point 34 and acandidate point 33 whose distance to the traveling trajectory indicates a sharp increase from that of the precedingpoint 35 are detected. Thecandidate point 32 and thecandidate point 33 are ends of the candidate points that are detected accurately, and thecandidate point 34, thecandidate point 35, and candidate points 36 between thecandidate point 34 and thecandidate point 35 are detected erroneously, so that these candidate points are excluded from the candidate points to be used for generating location information of the road shoulder edge. - The method for comparing the shape of a group of candidate points with the shape of the traveling trajectory is not limited to the above method. For example, an approximate curve represented by a cubic function may be generated from a plurality of candidate points, for each fixed distance interval including a plurality of candidate points, and whether each candidate point should be used for generating location information of the road shoulder edge may be determined based on the distance between the approximate curve and the candidate point. In the case of a normal smooth road shoulder edge, it can be represented by a cubic function, so that the distance between the approximate curve and a candidate point is small. If irregularities sideways are caused by plants or the like, the distance between the approximate curve and a candidate point increases, so that it can be determined as erroneous detection.
- The boundary location
information generation unit 5 generates boundary location information indicating determined locations of a boundary, using the candidate elements selected by the selection unit 4. In this embodiment, the boundary locationinformation generation unit 5 generates location information of a demarcation line or location information of a road shoulder edge, based on the traveling trajectory indicated by the traveling trajectory information and using the candidate points selected by the selection unit 4. - First, a method for generating location information of a demarcation line will be described.
- The boundary location
information generation unit 5 connects groups grouped as the candidates for the demarcation line by the selection unit 4. - In measurement of reflection luminance levels of features by the laser scanner, if there is a section where the demarcation line is faded, the measured reflection luminance level of the demarcation line in that section may not be very high. In such a situation, the candidate location
information generation unit 3 may not be able to properly generate candidate elements in that section, and groups of the demarcation line generated by the selection unit 4 may be scattered at locations separated from each other. Therefore, the boundary locationinformation generation unit 5 connects groups of the demarcation line existing at separate locations, based on the traveling trajectory of the surveying vehicle. - To connect groups signifies to generate candidate points for the demarcation line between two groups at predetermined distance intervals. The traveling trajectory of the surveying vehicle is used to connect the groups. Specifically, candidate points are generated such that an approximate curve formed by the two groups and the candidate points generated between the two groups is shaped similarly to the nearby traveling trajectory.
- To connect the groups, the location of the median point of the candidate points included in each group is calculated first, and then the distance between the median point of one of the groups and each candidate point included in the other group is calculated. Some candidate points are extracted sequentially in order of this distance, starting with a point with the shortest distance. By interchanging the roles of the groups, some candidate points are also extracted from the other group. Then, the location of the median point of the extracted candidate points are calculated in each of the groups, and the distance between the median points of the two groups is calculated. If this distance between the median points is within a predetermined range, the two groups are connected.
- The above method will be specifically described using
Fig. 9. Fig. 9 is a conceptual diagram representing a method for connecting two groups of a demarcation line. - Using a traveling
trajectory 37 of the surveying vehicle, agroup 38 and agroup 39 of the demarcation line are connected. If thegroup 38 and thegroup 39 are within a predetermined distance range, candidate points 40 are generated so that these two groups are joined. These candidate points 40 are generated such that an approximate curve formed by thegroup 38, thegroup 39, and the candidate points 40 is shaped similarly to the travelingtrajectory 37. - In
Fig. 9 , the traveling trajectory is depicted as an arrow. In this embodiment, however, the traveling trajectory is a sequence of points indicating the location of the vehicle at each time point when measured by the GNSS receiver. Therefore, in this embodiment, an approximate curve is first generated from the sequence of points measured by the GNSS receiver, and two groups are connected such that this approximate curve is shaped similarly to an approximate curve formed by the two groups and newly generated candidate points. However, the method for connecting two groups is not limited to the above method, and groups may be connected using only the positional relationship with a sequence of points measured by the GNSS receiver without generating an approximate curve. For example, two groups may be connected by generating new candidate points such that the angles formed by connecting the points measured by the GNSS receiver with straight lines are the same as the angles formed by connecting the candidate points newly generated between the two groups with straight lines. - Then, location information of the finally obtained candidate points for the demarcation line is generated as location information of the demarcation line indicating determined locations of the demarcation line.
- Next, a method for generating location information of a road shoulder edge will be described.
- As illustrated in
Fig. 10 , the boundary locationinformation generation unit 5 deletes thecandidate point 34, thecandidate point 35, and the candidate points 36 included in the section determined as erroneously detected as the candidates for the road shoulder edge by the selection unit 4. For the deleted section, candidate points 41 are generated utilizing the shape of the travelingtrajectory 31. A specific method is the same as the method used to connect groups in generating location information of the demarcation line. The candidate points 41 are generated such that an approximate curve formed by candidate points other than thecandidate point 34, thecandidate point 35, and the candidate points 36 that have been excluded is shaped similarly to the nearby travelingtrajectory 31. - When candidate points for a road shoulder edge are to be generated, if a demarcation line has already been plotted, the demarcation line may be used instead of the traveling trajectory. Conversely, when groups of a demarcation line are to be connected, if a road shoulder edge has already been plotted, the road shoulder edge may be used instead of the traveling trajectory.
- As to the generation of location information of a road shoulder edge where there are significant irregularities sideways and a significant deviation from the shape of the real road shoulder edge, location information may be generated by using only points that are locally near the trajectory of the vehicle itself and connecting these points, as illustrated in
Fig. 11 . InFig. 11 , location information of aroad shoulder edge 47 can be generated by using acandidate point 43, acandidate point 44, acandidate point 45, acandidate point 46, and a travelingtrajectory 42 and generating new candidate points (not illustrated) on a line connecting thecandidate point 43, thecandidate point 44, thecandidate point 45, and thecandidate point 46. - In automated driving of a vehicle, it is often the case that driving is set so as not to stray outside the road shoulder edge. Thus, by generating a road shoulder edge, using only candidate points at a shorter distance to the traveling trajectory, that is, the center of the road, the vehicle can travel without contacting the road shoulder even in a situation in which the vehicle travels by automated driving while being positioned near the road shoulder edge.
- The boundary location
information generation unit 5 generates location information of the candidate points for the road shoulder edge thus obtained, as location information of the road shoulder edge indicating determined locations of the road shoulder edge. - The
information processing device 1 in the first embodiment is configured as described above. The functions of theinformation processing device 1 are realized by hardware illustrated inFig. 12 . - The hardware illustrated in
Fig. 12 includes aprocessing device 50 such as a central processing unit (CPU), astorage device 51 such as a read only memory (ROM) or a hard disk, and acommunication device 52. - The
information acquisition unit 2 illustrated inFig. 1 is realized by thecommunication device 52, and the candidate locationinformation generation unit 3, the selection unit 4, and the boundary locationinformation generation unit 5 are realized by execution of programs stored in thestorage device 51 by theprocessing device 50. - The
information acquisition unit 2 may be configured such that theprocessing device 50 acquires feature measurement information and trajectory information that are stored in advance in thestorage device 51, instead of using thecommunication device 52. - The method for realizing the functions of the
information processing device 1 is not limited to a combination of hardware and software as described above, and the functions may be realized solely by hardware such as a large-scale integrated circuit (LSI), which is a processing device in which programs are implemented, or some of the functions may be realized by dedicated hardware and some of the functions may be realized by a combination of a processing device and programs. - The
information processing device 1 in this embodiment is configured as described above. - Operation of the
information processing device 1 in this embodiment will now be described. - First, operation when a demarcation line is plotted will be described using a flowchart of
Fig. 13. - In step S1, the
information acquisition unit 2 acquires, from an external server or the like, point cloud information measured by the mobile mapping system, as feature measurement information, and traveling trajectory information indicating the traveling trajectory of the surveying vehicle in which the mobile mapping system is installed when measuring the point cloud information, as trajectory information. - In step S2, the division unit 6 sets a plurality of planes perpendicular to the traveling trajectory of the surveying vehicle, based on the traveling trajectory information acquired by the information acquisition unit 2 in step S1, and divides the point cloud information acquired by the information acquisition unit 2 into sets respectively belonging to the planes that have been set.
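- A minimal sketch of this division (an illustration under assumed data layouts, not the embodiment itself) assigns each cloud point to the trajectory sample whose perpendicular plane lies closest to it, measured along the local travel direction:

```python
import numpy as np

def divide_by_perpendicular_planes(points, traj):
    """Hypothetical sketch of step S2: divide a point cloud into sets belonging to
    planes set perpendicular to the traveling trajectory.
    points : (M, 3) x, y, z cloud points (assumed layout)
    traj   : (N, 2) trajectory samples, ordered by measurement time (assumed)."""
    points = np.asarray(points, dtype=float)
    traj = np.asarray(traj, dtype=float)

    # Unit travel direction at each trajectory sample (normal of its plane).
    tangents = np.gradient(traj, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)

    # Signed offset of every point from every plane along that plane's normal;
    # the plane with the smallest absolute offset claims the point.
    rel = points[:, None, :2] - traj[None, :, :]
    offsets = np.einsum('mnk,nk->mn', rel, tangents)
    plane_index = np.abs(offsets).argmin(axis=1)

    return [points[plane_index == i] for i in range(len(traj))]
```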
- In step S3, the divided-area candidate location information generation unit 7 generates candidate location information indicating locations of candidate points for the demarcation line, based on the luminance level information included in the point cloud information on each plane set by the division unit 6 in step S2.
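- As a rough illustration only, a candidate point might be derived from the luminance levels on one plane as sketched below; the fixed threshold and the use of the centroid of the bright points are assumptions for the example and are not the selection rule of the embodiment.

```python
import numpy as np

def demarcation_candidate_on_plane(plane_points, luminance, threshold=0.7):
    """Hypothetical sketch of step S3 on a single plane: painted demarcation lines
    reflect strongly, so points whose luminance exceeds a threshold are taken as
    belonging to the line and their centroid is returned as the candidate point.
    plane_points : (K, 3) x, y, z points on the plane; luminance : (K,) levels."""
    plane_points = np.asarray(plane_points, dtype=float)
    luminance = np.asarray(luminance, dtype=float)

    bright = luminance >= threshold
    if not bright.any():
        return None                          # no demarcation-line candidate here
    return plane_points[bright].mean(axis=0) # representative candidate point
```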
- In step S4, the selection unit 4 performs the grouping process and the group integration process described above on the candidate points indicated by the candidate location information generated by the divided-area candidate location information generation unit 7 in step S3. - Then, in steps S5 to S11, the selection unit 4 selects groups of candidate points to be used for generating a boundary and excludes groups caused by a disturbance factor such as a smudge on the road surface, from among the groups generated in step S4. In step S5, the number of candidate points included in each group and the distance between each pair of groups are calculated. Then, in steps S6 to S8, it is determined whether each group meets three conditions for being excluded from the candidates to be used for generating location information of the demarcation line. In step S9, a group that meets all three conditions is excluded. In step S10, a group that does not meet all of the conditions is selected as a candidate to be used for generating location information.
- Specifically, in step S6, it is determined whether the target group contains no more than a predetermined number of candidate points, that is, whether it is a group of a predetermined size or smaller. In step S7, it is determined whether there is a larger group near the target group. In step S8, it is determined whether the small target group is located between larger groups. In step S9, a group that meets all the conditions of steps S6 to S8 is excluded from the candidates to be used for generating location information. In step S10, a group that does not meet all of the conditions is selected as a candidate to be used for generating location information.
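- The following minimal sketch applies these three conditions to a list of groups; the size and proximity thresholds, and the use of the group order along the road to decide whether a small group lies between larger ones, are illustrative assumptions.

```python
import numpy as np

def select_groups(groups, size_threshold=5, near_threshold=2.0):
    """Hypothetical sketch of steps S5-S10: a group is excluded only when it is
    small (S6), a larger group lies nearby (S7), and larger groups exist both
    before and after it along the road (S8); all other groups are kept (S10).
    groups : list of (K_i, 2) candidate-point arrays, ordered along the road."""
    def gap(a, b):                           # step S5: distance between two groups
        return np.min(np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2))

    kept = []
    for i, g in enumerate(groups):
        small = len(g) <= size_threshold                                   # S6
        larger_near = any(len(o) > len(g) and gap(g, o) <= near_threshold
                          for j, o in enumerate(groups) if j != i)         # S7
        between = (any(len(groups[j]) > len(g) for j in range(i)) and
                   any(len(groups[j]) > len(g) for j in range(i + 1, len(groups))))  # S8
        if small and larger_near and between:
            continue                          # S9: excluded as a likely smudge
        kept.append(g)                        # S10: candidate for the boundary
    return kept
```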
- In step S11, it is determined whether the process of steps S6 to S10 has been performed on all the groups. If the process has not been performed on all the groups, processing returns to step S6 and the process of steps S6 to S10 is performed on a next group.
- If it is determined in step S11 that the above process has been performed on all the groups, processing proceeds to step S12.
- In step S12, the boundary location information generation unit 5 determines whether there are two or more groups to be candidates for the demarcation line. If there are two or more groups, processing proceeds to step S13. If there is only one group, processing proceeds to step S14. - If the boundary location information generation unit 5 determines in step S12 that there are two or more groups to be candidates for the demarcation line, the boundary location information generation unit 5 connects the groups in step S13. Specifically, the boundary location information generation unit 5 generates new candidate points for the demarcation line between two groups, based on the traveling trajectory. The new candidate points are generated such that the approximate curve formed by the candidate points is shaped similarly to the traveling trajectory.
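- This connection uses the same shape-matching interpolation sketched earlier for the road shoulder edge; purely as a usage illustration (with made-up coordinates and the hypothetical interpolate_gap function from that earlier sketch), two groups could be bridged as follows:

```python
import numpy as np

# Hypothetical data: a traveling trajectory and two demarcation-line groups
# separated by a gap (e.g. a worn section of paint).
traveling_trajectory = np.array([[0.0, 0.0], [5.0, 0.5], [10.0, 1.5], [15.0, 2.0]])
group_a = np.array([[0.0, 3.0], [2.0, 3.2]])
group_b = np.array([[13.0, 4.8], [15.0, 5.0]])

# Bridge the gap with new candidate points shaped like the trajectory,
# then concatenate everything for step S14 (interpolate_gap is the earlier sketch).
bridge = interpolate_gap(traveling_trajectory, group_a[-1], group_b[0], n_new=6)
connected = np.vstack([group_a, bridge, group_b])
```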
- As a result of the operation of steps S1 to S13 described above, the candidate points for the demarcation line are obtained. Then, in step S14, the boundary location information generation unit 5 generates location information of the finally obtained candidate points for the demarcation line, as location information of the demarcation line indicating determined locations of the demarcation line. That is, in step S14, the boundary location information generation unit 5 generates boundary location information indicating determined locations of the boundary, as the process of determining the boundary location information to be finally output. - The demarcation line indicated by the location information generated by the operation described above is produced with disturbance factors such as smudges on the road excluded, and is therefore a smooth line corresponding to the traveling trajectory along the road; when it is used for map information, location information of the demarcation line that properly reflects the real demarcation line can be obtained.
- Next, operation when a road shoulder edge is to be plotted will be described using a flowchart of
Fig. 14. - The operation in steps S21 and S22 is substantially the same as that in steps S1 and S2, respectively, when plotting a demarcation line.
- In step S21, the
information acquisition unit 2 acquires point cloud information as feature measurement information and traveling trajectory information as trajectory information. - In step S22, the
division unit 6 sets a plurality of planes perpendicular to the traveling trajectory of the surveying vehicle, based on the traveling trajectory information acquired by the information acquisition unit 2, and divides the point cloud information acquired by the information acquisition unit 2 into sets respectively belonging to the planes that have been set. - In step S23, the divided-area candidate location information generation unit 7 generates candidate location information indicating locations of candidate points for the road shoulder edge, based on the height information included in the point cloud information on each plane set by the division unit 6 in step S22. - In step S24, the selection unit 4 performs the grouping process described above on the candidate points indicated by the candidate location information generated by the divided-area candidate location information generation unit 7 in step S23. Since only one road shoulder edge is to be set on each of the right and left sides of the traveling trajectory, the selection unit 4 groups the pairs of candidate points selected on the respective planes such that the candidate points on the left side of the traveling trajectory are gathered into one group and the candidate points on the right side of the traveling trajectory are gathered into another, separate group.
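- A minimal sketch of this left/right grouping (an illustration under assumed planar coordinates) uses the sign of the cross product between the local travel direction and the offset from the trajectory to each candidate point:

```python
import numpy as np

def split_left_right(candidates, traj):
    """Hypothetical sketch of step S24: candidates to the left of the traveling
    trajectory go into one group and candidates to the right into another.
    candidates, traj : (N, 2) arrays of x, y coordinates (assumed layout)."""
    candidates = np.asarray(candidates, dtype=float)
    traj = np.asarray(traj, dtype=float)

    tangents = np.gradient(traj, axis=0)                # local travel direction
    d = np.linalg.norm(candidates[:, None, :] - traj[None, :, :], axis=2)
    j = d.argmin(axis=1)                                # nearest trajectory sample

    rel = candidates - traj[j]                          # offset from the trajectory
    cross = tangents[j, 0] * rel[:, 1] - tangents[j, 1] * rel[:, 0]
    return candidates[cross > 0], candidates[cross <= 0]   # left group, right group
```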
- Furthermore, within each group formed in step S24, the selection unit 4 excludes candidate points that are likely to cause irregularities in the road shoulder edge to be generated. To do so, the selection unit 4 checks the distance between each candidate point and the traveling trajectory sequentially in order of the measurement time points included in the point cloud information, from the candidate point with the earliest time point to the candidate point with the latest time point, extracts candidate points at which the distance to the traveling trajectory changes sharply, treats the section between the extracted candidate points as an exclusion section, and excludes the candidate points in the exclusion section from the candidate points to be used for generating location information of the road shoulder edge. Specifically, the selection unit 4 performs the following steps S25 and S26. In step S25, the distance between each candidate point in the group and the traveling trajectory is calculated. Then, in step S26, an exclusion section is detected by comparing the distances calculated in step S25 between candidate points that are adjacent to each other when the candidate points are arranged in order of the measurement time points from the earliest to the latest. This detection of an exclusion section is performed as described below, for example.
- For each candidate point and the two candidate points adjacent to it on either side, it is determined whether the difference in the distance relative to one adjacent candidate point and the difference in the distance relative to the other adjacent candidate point are each equal to or greater than a predetermined threshold value. If neither difference is equal to or greater than the threshold value, the candidate point can be presumed to belong to a smooth road shoulder edge, and it is kept as a candidate point in the group. If the difference in the distance is equal to or greater than the threshold value with respect to at least one of the two adjacent candidate points, the candidate point can be presumed to be the start point or end point of an exclusion section, and it is treated as such. After making these determinations for all the candidate points in the group, the exclusion section corresponding to the candidate points 34 to 36 of Fig. 8 can be detected, based on the candidate points at the start and end of the exclusion section, the time series of measurement time points of the candidate points, the number of consecutive candidate points whose difference in the distance is equal to or less than the threshold value, or the like. - Then, proceeding to step S27, the selection unit 4 excludes the candidate points included in the exclusion section detected in step S26 from the candidates to be used for generating location information of the road shoulder edge.
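- A minimal sketch of steps S25 and S26, assuming that the distance from each candidate point to the traveling trajectory is already available in measurement-time order, is given below; the threshold value and the pairing of successive jump positions into one section are assumptions for the example.

```python
import numpy as np

def find_exclusion_sections(dist_to_traj, threshold=1.0):
    """Hypothetical sketch of steps S25-S26: a sharp change in the distance to the
    traveling trajectory between time-adjacent candidate points marks the start or
    end of an exclusion section; indices between two successive marks are excluded.
    dist_to_traj : distances of the candidate points, in measurement-time order."""
    dist = np.asarray(dist_to_traj, dtype=float)
    jumps = np.abs(np.diff(dist)) >= threshold     # sharp change between neighbours
    marks = np.flatnonzero(jumps)                  # jump lies between index i and i+1

    sections = []
    for start, end in zip(marks[::2], marks[1::2]):        # pair marks as start/end
        sections.append(list(range(start + 1, end + 1)))   # indices to exclude (S27)
    return sections

# Example: points 3-5 lie far from the trajectory and form one exclusion section.
print(find_exclusion_sections([5.0, 5.1, 5.0, 9.0, 9.2, 9.1, 5.0, 5.1]))  # [[3, 4, 5]]
```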
- In step S28, the boundary location
information generation unit 5 generates candidate points to interpolate the exclusion section excluded by the selection unit 4 in step S27. The specific method is substantially the same as the method by which two groups are connected when plotting a demarcation line, and new candidate points are generated such that an approximate curve formed by the candidate points, other than the excluded candidate points, and the newly generated candidate points is shaped similarly to the nearby traveling trajectory. - As a result of the operation of steps S21 to S28 as described above, the candidate points for the road shoulder edge are obtained. Then, in step S29, the boundary location
information generation unit 5 generates location information of the finally obtained candidate points for the road shoulder edge, as location information of the road shoulder edge indicating determined locations of the road shoulder edge. That is, in step S29, the boundary location information generation unit generates boundary location information indicating determined locations of the boundary, as a process of determining boundary location information to be finally output. - The road shoulder edge indicated by the location information generated by the operation as described above is generated by eliminating disturbance factors such as smudges on the road surface, and thus is a smooth line corresponding to the traveling trajectory along the road. Therefore, when it is used for map information, location information of the road shoulder edge properly reflecting the real road shoulder edge can be obtained.
- By the operation of the
information processing device 1 as described above, when boundaries of a road such as a demarcation line and a road shoulder edge are to be plotted, candidate points corresponding to the traveling trajectory of the surveying vehicle traveling along the road are selected by the selection unit, and the boundary location information generation unit generates boundary location information based on the selected candidate points. As a result, boundary location information of the road can be generated in which inaccuracies due to disturbance factors are reduced. - An information processing device according to the present invention is applicable to the plotting of a dynamic map.
- 1: information processing device, 2: information acquisition unit, 3: candidate location information generation unit, 4: selection unit, 5: location information generation unit, 6: division unit, 7: divided-area candidate location information generation unit, 8: point, 9: point, 10: candidate point, 11: threshold value, 12: point, 13: point, 14: point, 15: point, 16: candidate point, 17: threshold value, 18: point, 19: road surface, 20: candidate point, 21: approximate straight line, 22: candidate point, 23: candidate point, 24: candidate point, 25: candidate point, 26: traveling trajectory, 27: group, 28: group, 29: group, 30: group, 31: traveling trajectory, 32: candidate point, 33: candidate point, 34: candidate point, 35: candidate point, 36: candidate point, 37: traveling trajectory, 38: group, 39: group, 40: candidate point, 41: candidate point, 42: traveling trajectory, 43: candidate point, 44: candidate point, 45: candidate point, 46: candidate point, 47: road shoulder edge, 48: demarcation line, 49: deceleration line, 50: processing device, 51: storage device, 52: communication device
Claims (9)
- An information processing device (1) comprising:an information acquisition unit (2) configured to acquire feature measurement information comprising point cloud information indicating locations of surfaces of features including a road having a boundary, and to acquire trajectory information indicating a movement trajectory of a mobile object moving along the road at a time of measuring of the point cloud information;a candidate location information generation unit (3) configured to generate candidate location information indicating locations of candidate elements for the boundary, based on the feature measurement information acquired by the information acquisition unit (2);a selection unit (4) configured to select candidate elements from among the candidate elements indicated by the candidate location information generated by the candidate location information generation unit (3), based on the trajectory information acquired by the information acquisition unit (2); anda boundary location information generation unit (5) configured to generate boundary location information indicating determined locations of the boundary, using the candidate elements selected by the selection unit (4),characterized in thatthe candidate location information generation unit (3) includes a division unit (6) configured to set a plurality of areas along the movement trajectory, and to divide the feature measurement information into sets respectively belonging to the plurality of areas, and a divided-area candidate location information generation unit (7) configured to generate the candidate location information, based on the feature measurement information in each area after being divided by the division unit (6).
- The information processing device (1) according to claim 1,
wherein the selection unit (4) is configured to select candidate elements corresponding to the trajectory, based on a distance between each of the locations of the candidate elements indicated by the candidate location information generated by the candidate location information generation unit (3) and the trajectory indicated by the trajectory information. - The information processing device (1) according to claim 2,
wherein the selection unit (4) is configured to select candidate elements corresponding to the trajectory, based further on a distance between each pair of the candidate elements indicated by the candidate location information generated by the candidate location information generation unit (3). - The information processing device (1) according to any one of claims 1 to 3,
wherein the candidate location information generation unit (3) is configured to generate candidate points for a road shoulder edge as the candidate elements for the boundary, based on the locations of the surfaces of the features indicated by the point cloud information, and to generate candidate location information indicating the locations of the candidate points.
- The information processing device (1) according to any one of claims 1 to 5,
wherein the plurality of areas into which the division unit (6) divides the feature measurement information are a plurality of spaces intersecting the trajectory indicated by the trajectory information. - The information processing device (1) according to any one of claims 1 to 5,wherein the feature measurement information acquired by the information acquisition unit (2) includes measurement time point information indicating measurement time points at which the locations of the features are measured, andwherein the plurality of areas into which the division unit (6) divides the feature measurement information are spaces corresponding to predetermined time intervals based on the measurement time point information.
- An information processing method comprising:an information acquisition step of acquiring feature measurement information comprising point cloud information indicating locations of surfaces of features including a road having a boundary, and trajectory information indicating a movement trajectory of a mobile object moving along the road at a time of measuring of the point cloud information;a candidate location information generation step of generating candidate location information indicating locations of candidate elements for the boundary, based on the feature measurement information acquired in the information acquisition step;a selection step of selecting candidate elements corresponding to the trajectory indicated by the trajectory information acquired in the information acquisition step, from among the candidate elements indicated by the candidate location information generated in the candidate location information generation step; anda boundary location information generation step of generating boundary location information indicating determined locations of the boundary, using the candidate elements selected in the selection step,characterized in thatthe candidate location information generation step includes setting a plurality of areas along the movement trajectory, dividing the feature measurement information into sets respectively belonging to the plurality of areas, and generating the candidate location information, based on the feature measurement information in each area after being divided into the sets.
- A program for causing a computer to execute all the steps recited in claim 8.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018193957 | 2018-10-15 | ||
| PCT/JP2019/038667 WO2020080088A1 (en) | 2018-10-15 | 2019-10-01 | Information processing device |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| EP3869484A1 EP3869484A1 (en) | 2021-08-25 |
| EP3869484A4 EP3869484A4 (en) | 2022-08-24 |
| EP3869484B1 true EP3869484B1 (en) | 2025-02-19 |
Family
ID=70284282
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP19874002.9A Active EP3869484B1 (en) | 2018-10-15 | 2019-10-01 | Information processing device, information processing method, and program |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11959769B2 (en) |
| EP (1) | EP3869484B1 (en) |
| JP (3) | JP7046218B2 (en) |
| WO (1) | WO2020080088A1 (en) |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113994070B (en) | 2019-05-16 | 2025-03-18 | 斯伦贝谢技术有限公司 | Modular Perforating Tools |
| WO2022104220A1 (en) | 2020-11-13 | 2022-05-19 | Schlumberger Technology Corporation | Oriented-perforation tool |
| CN112837333A (en) * | 2021-02-04 | 2021-05-25 | 南京抒微智能科技有限公司 | A kind of outdoor unmanned sweeper welt cleaning method and equipment |
| JP7723566B2 (en) * | 2021-10-07 | 2025-08-14 | 株式会社パスコ | Plotting device, plotting method and program |
| WO2023105693A1 (en) * | 2021-12-08 | 2023-06-15 | 日本電信電話株式会社 | Sidewalk edge detection device, sidewalk edge detection method, and sidewalk edge detection program |
| CN114129092B (en) * | 2021-12-08 | 2022-10-14 | 上海景吾智能科技有限公司 | Cleaning robot cleaning area planning system and method |
| WO2023105702A1 (en) * | 2021-12-09 | 2023-06-15 | 日本電信電話株式会社 | Device, method, and program for detecting wire-like object |
| US12202476B2 (en) * | 2022-05-27 | 2025-01-21 | Ford Global Technologies, Llc | Vehicle map data management |
| WO2024011381A1 (en) * | 2022-07-11 | 2024-01-18 | 上海交通大学 | Point cloud encoding method and apparatus, point cloud decoding method and apparatus, device and storage medium |
| JP2024159243A (en) * | 2023-04-28 | 2024-11-08 | 株式会社日立製作所 | Map information processing device |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014163707A (en) * | 2013-02-21 | 2014-09-08 | Pasco Corp | Road deformation detection device, road deformation detection method and program |
| JP2014191796A (en) * | 2013-03-28 | 2014-10-06 | Pasco Corp | Feature surface detection device, feature surface detection method and program |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5161936B2 (en) | 2010-08-11 | 2013-03-13 | 株式会社パスコ | Data analysis apparatus, data analysis method, and program |
| US9530062B2 (en) * | 2014-12-23 | 2016-12-27 | Volkswagen Ag | Fused raised pavement marker detection for autonomous driving using lidar and camera |
| JP6384604B2 (en) * | 2015-05-28 | 2018-09-05 | 日産自動車株式会社 | Self-position estimation apparatus and self-position estimation method |
| JP6695016B2 (en) | 2015-10-19 | 2020-05-20 | 田中 成典 | Road feature determination device |
| US10013610B2 (en) * | 2015-10-23 | 2018-07-03 | Nokia Technologies Oy | Integration of positional data and overhead images for lane identification |
| JP6529463B2 (en) * | 2016-06-14 | 2019-06-12 | 日本電信電話株式会社 | Road structuring device, road structuring method, and road structuring program |
| JP6776717B2 (en) * | 2016-08-12 | 2020-10-28 | トヨタ自動車株式会社 | Road marking device |
| JP6741603B2 (en) * | 2017-01-16 | 2020-08-19 | 株式会社Soken | Estimator |
| KR102265376B1 (en) * | 2017-03-07 | 2021-06-16 | 현대자동차주식회사 | Vehicle and controlling method thereof and autonomous driving system |
| WO2018212287A1 (en) * | 2017-05-19 | 2018-11-22 | パイオニア株式会社 | Measurement device, measurement method, and program |
| US20210049375A1 (en) * | 2018-01-31 | 2021-02-18 | Pioneer Corporation | Road surface information acquisition method |
| US10852146B2 (en) * | 2018-02-28 | 2020-12-01 | Ford Global Technologies, Llc | Localization technique selection |
| US10521913B2 (en) * | 2018-03-29 | 2019-12-31 | Aurora Innovation, Inc. | Relative atlas for autonomous vehicle and generation thereof |
| WO2019239477A1 (en) | 2018-06-12 | 2019-12-19 | 三菱電機株式会社 | Map generation device and map generation system |
| KR102442230B1 (en) * | 2018-09-30 | 2022-09-13 | 그레이트 월 모터 컴퍼니 리미티드 | Construction method and application of driving coordinate system |
| US20220277163A1 (en) * | 2021-02-26 | 2022-09-01 | Here Global B.V. | Predictive shadows to suppress false positive lane marking detection |
| KR102729020B1 (en) * | 2021-09-08 | 2024-11-13 | 한국과학기술원 | Method and system for building lane-level map by using 3D point cloud map |
-
2019
- 2019-10-01 US US17/273,316 patent/US11959769B2/en active Active
- 2019-10-01 WO PCT/JP2019/038667 patent/WO2020080088A1/en not_active Ceased
- 2019-10-01 JP JP2020553024A patent/JP7046218B2/en active Active
- 2019-10-01 EP EP19874002.9A patent/EP3869484B1/en active Active
-
2022
- 2022-03-22 JP JP2022045497A patent/JP7232946B2/en active Active
-
2023
- 2023-02-20 JP JP2023024183A patent/JP7471481B2/en active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014163707A (en) * | 2013-02-21 | 2014-09-08 | Pasco Corp | Road deformation detection device, road deformation detection method and program |
| JP2014191796A (en) * | 2013-03-28 | 2014-10-06 | Pasco Corp | Feature surface detection device, feature surface detection method and program |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2020080088A1 (en) | 2020-04-23 |
| JP2023059927A (en) | 2023-04-27 |
| JPWO2020080088A1 (en) | 2021-06-10 |
| JP7471481B2 (en) | 2024-04-19 |
| US11959769B2 (en) | 2024-04-16 |
| EP3869484A4 (en) | 2022-08-24 |
| JP2022089828A (en) | 2022-06-16 |
| JP7046218B2 (en) | 2022-04-01 |
| US20210247771A1 (en) | 2021-08-12 |
| EP3869484A1 (en) | 2021-08-25 |
| JP7232946B2 (en) | 2023-03-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3869484B1 (en) | Information processing device, information processing method, and program | |
| US10198632B2 (en) | Survey data processing device, survey data processing method, and survey data processing program | |
| CN113030997B (en) | Method for detecting travelable area of open-pit mine area based on laser radar | |
| KR102125538B1 (en) | Efficient Map Matching Method for Autonomous Driving and Apparatus Thereof | |
| CN114111812B (en) | Method and system for generating and using positioning reference data | |
| EP2767927B1 (en) | Road surface information detection apparatus, vehicle device control system employing road surface information detection apparatus, and carrier medium of road surface information detection program | |
| US11982752B2 (en) | GPS error correction method through comparison of three-dimensional HD-maps in duplicate area | |
| JP6492469B2 (en) | Own vehicle travel lane estimation device and program | |
| US20200124725A1 (en) | Navigable region recognition and topology matching, and associated systems and methods | |
| JP5868586B2 (en) | Road characteristic analysis based on video image, lane detection, and lane departure prevention method and apparatus | |
| Wang et al. | Automatic road extraction from mobile laser scanning data | |
| CN104677361B (en) | A kind of method of comprehensive location | |
| CN112964264A (en) | Road edge detection method and device, high-precision map, vehicle and storage medium | |
| US12479442B2 (en) | Travel road recognition device | |
| CN114119866A (en) | A visual evaluation method of urban road intersection based on point cloud data | |
| CN112014856A (en) | Road edge extraction method and device suitable for cross road section | |
| CN114631040A (en) | Apparatus and method for autonomously locating a mobile vehicle on a railway track | |
| JP2022117835A (en) | Feature data generation device and feature data generation method | |
| JP6837626B1 (en) | Feature data generation system, feature database update system, and feature data generation method | |
| JP7402756B2 (en) | Environmental map generation method and environmental map generation device | |
| US20250052590A1 (en) | Road boundary detection device, road boundary detection method, and road boundary detection program | |
| KR20230092420A (en) | Vehicle lidar system and object detecting method thereof | |
| CN112164098A (en) | Method for predicting local collapse of urban road by using vehicle-mounted LiDAR system | |
| KR102423781B1 (en) | Method of detecting guard-rail using lidar sensor and guard-rail detection device performing method | |
| KR102408402B1 (en) | Method of detecting road-curb using lidar sensor and a road-curb detection device performing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
| 17P | Request for examination filed |
Effective date: 20210224 |
|
| AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
| DAV | Request for validation of the european patent (deleted) | ||
| DAX | Request for extension of the european patent (deleted) | ||
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: G08G 1/16 20060101ALI20220321BHEP Ipc: G06T 7/12 20170101ALI20220321BHEP Ipc: G06T 7/00 20170101ALI20220321BHEP Ipc: G06T 1/00 20060101ALI20220321BHEP Ipc: G01C 21/32 20060101ALI20220321BHEP Ipc: G01C 21/00 20060101AFI20220321BHEP |
|
| REG | Reference to a national code |
Ref legal event code: R079 Free format text: PREVIOUS MAIN CLASS: G08G0001160000 Ref country code: DE Ref legal event code: R079 Ref document number: 602019066295 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G08G0001160000 Ipc: G01C0021000000 |
|
| A4 | Supplementary search report drawn up and despatched |
Effective date: 20220722 |
|
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: G08G 1/16 20060101ALI20220718BHEP Ipc: G06T 7/12 20170101ALI20220718BHEP Ipc: G06T 7/00 20170101ALI20220718BHEP Ipc: G06T 1/00 20060101ALI20220718BHEP Ipc: G01C 21/32 20060101ALI20220718BHEP Ipc: G01C 21/00 20060101AFI20220718BHEP |
|
| GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
| INTG | Intention to grant announced |
Effective date: 20241029 |
|
| GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
| GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
| AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
| REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
| REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
| REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602019066295 Country of ref document: DE |
|
| REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
| REG | Reference to a national code |
Ref country code: DE Ref legal event code: R081 Ref document number: 602019066295 Country of ref document: DE Owner name: MITSUBISHI ELECTRIC CORPORATION, JP Free format text: FORMER OWNER: MITSUBISHI GENERATOR CO., LTD., KOBE-SHI, HYOGO, JP |
|
| REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250519 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250519 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250619 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250620 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250520 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1768636 Country of ref document: AT Kind code of ref document: T Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| P01 | Opt-out of the competence of the unified patent court (upc) registered |
Free format text: CASE NUMBER: UPC_APP_4937_3869484/2025 Effective date: 20250827 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |
|
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20250219 |