WO2015151770A1 - Système de génération de carte tridimensionnelle - Google Patents
Système de génération de carte tridimensionnelle Download PDFInfo
- Publication number
- WO2015151770A1 (PCT/JP2015/057583)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement
- measurement sensor
- map generation
- dimensional map
- distance data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
- G09B29/004—Map manufacture or repair; Tear or ink or water resistant maps; Long-life maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Definitions
- the present invention relates to a three-dimensional map generation system, and more particularly to a technique effective for generating a three-dimensional map using three-dimensional point cloud data.
- 3D maps have been used in various scenes such as efficient renovation and layout confirmation in factories and interior changes in museums and commercial facilities.
- A three-dimensional map is generated from measurements taken with, for example, a three-dimensional laser scanner.
- A laser pulse emitted from the three-dimensional laser scanner strikes the measurement object, and the scanner acquires data from the reflected light.
- The acquired data are point cloud data of three-dimensional coordinates, and the point cloud represents the surface shape of the measurement target.
- A technique is known in which the surface shape of the object to be measured is measured from a plurality of measurement points and the three-dimensional point cloud data acquired at each measurement point are synthesized, eliminating the need to arrange targets when acquiring the three-dimensional point cloud data (see, for example, Patent Document 1).
- An object of the present invention is to provide a technique capable of easily generating a three-dimensional map in a short time.
- a typical 3D map generation system has the following characteristics.
- the three-dimensional map generation system has a first measurement sensor, a second measurement sensor, and a map generation unit.
- the first measurement sensor collects distance data in the horizontal direction with respect to the floor surface.
- the second measurement sensor is attached so that a depression angle or an elevation angle is generated with respect to the floor surface, and collects distance data in the vertical direction with respect to the floor surface.
- the map generation unit generates a three-dimensional map from the distance data acquired by moving the first measurement sensor and the second measurement sensor.
- the map generation unit identifies the position and angle of the first measurement sensor based on the horizontal distance data collected by the first measurement sensor. Then, the identified position and angle are associated with the vertical distance data collected by the second measurement sensor to generate a three-dimensional map.
- Needless to say, the constituent elements described below are not necessarily indispensable unless otherwise specified or clearly essential in principle.
- FIG. 1 is an explanatory diagram showing an example of the configuration of the three-dimensional map generation system 10 according to the present embodiment.
- the 3D map generation system 10 includes measurement sensors 11 and 12, a processing device 13, an input unit 14, a display unit 15, and a battery 16.
- the measurement sensors 11 and 12, the processing device 13, the input unit 14, the display unit 15, and the battery 16 are mounted on the measurement carriage 20.
- the measurement carriage 20 is a carriage that is manually pushed by an operator and is provided with, for example, four wheels 21.
- the two wheels 21 are respectively provided in front of the traveling direction of the measurement carriage 20, and the remaining two wheels 21 are respectively attached to the rear of the measurement carriage 20.
- the two wheels 21 provided in front of the measurement carriage 20 are each provided with a free caster so as to be rotatable.
- the two wheels 21 attached to the rear of the measurement carriage 20 have a fixed configuration.
- All four wheels 21 attached to the measurement carriage 20 may be provided with universal casters, or only the two wheels 21 attached to the rear of the measurement carriage 20 may be provided with universal casters.
- the measurement cart 20 is not limited to traveling by hand, and may be configured to allow electric traveling, for example.
- In that case, the measurement carriage 20 includes, for example, a motor and a drive circuit that drives the motor, and the wheels of the measurement carriage are driven by the power of the motor.
- The measurement carriage 20 may also be an automated vehicle that performs trackless conveyance, rather than a guided automatic traveling vehicle that relies on guidance equipment such as magnetic tape or a white-line guide tape.
- the measurement sensor 11 that is the first measurement sensor and the measurement sensor 12 that is the second measurement sensor include, for example, a laser range sensor.
- This laser range sensor is a sensor that scans a two-dimensional plane.
- The laser range sensor emits a laser pulse while rotating a reflector provided inside the sensor, and measures the distance to the point where the laser pulse hits.
- The measurement sensors 11 and 12 have, for example, a measurable distance of 30 m, a measurement range of ±135° in the horizontal direction, and an angular resolution of 0.25°, and can measure distances at a total of 1081 points.
- the data set of 1081 points is referred to as distance data.
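As an illustrative check of these example specifications (the constants and function names below are assumptions for illustration, not from the patent), the 1081 beam angles can be reconstructed as:

```python
# Sketch: beam angles for a scan with the example specifications
# (+/-135 deg range, 0.25 deg resolution). Names are illustrative.

RANGE_DEG = 135.0      # half-range: one scan covers -135 deg .. +135 deg
RESOLUTION_DEG = 0.25  # angular step between adjacent beams

def beam_angle(i: int) -> float:
    """Angle (degrees) of the i-th beam in one scan, i = 0 .. 1080."""
    return -RANGE_DEG + i * RESOLUTION_DEG

# 270 deg / 0.25 deg = 1080 steps, hence 1081 measured points per scan
num_beams = int(2 * RANGE_DEG / RESOLUTION_DEG) + 1
```

This reproduces the figure in the text: 270° of coverage at 0.25° steps yields exactly 1081 distance values per scan.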
- the measurement sensor 11 is mounted so that the irradiated laser pulse is parallel to the traveling road surface 25 on which the measurement carriage 20 travels, that is, the floor surface, and the laser pulse irradiates the side wall surface 26.
- the mounting position of the measurement sensor 11 can be changed in the direction perpendicular to the traveling road surface 25.
- That is, the height of the measurement sensor 11 from the road surface can be changed according to the objects in the surrounding environment.
- the measurement sensor 11 is used to estimate the position and orientation of the measurement carriage 20.
- the measurement sensor 12 has an angle at which the laser pulse irradiates the traveling road surface 25, that is, a depression angle.
- the angle is an angle larger than the depression angle 0 °.
- Here, a depression angle of 0° means the laser pulse irradiation angle is perpendicular to the traveling road surface 25, and a depression angle of 90° means it is horizontal with respect to the traveling road surface.
- The mounting angle of the measurement sensor 12 is preferably a depression angle of about 50° to about 70°, and particularly preferably about 60°.
- The measurement sensor 12 irradiates laser pulses in the vertical direction with respect to the traveling road surface 25, that is, in the direction from the traveling road surface 25 toward the ceiling surface 27, and is used to measure the three-dimensional shape of the objects around the measurement carriage 20. Further, because the measurement sensor 12 is mounted at an inclination (a depression or elevation angle), the wall surface can be measured densely, and the time correspondence between the distance data of the two sensors can be obtained. This inclination of the measurement sensor 12 is the essence of the system.
- Since the measurement sensor 11 is mounted horizontally as described above, when the measurement carriage 20 is moved a little and measurement is performed, the distance data contain many measurements of the same objects.
- By matching two sets of distance data measured by the measurement sensor 11, the relative position and orientation between them can be detected. In this way, the position and orientation of the measurement sensor 11, that is, the position and orientation of the measurement carriage 20, are calculated from the measurement data of the measurement sensor 11.
- Combining this position and orientation with the distance data of the measurement sensor 12 makes it possible to calculate 3D map data representing the shape of the measured objects.
- The measurement carriage 20 can simply be moved along a passage, and by moving it around a target object, the shape of the object can be measured without omission.
- FIG. 1 shows an example in which the measurement sensor 12 is attached so that the laser pulse irradiates the traveling road surface 25. However, as indicated by the dotted line in FIG. 1, the sensor may instead be mounted at an elevation angle, irradiating the ceiling surface 27 side opposite to the traveling road surface.
- the angle is larger than the elevation angle 0 °.
- An elevation angle of 0° means the irradiation angle of the laser pulse is horizontal with respect to the traveling road surface 25, and an elevation angle of 90° means it is perpendicular to the traveling road surface 25.
- the mounting angle of the measurement sensor 12 is preferably about 50 ° to 70 °, and particularly preferably about 60 °.
- At an angle smaller than a depression angle of about 50°, the measurable range becomes small, and at an angle larger than a depression angle of about 70°, the reflection intensity of the laser pulse becomes weak, so efficient measurement may not be possible.
- Likewise, in the elevation-angle case, the measurable range becomes small at an angle larger than about 70°, and the reflection intensity of the laser pulse becomes weak at an angle smaller than about 50°, so efficient measurement may not be possible.
- The measurement sensor 12 may also be attached to the measurement carriage 20 via a rotation mechanism that can rotate freely, so that the laser pulse can be directed at either the traveling road surface 25 or the ceiling surface 27.
- the three-dimensional map generation system 10 may be configured to include two measurement sensors, a measurement sensor that irradiates the traveling road surface 25 with a laser pulse and a measurement sensor that irradiates the ceiling surface 27.
- The measurement sensor 12 is not limited to the laser range sensor described above; any sensor that can acquire shape information of an object, such as a stereo camera or a depth camera, and can record the measurement time may be used.
- the input unit 14 is, for example, a keyboard or a mouse, and inputs data for operating the processing device 13.
- the display unit 15 is a liquid crystal display or the like, and displays measurement results and the like.
- the battery 16 supplies power to the measurement sensors 11 and 12, the processing device 13, the input unit 14, and the display unit 15.
- FIG. 2 is an explanatory diagram showing an example of the configuration of the processing device 13 included in the three-dimensional map generation system 10 of FIG.
- the processing device 13 serving as the map generation unit is composed of, for example, a personal computer and processes measurement data from the measurement sensors 11 and 12. As illustrated, the processing device 13 includes a processor 30, a memory 31, and a storage device 32.
- the processor 30 controls the processing device 13.
- the memory 31 is formed of a volatile memory such as a RAM (Random Access Memory), for example, and temporarily stores software and data stored in the storage device 32 or an arithmetic processing result by the processor 30.
- the storage device 32 includes, for example, a hard disk drive and stores various software and data.
- the processor 30 also includes a sensor control unit 33, a 2D map generation / position / posture estimation unit 34, and a 3D map generation unit 35.
- the sensor control unit 33 controls the operation of the measurement sensors 11 and 12.
- The sensor control unit 33 assigns index numbers for matching to the distance data acquired from each of the measurement sensors 11 and 12, and stores the numbered distance data as log data in the first log data storage unit 36 and the second log data storage unit 37 of the storage device 32, described later.
- the 2D map generation / position / orientation estimation unit 34 which is a position / orientation estimation unit, estimates the position / orientation of the measurement carriage 20 based on the distance data from the measurement sensor 11 and generates position / orientation estimation data.
- the position / orientation estimation data is stored in a map data / position / orientation data storage unit 38 to be described later in the storage device 32.
- the two-dimensional map generation / position / orientation estimation unit 34 generates a two-dimensional map of an object around the measurement carriage 20 based on the distance data from the measurement sensor 11. Similarly, the generated 2D map is stored in the map data / position / attitude data storage unit 38.
- the three-dimensional map generation unit 35 calculates the coordinates (x, y, z) of the object measured by the distance data of the measurement sensor 12 from the position and orientation of the measurement carriage 20 obtained by the two-dimensional map generation / position / posture estimation unit 34. Calculation is performed, and the calculation result is stored in a later-described three-dimensional map storage unit 39 of the storage device 32.
- The 3D map generation unit 35 also executes a calibration process. Although the rough correspondence between the measurement start times of the data from the measurement sensor 11 and the measurement sensor 12 is known, the start times are not synchronized, as described later, so there is a deviation. The calibration process strictly corrects this time lag.
- the storage device 32 includes a first log data storage unit 36, a second log data storage unit 37, a map data / position / attitude data storage unit 38, and a 3D map storage unit 39.
- the first log data storage unit 36 and the second log data storage unit 37 respectively store the distance data output from the sensor control unit 33 described above.
- the map data / position / posture data storage unit 38 stores the two-dimensional map generated by the two-dimensional map generation / position / posture estimation unit 34 and the estimated position / posture estimation data of the measurement carriage 20.
- the 3D map storage unit 39 stores the calculation result by the 3D map generation unit 35 as described above.
- FIG. 3 is an explanatory diagram showing an example of a software configuration stored in the storage device 32 included in the processing device 13 of FIG.
- The storage device 32 contains not only the first log data storage unit 36, the second log data storage unit 37, the map data / position / attitude data storage unit 38, and the 3D map storage unit 39, but also various software for generating a 3D map.
- the storage device 32 includes an OS (Operating System) 40, an initialization program 41, a laser distance sensor control program 42, a two-dimensional map generation / position / orientation estimation program 43, and a three-dimensional coordinate conversion program. 44 and a calibration program 45 are stored.
- the OS 40 is software that manages the entire processing device 13 such as input / output functions such as input from the input unit 14 and output to the display unit 15 and management of the storage device 32.
- the initialization program 41 is a program for starting the OS 40 when the processing device 13 is powered on.
- the laser distance sensor control program 42 is a program for controlling the operation of the measurement sensors 11 and 12 (FIG. 1).
- the 2D map generation / position / orientation estimation program 43 is used when the 2D map generation / position / orientation estimation unit 34 generates 2D map data based on the measurement result of the measurement sensor 11 and the position / orientation of the measurement carriage 20. It is a program used when estimating.
- the 3D coordinate conversion program 44 is a program used when the 3D map generation unit 35 generates a 3D map.
- the calibration program 45 is a program for calibrating a time lag when the measurement sensor 11 acquires data and a time lag when the measurement sensor 12 acquires data.
- FIG. 4 is an explanatory diagram showing an example of the configuration of the measurement sensor 11 provided in the 3D map generation system 10 of FIG.
- FIG. 4 shows a configuration example of the measurement sensor 11; the configuration of the measurement sensor 12 is the same.
- the measurement sensor 11 includes a light emitting unit 50, a reflector 51, a reflector 52, a light receiving unit 53, a rotating unit 54, an encoder 55, and an internal clock 56 that is a first clock unit.
- The light emitting unit 50, the reflector 51, the reflector 52, and the light receiving unit 53 are arranged from top to bottom in FIG. 4.
- a rotating unit 54 that is rotatably attached is provided between the light emitting unit 50 and the light receiving unit 53.
- a reflector 51 is attached to the upper part of the rotating part 54, and a reflector 52 is attached to the lower part of the rotating part 54.
- the light emitting unit 50 emits a laser pulse.
- the reflector 51 reflects the laser pulse irradiated by the light emitting unit 50 and irradiates the laser pulse in the horizontal direction.
- the reflector 52 reflects the reflected light of the laser pulse reflected from the object so that the light receiving unit 53 can receive it.
- These reflectors 51 and 52 are made of a reflector such as a mirror.
- the light receiving unit 53 receives reflected light.
- the laser pulse irradiated by the light emitting unit 50 is reflected by the reflector 51 and irradiated horizontally to the outside of the measurement sensor 11.
- the laser pulse is reflected by the irradiated object, and the reflected light is reflected by the reflector 52 and received by the light receiving unit 53.
- the distance to the object is calculated.
- By rotating the reflectors 51 and 52 attached to the rotating unit 54, the laser pulse is swept in the horizontal direction, for example over a range of 270°.
- the light emitting unit 50 irradiates the laser pulse in increments of about 0.25 °, for example.
- the time per scan in the light receiving unit 53 is, for example, about 25 ms.
- The rotational angular velocity of the rotating unit 54 is controlled using an encoder 55 that detects the moving direction, amount of movement, angle, and so on of the rotating unit 54; the rotation rate is, for example, about 20 Hz to about 40 Hz. Further, the internal clock 56 measures the time at which the laser pulse is emitted from the light emitting unit 50, that is, the measurement start time.
- This measurement start time is measured each time the measurement sensor 11 begins a 270° scan, at the first reference pulse, for example 0°.
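As a hedged sketch of how a time could then be assigned to each individual beam within a scan, assuming the 1081 beams are spread uniformly over one 25 ms scan (the patent does not specify this interpolation; all names and constants are illustrative):

```python
# Illustrative sketch (not from the patent): assigning a timestamp to each
# beam, assuming the beams are uniformly spaced in time over one scan.

SCAN_TIME_S = 0.025  # example time per scan (~25 ms)
NUM_BEAMS = 1081     # beams per 270 deg scan

def beam_time(scan_start_time: float, i: int) -> float:
    """Approximate emission time of beam i, given the scan's start time."""
    return scan_start_time + i * SCAN_TIME_S / (NUM_BEAMS - 1)
```

Under this assumption, the last beam of a scan starting at the measurement start time t is stamped t + 25 ms.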
- the measurement sensor 11 outputs distance data obtained by measuring the distance from the object around the moving body and the measurement start time.
- the internal clock 56 of the measurement sensor 11 and the internal clock 56 serving as the second clock unit provided in the measurement sensor 12 are not synchronized.
- Therefore, the measurement start times of the distance data of the measurement sensors 11 and 12 gradually shift relative to each other. This time shift is corrected by the calibration process described above.
- Because the internal clocks 56 of the measurement sensors 11 and 12 do not need to be synchronized, no synchronization hardware is required, so the configuration of the three-dimensional map generation system 10 can be simplified and its cost reduced.
<Measurement example of the 3D map generation system>
- FIG. 5 is an explanatory diagram showing an example of measurement by the three-dimensional map generation system 10 mounted on the measurement carriage 20.
- FIG. 5 shows a state in which the measurement sensor 11 mounted on the measurement carriage 20 measures an object in the vicinity.
- a laser pulse LB is irradiated two-dimensionally in the horizontal direction by the measurement sensor 11, and the distance to the object is measured.
- The data obtained by this measurement are shown on the right side of FIG. 5, for example; points measured by irradiation with the laser pulse LB are shown as dots, and these points represent the shape of the surrounding objects.
- distance data is continuously collected while the measurement carriage 20 is moved.
- FIG. 6 is an explanatory diagram showing an example of the position and orientation estimation processing of the measurement carriage 20 by the measurement sensor 11.
- This estimation process is executed by the 2D map generation / position / orientation estimation unit 34 based on the 2D map generation / position / orientation estimation program 43. The position and orientation of the measurement carriage 20 means a state expressed by three parameters: the position (x, y) and the direction θ on the two-dimensional map.
- The direction θ is defined, for example, as the angle of the carriage's heading with respect to the x-axis.
- Let the distance data measured by the measurement sensor 11 at time t_k be 36_k, and the distance data at time t_k+1 be 36_k+1.
- the distance data 36_k and 36_k + 1 are data obtained by continuous measurement while the measurement carriage 20 is moving, and the moving amount of the moving body is limited to some extent.
- Since the distance data 36_k and the distance data 36_k+1 are measured at close positions and postures, almost the same parts of the object 60 are measured. Because the shapes of the two distance data therefore have many matching parts, the relative positional relationship between them can be found through the shape of the object 60 by performing matching processing.
- The occupancy grid map is a method of dividing space into grid-like areas and expressing, for each area, whether an object measurable by the laser is present: for example, 1 when an object is present and 0 when it is not. If the distance data 36_k are expressed as an occupancy grid, the degree of overlap can be evaluated. An ICP (Iterative Closest Point) method or the like may also be used for this matching processing.
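A minimal sketch of the occupancy-grid idea described above, assuming a 5 cm cell size and a simple set-based overlap score (both are illustrative choices, not taken from the patent):

```python
# Occupancy-grid sketch: space is divided into grid cells, each marked as
# occupied (an object was measured there) or empty. Cell size is assumed.

import math

CELL_SIZE = 0.05  # 5 cm grid cells (illustrative value)

def to_grid(points):
    """Convert (x, y) points in metres into a set of occupied cells."""
    return {(int(math.floor(x / CELL_SIZE)), int(math.floor(y / CELL_SIZE)))
            for x, y in points}

def overlap(points_a, points_b):
    """Degree of overlap: fraction of scan B's cells also occupied in A."""
    grid_a = to_grid(points_a)
    grid_b = to_grid(points_b)
    return len(grid_a & grid_b) / len(grid_b) if grid_b else 0.0
```

A matching procedure would shift and rotate one scan over candidate relative poses and keep the pose that maximizes this overlap score.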
- For example, the position of 36_1 may be taken as the origin, with the 0° direction of the measurement sensor 11 as the x-axis direction.
- Alternatively, coordinates may be determined based on the distance data: for example, if the measured distance data include an object whose latitude and longitude are known, coordinates of the world geodetic system may be used with that point as the reference.
- Next, the position of the measurement carriage 20 at time t_k+1 is determined from the distance data 36_k+1.
- The position / orientation (x_k, y_k, θ_k, t_k) 61b of the measurement carriage 20 is calculated by repeating this map generation and position / orientation estimation cycle over all the log data. In this way, the position / orientation estimation data of the measurement carriage 20 are obtained.
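The cycle above, in which the relative pose found by matching consecutive scans is chained into absolute position / orientation estimates, might be sketched as follows (the function names and the convention that each relative motion is expressed in the current carriage frame are assumptions, not the patent's code):

```python
# Sketch: chaining relative motions (dx, dy, dtheta) from scan matching
# into an absolute (x, y, theta) trajectory for the carriage.

import math

def compose(pose, delta):
    """Apply a relative motion (dx, dy, dtheta), expressed in the current
    carriage frame, to an absolute pose (x, y, theta)."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def trajectory(relative_motions, origin=(0.0, 0.0, 0.0)):
    """Absolute pose for every scan, starting from the chosen origin."""
    poses = [origin]
    for delta in relative_motions:
        poses.append(compose(poses[-1], delta))
    return poses
```

For instance, moving 1 m forward, turning 90°, and moving 1 m forward again ends at (1, 1) facing the +y direction.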
- FIG. 7 is an explanatory diagram showing an example of generation of a three-dimensional map by the three-dimensional map generation system 10 of FIG.
- the measurement carriage 20 measures the surrounding object 60 while moving, and stores the distance data (36_k) of the measurement sensor 11 and the distance data (37_k) of the measurement sensor 12 in the storage device 32.
- The i-th beam of the k-th distance data, at angle θ_2ki, measures the point 72, and the distance d_2ki is the distance 71.
- The sequence of the position and orientation (x_k, y_k, θ_k, t_k) 61b of the measurement carriage 20 can be calculated from the log data of the measurement sensor 11.
- By applying the distance (d_2ki) 71 to the following (Expression 1), the three-dimensional coordinates (x, y, z) of the point 72 measured by the measurement sensor 12 can be obtained.
- The vector X′ in (Expression 1) expresses the i-th distance d_2ki of the k-th scan of the measurement sensor 12 in the xy-plane orthogonal coordinate system.
- Here a, b, and c represent the difference in position between the measurement sensor 11 and the measurement sensor 12, and α, β, and γ represent their relative posture.
- U(α, β, γ, a, b, c) is a homogeneous transformation matrix that rotates a vector around the origin and translates it; multiplying it by X′ converts the vector into the coordinate system with the measurement sensor 11 as the origin.
- U(0, 0, θ_k, x_k, y_k, 0) is a homogeneous transformation matrix representing the position and orientation of the measurement sensor 11, that is, of the measurement carriage 20; multiplying by it further converts the point measured by the sensor 12 into the map coordinate system.
- the coordinates of the actually measured object can be calculated by performing the above conversion.
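A sketch of this coordinate conversion follows. The exact form of (Expression 1) is not reproduced in this text, so the rotation order (about x, then y, then z) and all names below are assumptions; only the structure follows the description: sensor frame → carriage frame via U(α, β, γ, a, b, c), then carriage frame → map frame via U(0, 0, θ_k, x_k, y_k, 0).

```python
# Sketch of the two-stage coordinate conversion described in the text.
# The Z-Y-X rotation order and all names are assumptions for illustration.

import math

def transform(alpha, beta, gamma, a, b, c, p):
    """Rotate point p=(x,y,z) about x, y, z by alpha, beta, gamma (radians),
    then translate by (a, b, c): one homogeneous transform U applied to p."""
    x, y, z = p
    # rotation about x by alpha
    y, z = (y * math.cos(alpha) - z * math.sin(alpha),
            y * math.sin(alpha) + z * math.cos(alpha))
    # rotation about y by beta
    x, z = (x * math.cos(beta) + z * math.sin(beta),
            -x * math.sin(beta) + z * math.cos(beta))
    # rotation about z by gamma
    x, y = (x * math.cos(gamma) - y * math.sin(gamma),
            x * math.sin(gamma) + y * math.cos(gamma))
    return (x + a, y + b, z + c)

def map_point(d, theta, mount, pose):
    """Map one beam (distance d at angle theta in sensor 12's scan plane)
    into map coordinates: sensor frame -> carriage frame -> map frame."""
    p = (d * math.cos(theta), d * math.sin(theta), 0.0)  # the vector X'
    alpha, beta, gamma, a, b, c = mount  # sensor 12 mounting pose
    p = transform(alpha, beta, gamma, a, b, c, p)
    x_k, y_k, th_k = pose                # carriage pose (x_k, y_k, theta_k)
    return transform(0.0, 0.0, th_k, x_k, y_k, 0.0, p)
```

For example, with an identity mounting pose, a 2 m beam at 0° seen from a carriage at (1, 2) facing 90° lands at map coordinates (1, 4, 0).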
- FIG. 8 is a flowchart showing an example of processing from measurement of distance data to generation of a 3D map by the 3D map generation system 10 of FIG.
- The sensor control unit 33 of the processing device 13 acquires measurement data from the measurement sensor 11 and the measurement sensor 12 (steps S102, S103). At this time, the sensor control unit 33 assigns the same index k to data measured at approximately the same time, and stores them in the storage device 32.
- The processes of steps S102 and S103 are repeated while the measurement carriage 20 is moved (step S104), until measurement of the entire range is completed (step S105).
- In step S106, the log data stored in the first log data storage unit 36 of the storage device 32 are read sequentially, the map generation process and the position / orientation estimation process of the measurement carriage 20 are repeated, and position / orientation estimation data are generated for all the log data.
- the two-dimensional map generation / position / orientation estimation unit 34 stores the generated position / orientation estimation data in the map data / position / orientation data storage unit 38.
- a 3D map generation process is performed by the 3D map generation unit 35 (step S107).
- Based on the 3D coordinate conversion program 44, the 3D map generation unit 35 generates 3D coordinates (x, y, z) from the log data stored in the second log data storage unit 37 of the storage device 32 and the position / orientation estimation data stored in the map data / position / attitude data storage unit 38.
- FIG. 9 is an explanatory view showing an example of measurement of the side wall surface by the measurement sensor 12.
- the measurement carriage 20 performs measurement while moving and moves from a position 91 to a position 93.
- First, the measurement carriage 20 moves from position 91 to position 92, turns at position 92, and then moves to position 93.
- the scan of the measurement sensor 12 is irradiated to the side wall surface at intervals according to the moving speed.
- The interval 94 between the scan planes formed by the laser pulses is determined by moving speed × scan time.
- When the sensor is tilted, the scan plane interval 95 becomes moving speed × scan time × cos(90° − φ).
- Since the measurement sensor 12 is mounted with an inclination of φ° with respect to the traveling road surface, the interval is narrower than when measurement is performed at a depression angle of 0°, provided φ is not 0.
- In other words, the number of data points 96 measuring the side wall surface can be increased by attaching the measurement sensor 12 at an inclination angle of φ° instead of 0°.
- The angle is preferably a depression angle of about 50° to about 70°, particularly preferably about 60°; in the case of an elevation angle, it is likewise preferably about 50° to about 70°, particularly preferably about 60°.
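A worked check of the interval formulas above (the speed and scan-time values are illustrative, and the helper name is an assumption):

```python
# Spacing between successive scan planes on the side wall, per the text:
# untilted: v * T; tilted by phi degrees: v * T * cos(90 deg - phi).
# Example values (1 m/s, 25 ms per scan) are illustrative.

import math

def scan_spacing(speed_m_s, scan_time_s, phi_deg=None):
    """Spacing of scan planes on the side wall; phi_deg is the tilt."""
    spacing = speed_m_s * scan_time_s
    if phi_deg is not None:
        spacing *= math.cos(math.radians(90.0 - phi_deg))
    return spacing
```

At 1 m/s and 25 ms per scan, the untilted spacing is 25 mm; at the preferred tilt of about 60° the factor cos(30°) ≈ 0.87 makes the spacing narrower, so more scan lines fall on a given stretch of wall.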
- the three-dimensional distance data measured at different times intersect at the side wall surface and the floor surface.
- the scan planes of the measurement sensor 12 intersect, it is possible to check the consistency of measurement data, and it is possible to eliminate the following error factors.
- Without the inclination, an intersection on the traveling surface occurs only when the measurement carriage 20 turns, and no intersection occurs on the side wall surface; in that case, the above errors cannot be excluded.
- FIG. 10 is an explanatory diagram showing the influence of the difference in measurement start time between the measurement sensor 11 and the measurement sensor 12.
- FIG. 10 is a two-dimensional view of the measurement state described above. With reference to FIG. 10, the influence of the difference in measurement start time between the measurement sensor 11 and the measurement sensor 12 will be described.
- Measurement is performed by moving the measurement carriage 20 along the movement path 104, and distance data are acquired by the measurement sensor 11 and the measurement sensor 12. Further, based on the measurement results of the measurement sensor 11, the position and orientation estimation results 105, 106, 101, and 102 of the measurement carriage 20 are obtained.
- In the sensor control unit 33, the correspondence between the time data of the distance data of the measurement sensor 11 and that of the measurement sensor 12 is roughly maintained, but it is not accurate.
- The scan planes intersect on the side wall surface. If the temporal correspondence between the time data of the distance data of the measurement sensor 12 and the position/orientation estimation data is correct, the distance data 103a and 107a overlap correctly; if the correspondence is shifted, the measurement surfaces 103b and 107b of the three-dimensional point cloud data do not overlap on the side wall surface when the data are restored.
- The deviation Δt in measurement start time between the measurement sensor 11 and the measurement sensor 12 can therefore be detected in software as the value at which the entire set of distance data overlaps most cleanly. A general-purpose OS can thus be used for the processing device 13, and since no special mechanism for synchronizing the measurement start times is required, the system can be configured at low cost.
<Example of calibration processing>
- FIG. 11 is a flowchart showing an example of calibration processing by the processing device 13 of FIG.
- This calibration process is executed by the three-dimensional map generation unit 35 in the processing device 13 based on the calibration program 45 stored in the storage device 32, and calculates Δt, the amount of measurement time deviation.
- The position/orientation estimation result at time t is represented by x(t).
- T represents the range of the time error and varies depending on the measurement cycle of the measurement sensor. For example, when measuring at a cycle of 25 ms, T is set to about 50 ms, i.e., two cycles.
- First, the three-dimensional data are reconstructed with the trial time error Δt (step S202). If the difference between the measurement start time t1_1 of the position/orientation estimation result and the measurement start time t2_1 of the measurement sensor 12 is D, the position of the measurement sensor 12 at time t2_k is x(t2_k − D + Δt), y(t2_k − D + Δt), θ(t2_k − D + Δt). This value is substituted into (Equation 2) described later to calculate the three-dimensional data. Then the number of intersections of the distance data of the measurement sensor 12 is counted and set to P (step S203).
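The reconstruction step can be sketched as follows. This is a generic lift of one reading of the tilted sensor 12 into world coordinates using the planar pose (x, y, θ) estimated from sensor 11; it is not the patent's (Equation 2), and the frame conventions, names, and `sensor_height` parameter are assumptions:

```python
import math

def to_world_3d(pose, r, scan_angle, depression_deg, sensor_height=0.0):
    """Hypothetical sketch: convert one range reading of the tilted sensor 12
    into world coordinates using the 2D pose (x, y, yaw) estimated from
    sensor 11. Frames and parameter names are assumptions, not the patent's.
    """
    x, y, yaw = pose
    d = math.radians(depression_deg)
    a = scan_angle  # angle within the scan plane, in radians
    # point in the sensor frame: the scan plane is tilted by the depression angle
    sx = r * math.cos(a) * math.sin(d)
    sy = r * math.sin(a)
    sz = sensor_height - r * math.cos(a) * math.cos(d)
    # rotate into the world frame by yaw and translate by the pose
    wx = x + sx * math.cos(yaw) - sy * math.sin(yaw)
    wy = y + sx * math.sin(yaw) + sy * math.cos(yaw)
    return (wx, wy, sz)
```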
- In step S204, Pmax and P are compared; if the number of intersections has increased, Δt_opt and Pmax are updated (step S205). Otherwise, Δt_opt and Pmax are left unchanged, and the process proceeds to the next step.
- Pmax is the maximum value of the number of intersections.
- Δt_opt is the amount of time shift at which the number of intersections is maximal.
- In step S206, whether Δt is within the search range is checked; if it is, Δt is updated by the update width U (step S207).
- U is the update width, which is chosen according to the required calibration accuracy.
- The above processing is performed over the range −T ≤ Δt ≤ T, the optimum deviation amount Δt_opt is obtained (step S208), and the processing ends.
- The three-dimensional map generation unit 35 corrects the measurement start time error between the measurement sensor 11 and the measurement sensor 12 using the optimum deviation amount Δt_opt calculated in step S208, and generates the three-dimensional point cloud data. A highly accurate three-dimensional map without distortion can thereby be generated.
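The search loop of steps S202 to S208 can be sketched as follows, assuming a caller-supplied `count_intersections(dt)` helper that reconstructs the three-dimensional data for a trial shift dt and returns the number of intersections P (the helper and all names are illustrative, not from the patent):

```python
def calibrate_time_shift(count_intersections, T, U):
    """Exhaustive search for the optimum time shift dt_opt in [-T, T].

    count_intersections(dt): assumed helper that rebuilds the 3D data with
    trial shift dt and returns the number of intersections P.
    T: search half-range (e.g. two sensor cycles); U: update width.
    """
    dt = -T
    dt_opt, p_max = dt, float("-inf")
    while dt <= T:                      # S206: still within the search range?
        p = count_intersections(dt)     # S202-S203: rebuild data, count P
        if p > p_max:                   # S204: compare P with Pmax
            dt_opt, p_max = dt, p       # S205: update dt_opt and Pmax
        dt += U                         # S207: advance by the update width
    return dt_opt, p_max                # S208: optimum deviation amount
```

For a unimodal evaluation function this exhaustive sweep always finds the best grid point; the update width U trades calibration accuracy against run time.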
- Interpolation is performed by, for example, linear interpolation or spline interpolation. Interpolation by a state estimation method such as a Kalman filter may also be performed, taking into account the influence of errors included in the position estimation results.
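A minimal sketch of the linear option, assuming planar poses (x, y, θ) sampled at known times (the function and names are illustrative, not from the patent):

```python
import math

def interpolate_pose(t, t0, pose0, t1, pose1):
    """Linearly interpolate a planar pose (x, y, theta) between two
    position/orientation estimates taken at times t0 and t1."""
    w = (t - t0) / (t1 - t0)
    x = pose0[0] + w * (pose1[0] - pose0[0])
    y = pose0[1] + w * (pose1[1] - pose0[1])
    # interpolate the heading on the circle to avoid a jump at +/- pi
    dth = math.atan2(math.sin(pose1[2] - pose0[2]),
                     math.cos(pose1[2] - pose0[2]))
    th = pose0[2] + w * dth
    return (x, y, th)
```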
- In the above, the optimum Δt is found by the most basic method, an exhaustive (round-robin) search.
- However, the search may also be performed with a more efficient method such as binary search or the steepest gradient method.
- For ease of explanation, the calibration is evaluated above by the number of intersections, but any evaluation function that can evaluate the overlap may be used. Examples of other evaluation functions include the following.
- The space is divided into a grid, and the evaluation formula P of (Equation 2) is computed from the numbers of points p1, p2, ... in the divided cells.
- The more the point clouds overlap, the larger the value of P becomes.
- The cell size is determined according to the sensor noise.
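The grid-based evaluation can be sketched as follows. The sum of squared per-cell counts is an assumed stand-in for the patent's (Equation 2), which is not reproduced on this page: it grows when many points fall into the same cell, i.e. when the point clouds overlap.

```python
from collections import Counter

def overlap_score(points, cell):
    """Grid-based overlap evaluation (assumed form of (Equation 2)):
    divide space into cubic cells of size `cell`, count the points
    p1, p2, ... per cell, and return the sum of squared counts, which
    increases as more points share a cell."""
    counts = Counter(
        (int(x // cell), int(y // cell), int(z // cell)) for x, y, z in points
    )
    return sum(p * p for p in counts.values())
```

Choosing the cell size on the order of the sensor noise keeps genuinely coincident points in one cell without merging distinct surfaces.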
- the calibration method described above can also be used to calibrate the mounting position relationship between the measurement sensor 11 and the measurement sensor 12.
- If the parameters θk, φk, ψk, Δxk, Δyk, Δzk in (Equation 3) shown below are varied and the data overlap is evaluated using (Equation 2), errors due to the influence of undulation of the floor surface can also be corrected automatically.
- Accurate three-dimensional point cloud data can thus be generated by attaching the measurement sensor 12 at an angle.
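The mounting-parameter search can be sketched with a small rigid-body correction. This is an assumed stand-in for the patent's (Equation 3), which is not reproduced on this page, with roll/pitch/yaw and translation offsets as the six parameters to vary while scoring the data overlap:

```python
import math

def apply_mounting_correction(p, d_angles, d_trans):
    """Apply a trial mounting correction to a sensor-frame point p:
    rotate by small roll/pitch/yaw offsets (radians), then translate by
    (dx, dy, dz). Assumed stand-in for the patent's (Equation 3); varying
    these six parameters and scoring the overlap calibrates the mount."""
    roll, pitch, yaw = d_angles
    x, y, z = p
    # yaw about z
    x, y = (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))
    # pitch about y
    x, z = (x * math.cos(pitch) + z * math.sin(pitch),
            -x * math.sin(pitch) + z * math.cos(pitch))
    # roll about x
    y, z = (y * math.cos(roll) - z * math.sin(roll),
            y * math.sin(roll) + z * math.cos(roll))
    dx, dy, dz = d_trans
    return (x + dx, y + dy, z + dz)
```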
- A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
- The system includes a first measurement sensor that collects distance data in the horizontal direction with respect to the floor surface, two second measurement sensors that collect distance data in the vertical direction, and a map generation unit that generates a three-dimensional map from the distance data acquired while the first and second measurement sensors are moved.
- One second measurement sensor is attached so as to form a depression angle with respect to the floor surface, and the other second measurement sensor is attached so as to form an elevation angle with respect to the floor surface.
- The map generation unit identifies the position and angle of the first measurement sensor based on the horizontal distance data collected by the first measurement sensor, and generates a three-dimensional map by associating the vertical distance data collected by the second measurement sensors with the identified position and angle.
- The map generation unit includes a position/orientation estimation unit that estimates the position and angle of the first measurement sensor on a two-dimensional map based on the distance data collected by the first measurement sensor and outputs position/orientation estimation data, and a three-dimensional map generation unit that, based on the position/orientation estimation data, converts the distance data collected by the second measurement sensor into three-dimensional coordinates and generates a three-dimensional map composed of three-dimensional point cloud data.
- The first measurement sensor and the second measurement sensor further include first and second clock units, respectively, that record the measurement start time of the distance data.
- The three-dimensional map generation unit associates the position/orientation estimation data estimated by the position/orientation estimation unit with the distance data collected by the second measurement sensor, based on the measurement start times recorded by the first clock unit and the second clock unit.
- The three-dimensional map generation unit calculates the amount of time shift at which the number of intersection points of the three-dimensional point cloud data is largest, and corrects the measurement start times recorded by the first clock unit and the second clock unit based on the calculated deviation amount.
- the depression angle and the elevation angle of the second measurement sensor are 50 ° to 70 °, respectively.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Optics & Photonics (AREA)
- Electromagnetism (AREA)
- Mathematical Physics (AREA)
- Processing Or Creating Images (AREA)
- Instructional Devices (AREA)
- Length Measuring Devices With Unspecified Measuring Means (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention makes it possible to generate a three-dimensional map simply and in a short period of time. A three-dimensional map generation system (10) comprises a measurement sensor (11), a measurement sensor (12), and a processing device (13). The measurement sensor (11) collects distance data in the direction parallel to a floor surface. The measurement sensor (12) is attached so as to form a depression or elevation angle with the floor surface, which is the travel surface of a measurement carriage (20), and collects distance data in the direction orthogonal to the travel surface. The processing device (13) generates a three-dimensional map from the distance data obtained by moving the first and second measurement sensors. The processing device (13) identifies the position and angle of the measurement sensor (11) from the parallel-direction distance data collected by the measurement sensor (11), and generates a three-dimensional map by associating the orthogonal-direction distance data collected by the measurement sensor (12) with the identified position and angle.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016511505A JP6272460B2 (ja) | 2014-03-31 | 2015-03-13 | 三次元地図生成システム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014-072121 | 2014-03-31 | ||
| JP2014072121 | 2014-03-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015151770A1 true WO2015151770A1 (fr) | 2015-10-08 |
Family
ID=54240102
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/057583 Ceased WO2015151770A1 (fr) | 2014-03-31 | 2015-03-13 | Système de génération de carte tridimensionnelle |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6272460B2 (fr) |
| WO (1) | WO2015151770A1 (fr) |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017150977A (ja) * | 2016-02-25 | 2017-08-31 | 株式会社大林組 | 計測装置 |
| JP2017198517A (ja) * | 2016-04-27 | 2017-11-02 | 株式会社国際電気通信基礎技術研究所 | 3次元地図生成システム |
| JP2018151985A (ja) * | 2017-03-14 | 2018-09-27 | トヨタ自動車株式会社 | 自律移動体 |
| WO2019044500A1 (fr) * | 2017-09-04 | 2019-03-07 | 日本電産株式会社 | Système d'estimation de position et corps mobile comprenant ledit système |
| WO2019107536A1 (fr) * | 2017-11-30 | 2019-06-06 | 三菱電機株式会社 | Système, procédé et programme de génération de carte tridimensionnelle |
| JP2019148471A (ja) * | 2018-02-27 | 2019-09-05 | 国際航業株式会社 | 可搬型レーザー測量機台座、計測車両、及びレーザー計測方法 |
| JP2021019829A (ja) * | 2019-07-26 | 2021-02-18 | シャープ株式会社 | ドライヤ |
| JP2021148698A (ja) * | 2020-03-23 | 2021-09-27 | 株式会社日立製作所 | 自動検査装置 |
| US20210404843A1 (en) * | 2020-06-25 | 2021-12-30 | Canon Kabushiki Kaisha | Information processing apparatus, control method for information processing apparatus, and storage medium |
| JP2022096155A (ja) * | 2020-12-17 | 2022-06-29 | 株式会社日立産機システム | 移動体の位置検出装置及び位置検出方法 |
| JP2023030560A (ja) * | 2021-08-23 | 2023-03-08 | 清水建設株式会社 | 位置測定システム、及び位置測定方法 |
| JP2023061692A (ja) * | 2021-10-20 | 2023-05-02 | オムロン株式会社 | 地図作成車両 |
| JP7412651B1 (ja) * | 2023-03-01 | 2024-01-12 | 三菱電機株式会社 | 点群接合装置、点群接合方法、点群接合プログラムおよび計測車両 |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011112644A (ja) * | 2009-11-24 | 2011-06-09 | Ind Technol Res Inst | マップ作成方法および装置とそのマップを利用した定位方法 |
| WO2012176249A1 (fr) * | 2011-06-21 | 2012-12-27 | 国立大学法人奈良先端科学技術大学院大学 | Dispositif, procédé et programme d'estimation d'une autoposition, et objet mobile |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4486737B2 (ja) * | 2000-07-14 | 2010-06-23 | アジア航測株式会社 | モービルマッピング用空間情報生成装置 |
2015
- 2015-03-13 WO PCT/JP2015/057583 patent/WO2015151770A1/fr not_active Ceased
- 2015-03-13 JP JP2016511505A patent/JP6272460B2/ja active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011112644A (ja) * | 2009-11-24 | 2011-06-09 | Ind Technol Res Inst | マップ作成方法および装置とそのマップを利用した定位方法 |
| WO2012176249A1 (fr) * | 2011-06-21 | 2012-12-27 | 国立大学法人奈良先端科学技術大学院大学 | Dispositif, procédé et programme d'estimation d'une autoposition, et objet mobile |
Non-Patent Citations (2)
| Title |
|---|
| SHIN'YA IWASHINA ET AL.: "3-D SLAM in Dynamic Environments by a Mobile Robot Equipped with two Laser Range Finders", DAI 26 KAI ANNUAL CONFERENCE OF THE ROBOTICS SOCIETY OF JAPAN YOKOSHU, 9 September 2008 (2008-09-09) * |
| TAKUMI NAKAMOTO ET AL.: "3-D Map Generation in a Dynamic Environment by a Mobile Robot Equipped with Laser Range Finders", IEICE TECHNICAL REPORT, vol. 106, no. 144, 29 June 2006 (2006-06-29), pages 25 - 30 * |
Cited By (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017150977A (ja) * | 2016-02-25 | 2017-08-31 | 株式会社大林組 | 計測装置 |
| JP2017198517A (ja) * | 2016-04-27 | 2017-11-02 | 株式会社国際電気通信基礎技術研究所 | 3次元地図生成システム |
| JP2018151985A (ja) * | 2017-03-14 | 2018-09-27 | トヨタ自動車株式会社 | 自律移動体 |
| WO2019044500A1 (fr) * | 2017-09-04 | 2019-03-07 | 日本電産株式会社 | Système d'estimation de position et corps mobile comprenant ledit système |
| WO2019107536A1 (fr) * | 2017-11-30 | 2019-06-06 | 三菱電機株式会社 | Système, procédé et programme de génération de carte tridimensionnelle |
| JPWO2019107536A1 (ja) * | 2017-11-30 | 2020-11-19 | 三菱電機株式会社 | 三次元地図生成システム、三次元地図生成方法および三次元地図生成プログラム |
| JP7147176B2 (ja) | 2018-02-27 | 2022-10-05 | 国際航業株式会社 | 可搬型レーザー測量機台座、計測車両、及びレーザー計測方法 |
| JP2019148471A (ja) * | 2018-02-27 | 2019-09-05 | 国際航業株式会社 | 可搬型レーザー測量機台座、計測車両、及びレーザー計測方法 |
| JP7329382B2 (ja) | 2019-07-26 | 2023-08-18 | シャープ株式会社 | ドライヤ |
| JP2021019829A (ja) * | 2019-07-26 | 2021-02-18 | シャープ株式会社 | ドライヤ |
| JP2021148698A (ja) * | 2020-03-23 | 2021-09-27 | 株式会社日立製作所 | 自動検査装置 |
| JP7235691B2 (ja) | 2020-03-23 | 2023-03-08 | 株式会社日立製作所 | 自動検査装置 |
| US20210404843A1 (en) * | 2020-06-25 | 2021-12-30 | Canon Kabushiki Kaisha | Information processing apparatus, control method for information processing apparatus, and storage medium |
| JP2022096155A (ja) * | 2020-12-17 | 2022-06-29 | 株式会社日立産機システム | 移動体の位置検出装置及び位置検出方法 |
| JP7336430B2 (ja) | 2020-12-17 | 2023-08-31 | 株式会社日立産機システム | 移動体の位置検出装置及び位置検出方法 |
| JP2023030560A (ja) * | 2021-08-23 | 2023-03-08 | 清水建設株式会社 | 位置測定システム、及び位置測定方法 |
| JP7736483B2 (ja) | 2021-08-23 | 2025-09-09 | 清水建設株式会社 | 位置測定システム、及び位置測定方法 |
| JP2023061692A (ja) * | 2021-10-20 | 2023-05-02 | オムロン株式会社 | 地図作成車両 |
| JP7718224B2 (ja) | 2021-10-20 | 2025-08-05 | オムロン株式会社 | 地図作成車両 |
| JP7412651B1 (ja) * | 2023-03-01 | 2024-01-12 | 三菱電機株式会社 | 点群接合装置、点群接合方法、点群接合プログラムおよび計測車両 |
| WO2024180698A1 (fr) * | 2023-03-01 | 2024-09-06 | 三菱電機株式会社 | Dispositif d'union de nuage de points, procédé d'union de nuage de points, programme d'union de nuage de points et véhicule de mesure |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2015151770A1 (ja) | 2017-05-25 |
| JP6272460B2 (ja) | 2018-01-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6272460B2 (ja) | 三次元地図生成システム | |
| KR102142162B1 (ko) | 로봇 위치 선정 시스템 | |
| US10481265B2 (en) | Apparatus, systems and methods for point cloud generation and constantly tracking position | |
| US11080872B2 (en) | Device and method for indoor mobile mapping of an environment | |
| US9322646B2 (en) | Adaptive mechanism control and scanner positioning for improved three-dimensional laser scanning | |
| CN105572679B (zh) | 一种二维扫描型激光雷达的扫描数据修正方法及系统 | |
| US20160059417A1 (en) | Automatic in-situ registration and calibration of robotic arm/sensor/workspace system | |
| US20110010033A1 (en) | Autonomous mobile robot, self position estimation method, environmental map generation method, environmental map generation apparatus, and data structure for environmental map | |
| EP3435028B1 (fr) | Métrologie en direct d'un objet en cours de fabrication ou d'autres opérations | |
| JP5902275B1 (ja) | 自律移動装置 | |
| JP2019128175A (ja) | トンネル壁面検査装置及びトンネル壁面検査プログラム | |
| WO2008157434A1 (fr) | Système et procédé d'assemblage d'images sensiblement exemptes de distorsion | |
| Fossel et al. | 2D-SDF-SLAM: A signed distance function based SLAM frontend for laser scanners | |
| CN112068152A (zh) | 使用3d扫描仪同时进行2d定位和2d地图创建的方法和系统 | |
| JP6548452B2 (ja) | 地図生成装置および地図生成方法 | |
| CA3135442A1 (fr) | Localisation et mise en correspondance simultanees | |
| KR102203284B1 (ko) | 이동 로봇의 주행 평가 방법 | |
| JP6759625B2 (ja) | 計測装置 | |
| JP2009198382A (ja) | 環境地図取得装置 | |
| Chen et al. | Low cost and efficient 3D indoor mapping using multiple consumer RGB-D cameras | |
| CN103852031B (zh) | 一种电子设备及测量物体形状的方法 | |
| CN212903049U (zh) | 一种便携式三维激光扫描系统 | |
| Wülfing et al. | Towards real time robot 6d localization in a polygonal indoor map based on 3d tof camera data | |
| JP6601178B2 (ja) | 局所地図作成装置および局所地図作成方法 | |
| US20110004366A1 (en) | 3-dimensional perception system and method for mobile platform |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15773804 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2016511505 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase | ||
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15773804 Country of ref document: EP Kind code of ref document: A1 |