
WO2021111613A1 - Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program - Google Patents

Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program

Info

Publication number
WO2021111613A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
dimensional map
floor
dimensional
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/047794
Other languages
French (fr)
Japanese (ja)
Inventor
健 宮本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to JP2020519464A priority Critical patent/JPWO2021111613A1/en
Priority to PCT/JP2019/047794 priority patent/WO2021111613A1/en
Priority to TW109116874A priority patent/TW202123157A/en
Publication of WO2021111613A1 publication Critical patent/WO2021111613A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present invention relates to a three-dimensional map creation device, a three-dimensional map creation method, and a three-dimensional map creation program.
  • A positioning system using WiFi or Beacon is known as a system for detecting a self-position in a large-scale indoor environment such as a factory or a building. However, for an autonomous mobile robot, which is a robot mounted on an Automated Guided Vehicle (AGV), there is a demand to detect the posture of an object to be inspected in addition to its position. Further, for a device that detects the position and posture of an object, there is a demand to eliminate, as far as possible, additional equipment for detecting the position and posture, from the viewpoint of introduction cost.
  • As a method of satisfying these demands, the use of a three-dimensional map is known in applications such as autonomous mobile robots and augmented reality, in which content as virtual visual information is superimposed on an existing landscape.
  • As a method of creating a three-dimensional map, Simultaneous Localization and Mapping (SLAM) is known, which simultaneously estimates the self-position and creates a map based on sensor data acquired from sensors such as Laser Imaging Detection and Ranging (LiDAR) sensors and cameras.
  • FIGS. 1(A) and 1(B) respectively show a floor 201, which is the target area for creating a three-dimensional map by SLAM, and a scan result 202 showing the accumulation of position errors when the indoor environment is scanned while moving along the wall surface of the floor 201.
  • FIGS. 2(A) and 2(B) respectively show a tablet PC (Tablet Personal Computer) 210 that correctly superimposes content 212 on an image 211 of real objects (for example, devices) A1 to A3, and a tablet PC 210 that superimposes the content incorrectly.
  • The present invention has been made to solve the above problems, and an object thereof is to provide a three-dimensional map creation device, a three-dimensional map creation method, and a three-dimensional map creation program capable of generating a highly accurate three-dimensional map.
  • The three-dimensional map creation device includes a three-dimensional map generation unit that generates a first three-dimensional map for each area including an object, based on sensor data detected by a sensor moving on a floor, and a floor map registration unit that acquires a floor map of the floor and arranges the one or more generated first three-dimensional maps on the floor map, thereby generating a second three-dimensional map including the one or more first three-dimensional maps.
  • The three-dimensional map creation method includes a step of generating a first three-dimensional map for each area including an object, based on sensor data detected by a sensor moving on a floor, and a step of acquiring a floor map of the floor and arranging the one or more generated first three-dimensional maps on the floor map, thereby generating a second three-dimensional map including the one or more first three-dimensional maps.
  • FIG. 1(A) is a diagram showing one actual floor, which is a target area for creating a three-dimensional map by SLAM, and FIG. 1(B) is a diagram showing a scan result representing the accumulation of position errors when the indoor environment is scanned while moving along the wall surface of the floor.
  • FIG. 2(A) is a diagram showing a tablet PC that correctly superimposes content as virtual visual information on an image displaying a real object, and FIG. 2(B) is a diagram showing a tablet PC that superimposes the content incorrectly.
  • FIG. 3 is a diagram showing an example of the hardware configuration of the three-dimensional map creation device according to Embodiment 1 of the present invention.
  • FIG. 4 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 1.
  • FIG. 5(A) is a plan view showing an example of a floor map, and FIG. 5(B) is a perspective view showing an example of an object represented by a small-scale first three-dimensional map.
  • FIG. 6(A) is a perspective view showing the rotation of the first three-dimensional map around the x, y, and z axes, and FIG. 6(B) is a plan view showing the rotation of the first three-dimensional map around the normal of the ground.
  • FIG. 7 is a functional block diagram schematically showing the configuration of the floor map registration unit shown in FIG. 4.
  • FIG. 8 is a diagram showing an example of the floor map in the xyz orthogonal coordinate system.
  • FIG. 9 is a diagram showing the range division process executed by the range division unit shown in FIG. 4.
  • FIG. 10 is a functional block diagram schematically showing the configuration of the range division unit shown in FIG. 4.
  • FIG. 11 is a diagram showing a screen during a range designation operation using the range designation unit of the three-dimensional map creation device according to Embodiment 1.
  • FIG. 12(A) is a flowchart showing the three-dimensional map generation process, and FIG. 12(B) is a flowchart showing the position and posture estimation process.
  • FIG. 13 is a flowchart showing the floor map registration process executed by the floor map registration unit of the three-dimensional map creation device according to Embodiment 1.
  • FIG. 14 is a flowchart showing the range division process executed by the range division unit of the three-dimensional map creation device according to Embodiment 1.
  • FIG. 15 is a diagram for explaining the cases where the similarity is calculated and where it is not calculated.
  • FIG. 16 is a flowchart showing the similarity determination process executed by the range division unit of the three-dimensional map creation device according to Embodiment 1.
  • FIG. 17 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 2 of the present invention.
  • FIG. 18(A) is a flowchart showing the three-dimensional map generation process, and FIG. 18(B) is a flowchart showing the process for the content superimposed display.
  • FIG. 19 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 3 of the present invention.
  • FIG. 20(A) is a flowchart showing the three-dimensional map generation process, and FIG. 20(B) is a flowchart showing the position and posture estimation process.
  • the three-dimensional map creating device is, for example, an autonomous mobile robot having a computer.
  • the three-dimensional map creating device may be a tablet PC that superimposes a content that is visual information on an image that displays an existing object.
  • the three-dimensional map creating device may be a personal computer or a smartphone that can be moved by the user.
  • In the figures showing the floor map or objects on the floor (for example, equipment and devices), the coordinate axes of an xyz orthogonal coordinate system and the rotation directions around each coordinate axis are shown to facilitate understanding of the invention.
  • the floor is an example of a reference plane and is generally parallel to the ground.
  • the x-axis and z-axis are coordinate axes parallel to the plane including the floor.
  • the y-axis is a coordinate axis in a direction orthogonal to the plane including the floor.
  • The +Rz direction is the clockwise direction when facing the +z-axis direction, and the -Rz direction is the counterclockwise direction, which is the opposite of the +Rz direction.
  • The +Rx direction is the clockwise direction when facing the +x-axis direction, and the -Rx direction is the counterclockwise direction, which is the opposite of the +Rx direction.
  • The +Ry direction is the clockwise direction when facing the +y-axis direction, and the -Ry direction is the counterclockwise direction, which is the opposite of the +Ry direction.
  • FIG. 3 is a diagram showing an example of a hardware (HW) configuration of the three-dimensional map creating device 100 according to the first embodiment.
  • the three-dimensional map creation device 100 is a device capable of implementing the three-dimensional map creation method according to the first embodiment.
  • the three-dimensional map creating device 100 may be a tablet PC that superimposes a content that is visual information on an image that displays an existing object.
  • the actual object is, for example, a device or equipment.
  • the three-dimensional map creating device 100 has a computer 10.
  • the computer 10 has a memory 12 as a storage unit capable of storing a program as software, and a processor 11 as an information processing unit capable of executing a program stored in the memory 12.
  • the program includes a three-dimensional map creation program capable of causing the computer 10 to execute the three-dimensional map creation method according to the first embodiment.
  • the program can be recorded on a recording medium that can be read by the computer 10.
  • the recording medium is, for example, a magnetic disk, an optical disk, a semiconductor memory, or the like.
  • the three-dimensional map creation device 100 has a three-dimensional map DB 40.
  • the three-dimensional map DB 40 is a storage device in which a database (DB) used for managing the three-dimensional map is stored.
  • the three-dimensional map DB 40 may be provided in an external storage device or a server on the network that is communicably connected to the three-dimensional map creation device 100.
  • the three-dimensional map creating device 100 may have one or more of various sensors such as a distance sensor 21, a camera 22, a gyro sensor 23, an acceleration sensor 24, and a geomagnetic sensor 25.
  • the distance sensor 21 is a sensor that measures a distance using LiDAR, infrared rays, or the like.
  • the camera 22 is a sensor that acquires an image (for example, a color image).
  • the gyro sensor 23 is a sensor that acquires an angular velocity.
  • the acceleration sensor 24 is a sensor that acquires acceleration.
  • the geomagnetic sensor 25 is a sensor that acquires an orientation.
  • the various sensors may be part of the three-dimensional mapping apparatus 100. However, various sensors may be provided in an external device communicably connected to the three-dimensional map creating device 100. For example, various sensors 21 to 25 shown in FIG. 3 may be provided on the AGV, and the three-dimensional mapping device 100 may be configured by a computer 10 placed in a place other than the AGV.
  • the three-dimensional map creating device 100 has a display 30.
  • the display 30 is a display device for displaying an image.
  • When the three-dimensional map creation device 100 is a tablet PC carried by the user, the display 30 displays an image of a real object and augmented reality content, as shown in FIG. 2(A).
  • the three-dimensional map creating device 100 may be a device that does not include the display 30.
  • For example, when maintenance and inspection of an object are performed using augmented reality, a user holding a tablet PC, which is the three-dimensional map creation device 100, moves to the front of the object. The three-dimensional map creation device 100 estimates the position and posture of the object based on sensor data acquired while moving toward the object and at a position in front of the object. In the case of a tablet PC, the position of the object is regarded as the same as the self-position, which is the position of the user. If the three-dimensional map of the object and its surroundings (that is, the nearby area) is created accurately, content about the object can be displayed on the display 30 of the tablet PC at an appropriate position on or near the image of the object, even if the three-dimensional map of positions far from the object is missing or inaccurate.
  • When maintenance and inspection of an object are performed using an autonomous mobile robot, which is the three-dimensional map creation device 100, the autonomous mobile robot moves to the front of the object. The autonomous mobile robot estimates the position and posture of the object based on sensor data acquired while moving toward the object and at a position in front of the object. When the autonomous mobile robot performs maintenance and inspection of the object, if the three-dimensional map of the object and its vicinity is created accurately, the object can be manipulated by a robot hand or the like even if the three-dimensional map of positions far from the object is missing or inaccurate. In other words, the autonomous mobile robot requires a three-dimensional map with high position accuracy around the object, but low position accuracy of the three-dimensional map in the passages used for movement is not a problem.
  • In a small-scale environment, such as a single device or an area a few meters square, SLAM can be used to create a three-dimensional map with high accuracy. This is because, when creating a three-dimensional map of a small-scale environment using SLAM, the accumulation of errors is small and loop detection is easy. Loop detection is, for example, the process called loop closure performed in SLAM.
  • The three-dimensional map creation device 100 according to the first embodiment creates a large-scale second three-dimensional map that enables highly accurate estimation of position and posture in the vicinity of each object. Specifically, the three-dimensional map creation device 100 according to the first embodiment creates one second three-dimensional map (that is, a large-scale three-dimensional map) by arranging one or more first three-dimensional maps (that is, small-scale three-dimensional maps) on a floor map 300 on which a layout including the positions of objects, such as equipment to be inspected, is drawn.
  • When the three-dimensional map creation device 100 is a tablet PC and the user carrying it moves to the front of an object, registering each of the one or more first three-dimensional maps in one or more areas of the floor map allows the content 212 to be superimposed and displayed at an appropriate position in the image 211 displaying the objects A1 to A3, as shown in FIG. 2(A).
  • FIG. 4 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 100 according to the first embodiment.
  • The three-dimensional map creation device 100 includes a three-dimensional map generation unit 110 that generates a first three-dimensional map for each area including an object, based on sensor data detected by a sensor moving on the floor; a floor map registration unit 120 that acquires a floor map of the floor and arranges one or more first three-dimensional maps generated by the three-dimensional map generation unit 110 on the floor map, thereby generating a large-scale second three-dimensional map including the one or more first three-dimensional maps; and a range division unit 130 that divides the second three-dimensional map into a plurality of ranges.
  • the three-dimensional map creating device 100 may have a range designation unit 140 and a position / orientation estimation unit 150.
  • FIG. 5A is a plan view showing an example of the floor map 300
  • FIG. 5B is a perspective view showing an example of an object represented by a small-scale first three-dimensional map 400.
  • the three-dimensional map generation unit 110 generates one or more first three-dimensional maps by using, for example, SLAM or the like.
  • the first three-dimensional map 400 is a three-dimensional map of an object on the floor and a small area around it. Corresponding areas 301 to 305, which are locations for arranging objects, are drawn on the floor map 300.
  • the first 3D map 400 is fitted to the corresponding area 303 by performing any one or more of rotation, translation, and scale adjustment.
  • FIG. 6(A) is a perspective view showing the rotation of the first 3D map 400 around the x, y, and z axes, and FIG. 6(B) is a plan view showing the rotation of the first 3D map 400 around the normal of the ground, which is parallel to the floor map 300 (that is, the rotation around the y-axis in the ±Ry direction).
  • the floor map registration unit 120 acquires the floor map 300 on which the corresponding areas 301 to 305 are drawn, for example, from an external storage device.
  • the floor map registration unit 120 may have a storage unit that stores in advance the floor map 300 on which the corresponding areas 301 to 305 are drawn.
  • the floor map registration unit 120 may acquire the floor map 300 on which the corresponding areas 301 to 305 are drawn from an external storage device or a server on the network according to a user input operation from the operation input unit.
  • As shown in FIG. 6(B), the floor map registration unit 120 may perform a process that enables only the adjustment around the single rotation axis required for floor map registration (for example, the rotation around the y-axis in the ±Ry direction) and invalidates the adjustments around the unnecessary rotation axes (for example, the rotation around the x-axis in the ±Rx direction and the rotation around the z-axis in the ±Rz direction).
  • This allows the user to register the first three-dimensional map (for example, 400) in the designated corresponding area (for example, 303) on the floor map 300 without adjusting the three-dimensional rotation. For example, when the user specifies the rotation angle around the normal of the ground and the translation amount of the first three-dimensional map, the floor map registration unit 120 registers the one or more first three-dimensional maps in the floor map 300, as shown in the sketch below.
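A minimal sketch of this registration step, assuming y-up coordinates and that a small-scale map is stored as an N x 3 NumPy array; the function name and its arguments are illustrative, not taken from the patent:

```python
import numpy as np

def place_on_floor_map(points, theta_y, t_xz):
    """Rotate a small-scale 3D map (N x 3 points, y-up) around the
    ground normal (y-axis) by theta_y [rad], then translate it in
    the x-z plane to its corresponding area on the floor map."""
    c, s = np.cos(theta_y), np.sin(theta_y)
    # Rotation about the y-axis (the normal of the ground).
    R = np.array([[  c, 0.0,   s],
                  [0.0, 1.0, 0.0],
                  [ -s, 0.0,   c]])
    t = np.array([t_xz[0], 0.0, t_xz[1]])  # no vertical translation
    return points @ R.T + t

# Example: register a map rotated 90 degrees, moved to area (4.0, 2.5).
small_map = np.random.rand(1000, 3)
registered = place_on_floor_map(small_map, np.pi / 2, (4.0, 2.5))
```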
  • FIG. 7 is a functional block diagram schematically showing the configuration of the floor map registration unit 120 shown in FIG.
  • the floor map registration unit 120 includes a ground detection unit 121, an external parameter calculation unit 122, an external parameter input unit 123, and an external parameter action unit 124.
  • the ground detection unit 121 detects the ground based on a three-dimensional map.
  • As a method of detecting the ground, there is a method using Random Sample Consensus (RANSAC), which is one of the algorithms for robust estimation. The relationship between the plane coefficients (a b c d)^T and a position (x y z)^T on the plane is expressed by the following equation (1), where "·" denotes the inner product: (a b c d)^T · (x y z 1)^T = 0, that is, ax + by + cz + d = 0. Since the plane obtained using RANSAC is an infinite plane, the ground detection unit 121 calculates the extent of the plane using a convex hull or the like, and detects the plane with the largest extent as the ground.
  • However, the ground detection unit 121 may detect the ground by other methods. For example, the ground detection unit 121 may detect the ground (that is, a horizontal plane) from a plurality of planes based on a user input operation. Alternatively, the ground detection unit 121 may determine the direction of gravity from the acceleration measured by an inertial measurement unit (IMU) and determine the plane whose normal is close to the direction of gravity as the ground. Alternatively, the ground detection unit 121 may detect the ground using both the direction of gravity measured by the IMU and the size of the area of each plane.
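As a rough illustration of the RANSAC-based detection described above, the following self-contained NumPy sketch fits candidate planes and keeps the best one. It approximates a plane's extent by its inlier count rather than the convex hull mentioned in the text, and all names and parameter values are hypothetical:

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.02, rng=None):
    """Minimal RANSAC plane fit over an N x 3 point cloud.
    Returns (a, b, c, d) with unit normal, plus the inlier mask."""
    rng = rng or np.random.default_rng(0)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        n = n / norm
        d = -n @ p0
        dist = np.abs(points @ n + d)     # |ax + by + cz + d|
        inliers = dist < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (*n, d)
    return best_plane, best_inliers

# The ground would then be the detected plane with the largest extent
# (approximated here by inlier count; the text uses a convex hull).
```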
  • The external parameter calculation unit 122 calculates the external parameter T1 from the relationship between the ground detected by the ground detection unit 121 and the floor map 300. First, the external parameter calculation unit 122 obtains the rotation R1 from the normal n_g of the detected ground and the normal n_f of the floor map 300, which are represented by equations (2) and (3).
  • After obtaining the rotation R1 by equation (5), the external parameter calculation unit 122 calculates the translation using the vector x_p of one point on the plane.
  • The vector t1 is obtained by extracting only the height-direction component of the product of the rotation R1 and the vector x_p, and is expressed by equation (7).
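The bodies of equations (2), (3), (5), and (7) are not reproduced in this text. The sketch below shows one plausible computation of R1 and t1, assuming equation (5) is Rodrigues' rotation between the two normals and that the floor-map normal is the y-axis; the antiparallel edge case is omitted for brevity:

```python
import numpy as np

def rotation_between(ng, nf):
    """Rotation R1 mapping unit normal ng onto unit normal nf
    (Rodrigues' formula); one plausible form of equation (5)."""
    v = np.cross(ng, nf)
    c = float(ng @ nf)
    if np.isclose(c, 1.0):
        return np.eye(3)                  # normals already aligned
    K = np.array([[0.0, -v[2],  v[1]],    # skew-symmetric [v]x
                  [v[2],  0.0, -v[0]],
                  [-v[1], v[0],  0.0]])
    return np.eye(3) + K + K @ K * (1.0 / (1.0 + c))

ng = np.array([0.05, 0.99, 0.05]); ng /= np.linalg.norm(ng)
nf = np.array([0.0, 1.0, 0.0])            # floor-map normal (y-up)
R1 = rotation_between(ng, nf)

# t1: keep only the height (y) component of R1 @ xp, per the text.
xp = np.array([1.2, 0.8, 3.4])            # one point on the detected plane
t1 = np.array([0.0, (R1 @ xp)[1], 0.0])
```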
  • The external parameter input unit 123 accepts user input; some of the external parameters are entered this way. The external parameters input to the external parameter input unit 123 are, for example, the rotation R2 around the y-axis in the coordinate system of the floor map 300 shown in FIG. 8, and the translation amount t2 in the xz plane. From these, the external parameter input unit 123 obtains the external parameter T2 by equation (8). Assuming that the user input is the rotation angle θy, the rotation R2 is expressed by equation (9). t2 is a translation amount with reference to the lower-left point of the floor map 300 (that is, the origin).
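The bodies of equations (8) and (9) are likewise missing from this text; under standard homogeneous-transform conventions they would plausibly read:

```latex
% Plausible reconstruction of equations (8) and (9); the published
% bodies are not reproduced in this text.
T_2 =
\begin{pmatrix}
R_2 & t_2 \\
0^\top & 1
\end{pmatrix}
\tag{8}
\qquad
R_2 =
\begin{pmatrix}
\cos\theta_y & 0 & \sin\theta_y \\
0 & 1 & 0 \\
-\sin\theta_y & 0 & \cos\theta_y
\end{pmatrix}
\tag{9}
```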
  • The rotation angle θ around the y-axis may instead be acquired using the geomagnetic sensor 25; in that case, the value of the angle θ is input automatically from the geomagnetic sensor 25. Alternatively, the angle θ may be input by the user, with the detection value of the geomagnetic sensor 25 as the initial value.
  • The angle θ and the translation amount t2 may be input either by operating a graphical user interface (GUI) or by numerical input from a keyboard or the like. Further, the origin of the translation amount t2 may be a point different from the lower-left point of the floor map shown in FIG. 8.
  • The external parameter action unit 124 applies the external parameter calculated by the external parameter calculation unit 122 and the external parameter input from the external parameter input unit 123 to the three-dimensional map. The action performed on the point cloud managed by the three-dimensional map is shown in equation (10), p′ = T2 T1 p, where the vector p indicates one point of the point cloud before the action and the vector p′ indicates the point after the action.
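Equation (10) amounts to composing the two transforms and applying them to every map point; a minimal NumPy sketch, assuming 4 x 4 homogeneous matrices whose last row is (0 0 0 1):

```python
import numpy as np

def apply_external_params(points, T1, T2):
    """Apply equation (10), p' = T2 T1 p, to every map point.
    points: N x 3 array; T1, T2: 4 x 4 homogeneous transforms."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # N x 4
    return (homo @ (T2 @ T1).T)[:, :3]
```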
  • 〈Range division unit 130〉 When a plurality of objects having the same shape and the same pattern are lined up on the floor map, the estimation of the position and posture of the target object may fail. For example, in an environment where a plurality of objects B1 to B6 having the same shape and the same pattern are lined up, scanning the object B1 to estimate the position and posture may output the position and posture of any of the objects B2 to B6. To avoid such erroneous estimation, it is desirable to first specify the range in which the user is located when estimating the position and posture. By designating the range, the detailed position and posture can be estimated based on the three-dimensional map 400 of the object included in that range.
  • FIG. 9 is a diagram showing a range division process executed by the range division unit 130 shown in FIG.
  • the range dividing unit 130 divides one floor represented by the floor map 300 into a plurality of ranges (for example, ranges # 1 to # 4).
  • the range dividing unit 130 statically divides one floor of a building into a plurality of ranges # 1 to # 4, and manages a three-dimensional map in each of the plurality of ranges # 1 to # 4.
  • the range dividing unit 130 divides one floor map 300 into four ranges # 1 to # 4, for example, as shown in FIG.
  • When one floor map 300 is divided into a plurality of ranges and the three-dimensional maps are managed so that a plurality of objects having the same pattern and the same shape belong to different ranges, the possibility of erroneously estimating the target object as another object with the same shape and the same pattern is reduced. However, if such objects belong to the same range, the possibility of erroneous estimation cannot be sufficiently reduced.
  • FIG. 10 is a functional block diagram schematically showing the configuration of the range dividing portion 130 shown in FIG.
  • the range dividing unit 130 includes an image feature calculation unit 131 and a clustering unit 132.
  • The image feature calculation unit 131 calculates image features of the images taken by the camera when generating the three-dimensional maps, which are used to evaluate the similarity between the maps.
  • the image feature calculation unit 131 performs a process for range division using, for example, Bag of Words (BoW).
  • the image feature calculation unit 131 vectorizes an image taken by each camera of the first three-dimensional map of the object using BoW.
  • the image feature calculation unit 131 determines that the vectors are "similar” if they are close to each other.
  • The clustering unit 132 performs clustering based on the result of the image feature calculation unit 131 and divides the range. That is, the clustering unit 132 performs the range division process so that three-dimensional maps whose vectors are similar belong to different ranges.
  • FIG. 11 is a diagram showing a screen at the time of a range designation operation using the range designation unit 140 of the three-dimensional map creation device 100 according to the first embodiment.
  • the range designation unit 140 is determined by the range division unit 130 and designates one range from a plurality of ranges.
  • the range designation unit 140 specifies, for example, one range selected by a user operation from the user operation unit.
  • the user operation unit is a device connected to the range designation unit 140 or a part of the range designation unit 140.
  • the user operation unit is, for example, a touch panel of a tablet PC as shown in FIG. When the user selects one range from the plurality of ranges displayed on the touch panel by tap operation or the like, the range designation unit 140 specifies the selected range.
  • the position / orientation estimation unit 150 estimates the position and orientation of the object using the three-dimensional map included in the range designated by the range designation unit 140.
  • As a technique for estimating the position and the posture, there is a method of combining BoW and Perspective-n-Point (PnP).
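A hedged sketch of the PnP half of such a pipeline, using OpenCV's solvePnP; the 3D-2D correspondences and the intrinsics below are placeholder values standing in for the matches that BoW retrieval against the designated range's map would supply:

```python
import numpy as np
import cv2

# object_pts: 3D positions of map features matched (e.g., via BoW
# retrieval) to keypoints detected in the query image (image_pts).
object_pts = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                       [0.5, 0.4, 0.0], [0.0, 0.4, 0.1],
                       [0.2, 0.2, 0.3], [0.4, 0.1, 0.2]])
image_pts = np.array([[320.0, 240.0], [420.0, 238.0], [418.0, 160.0],
                      [322.0, 158.0], [360.0, 200.0], [400.0, 210.0]])
K = np.array([[800.0,   0.0, 320.0],   # assumed pinhole intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # camera pose: rotation + translation
```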
  • FIG. 12A is a flowchart showing a three-dimensional map generation process
  • FIG. 12B is a flowchart showing a position / orientation estimation process.
  • the three-dimensional map generation unit 110 generates a small-scale first three-dimensional map for each object (steps S11 and S12).
  • the floor map registration unit 120 registers the first three-dimensional map in the floor map 300 (step S13).
  • the processes of steps S11 to S13 are repeated for the number of objects, and as a result, a large-scale second three-dimensional map is generated.
  • the range dividing unit 130 divides the second three-dimensional map into a plurality of ranges (step S14).
  • the position and posture are estimated using the three-dimensional map.
  • the range designation unit 140 selects a designated range, which is an approximate range in which the user is present, based on the user operation (step S21).
  • the position / orientation estimation unit 150 estimates the position and orientation using the three-dimensional map within the designated range (step S22).
  • FIG. 13 is a flowchart showing a floor map registration process (step S13) executed by the floor map registration unit 120 of the three-dimensional map creation device 100.
  • the ground detection unit 121 detects the ground (steps S131 to S135).
  • the ground detection unit 121 detects the ground by detecting a plurality of planes (steps S131 to S133) and then finding the maximum plane that can be regarded as the ground (steps S134, S135).
  • Next, the external parameter calculation unit 122 calculates a movement amount using the relationship between the ground and the floor map 300, and the external parameter input unit 123 obtains a movement amount from the translation amount and the rotation amount around the ground normal input by user operation (steps S136 and S137).
  • the external parameter action unit 124 causes the external parameters, which are the two movement amounts, to act on the three-dimensional map (step S138).
  • FIG. 14 is a flowchart showing a range division process executed by the range division unit 130 of the three-dimensional map creation device 100.
  • the image feature calculation unit 131 calculates the image features using BoW or the like, and when the image features are similar to each other, adds this information to the rule.
  • The index of the first loop (step S141) over the three-dimensional maps is i (where i is an integer of 0 or more and N or less), and the index of the second loop (step S143) is j (where j is an integer of 0 or more and N or less). The image feature calculation unit 131 calculates the similarity between three-dimensional maps only when i ≠ j (steps S141 to S147).
  • FIG. 15 is a diagram for explaining the cases where the similarity is calculated and where it is not calculated: the similarity is calculated for the white cells in FIG. 15, and is not calculated for the shaded cells.
  • FIG. 16 is a flowchart showing a similarity determination process executed by the range dividing unit 130 of the three-dimensional map creating device 100.
  • To determine the similarity, the image feature calculation unit 131 of the range division unit 130 first calculates the distance between two image features. Assuming the image features are the vectors v1 and v2, the range division unit 130 uses, for example, the Euclidean distance expressed by the following equation (11): d(v1, v2) = ‖v1 − v2‖.
  • The image feature calculation unit 131 repeats the calculation of the distance between two image features (step S1403) and the process of keeping the minimum distance (steps S1404 and S1405) N(j) times, the number of image features of three-dimensional map j (step S1401), and further repeats the processing of steps S1401 to S1404 N(i) times, the number of image features of three-dimensional map i (step S1400).
  • If the minimum distance is less than ε (true in step S1405), the image feature calculation unit 131 determines that the features are similar and ends the process; if the minimum distance is ε or more (false in step S1405), it determines that they are dissimilar and ends the process.
  • ε is a preset value, for example, a parameter set by a developer or a user.
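The similarity determination of FIG. 16 can be sketched as a double loop over the two maps' feature vectors; the threshold value below is an assumed example, since ε is left to the developer or user:

```python
import numpy as np

def maps_are_similar(feats_i, feats_j, eps=0.4):
    """Compare every pair of BoW image-feature vectors of two 3D maps
    (Ni x D and Nj x D arrays) and declare the maps similar if the
    minimum Euclidean distance, equation (11), falls below eps."""
    min_dist = np.inf
    for v1 in feats_i:                      # outer loop (step S1400)
        for v2 in feats_j:                  # inner loop (step S1401)
            d = np.linalg.norm(v1 - v2)     # equation (11), step S1403
            min_dist = min(min_dist, d)
            if min_dist < eps:              # early exit, like S1405
                return True
    return False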
  • the clustering unit 132 executes clustering based on the rule (steps S148 to S151 in FIG. 14).
  • FIG. 14 shows an example in which clustering is performed using divisive clustering, which is a type of hierarchical clustering. The clustering unit 132 selects two three-dimensional maps C1 and C2 as division seeds, assigns the other three-dimensional maps near the center position of the three-dimensional map C1 to the C1 class, and assigns those near the center position of the three-dimensional map C2 to the C2 class. The clustering unit 132 repeats such a division of the three-dimensional maps to obtain the final range division.
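One plausible reading of this divisive step, assuming the seeds C1 and C2 are chosen as the two most distant map centers (the text does not say how they are picked):

```python
import numpy as np

def divide(centers, indices):
    """One divisive step: pick the two maps farthest apart as seeds
    C1/C2, then assign every map to the nearer seed by its center
    position (centers: N x 2 array of floor-map coordinates)."""
    pts = centers[indices]
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)   # seeds C1, C2
    to_c1 = (np.linalg.norm(pts - pts[i], axis=1)
             <= np.linalg.norm(pts - pts[j], axis=1))
    return indices[to_c1], indices[~to_c1]

centers = np.array([[1.0, 1.0], [1.5, 1.2], [8.0, 7.5], [8.2, 7.0]])
c1, c2 = divide(centers, np.arange(len(centers)))    # two ranges
```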
  • As described above, the three-dimensional map creation device 100 according to the first embodiment creates one or more small-scale first three-dimensional maps using SLAM and arranges them on a floor map to create one large-scale second three-dimensional map (for example, a three-dimensional map of an entire indoor environment). Since the position error accumulated by SLAM when creating each first three-dimensional map is small, the position error in the second three-dimensional map is suppressed. Therefore, the three-dimensional map creation device 100 according to the first embodiment can produce a large-scale three-dimensional map with a small position error.
  • FIG. 17 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 101 according to the second embodiment.
  • components that are the same as or correspond to the components shown in FIG. 4 are designated by the same reference numerals as those shown in FIG.
  • the three-dimensional map creation device 101 according to the second embodiment is different from the three-dimensional map creation device 100 according to the first embodiment in that it has a content registration unit 160 and a content superimposition display unit 170.
  • the HW configuration of the three-dimensional map creating device 101 is the same as that shown in FIG.
  • the content registration unit 160 registers the position of the content to be displayed on the image displaying each object by using the first three-dimensional map of each of the one or more objects.
  • the content superimposition display unit 170 superimposes the registered content on the image taken by the camera 22 and displays it on the display 30.
  • The content can include, for example, information represented by a figure such as a line or a plane associated with an object, information represented by a three-dimensional figure such as a cube or a sphere associated with an object, or a combination thereof.
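Superimposed display ultimately reduces to projecting a registered 3D content anchor into the current camera image; a minimal pinhole-projection sketch, with an assumed intrinsics matrix K and a pose (R, t) taken from the position and posture estimation:

```python
import numpy as np

def project_content(anchor_xyz, R, t, K):
    """Project the registered 3D anchor position of a content item
    into the current camera image, using the estimated camera pose
    (R, t) and assumed pinhole intrinsics K. The 2D result is where
    the content would be drawn on the display."""
    p_cam = R @ anchor_xyz + t
    if p_cam[2] <= 0:
        return None                 # behind the camera: nothing to draw
    u, v, w = K @ p_cam
    return np.array([u / w, v / w])
```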
  • FIG. 18(A) is a flowchart showing the three-dimensional map generation process performed by the three-dimensional map creation device 101, and FIG. 18(B) is a flowchart showing the process performed for the content superimposed display by the three-dimensional map creation device 101. In FIGS. 18(A) and 18(B), the same processing steps as those in FIGS. 12(A) and 12(B) are designated by the same reference numerals.
  • The three-dimensional map creation device 101 according to the second embodiment differs from the three-dimensional map creation device 100 according to the first embodiment in that it performs a content registration process (step S15) for registering the position of the content to be superimposed on the image displaying each object, and a content superimposed display process (step S23).
  • As described above, the three-dimensional map creation device 101 according to the second embodiment creates one or more small-scale first three-dimensional maps using SLAM and arranges them on a floor map to create one large-scale second three-dimensional map (for example, a three-dimensional map of an entire indoor environment). Since the position error accumulated by SLAM when creating each first three-dimensional map is small, the position error in the second three-dimensional map is suppressed. Therefore, the three-dimensional map creation device 101 according to the second embodiment can superimpose and display content at appropriate positions, as shown in FIG. 2(A).
  • In other respects, the second embodiment is the same as the first embodiment.
  • FIG. 19 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 102 according to the third embodiment.
  • components that are the same as or correspond to the components shown in FIG. 4 are designated by the same reference numerals as those shown in FIG.
  • the three-dimensional map creation device 102 according to the third embodiment is different from the three-dimensional map creation device 100 according to the first embodiment in that it has a relative movement distance estimation unit 180.
  • the HW configuration of the three-dimensional map creating device 102 is the same as that shown in FIG.
  • the position / orientation estimation unit 150 estimates the position and orientation of the object based on a small-scale first three-dimensional map in and around the object.
  • the three-dimensional map creation device 102 has a function of calculating the relative movement distance by providing the relative movement distance estimation unit 180.
  • FIG. 20(A) is a flowchart showing the three-dimensional map generation process performed by the three-dimensional map creation device 102, and FIG. 20(B) is a flowchart showing the position and posture estimation process performed by the three-dimensional map creation device 102. In FIGS. 20(A) and 20(B), the same processing steps as those in FIGS. 12(A) and 12(B) are designated by the same reference numerals.
  • The three-dimensional map creation device 102 according to the third embodiment differs from the three-dimensional map creation device 100 according to the first embodiment in that it performs a process for obtaining a relative movement distance (steps S24 and S25).
  • the first method is a method using SLAM.
  • This is a method of obtaining the movement amount by using the image taken by the camera, the sensor data detected by the distance sensor 21, or both of them.
  • This is, for example, a method of obtaining the movement amount by sequentially integrating the movement amounts between two frames arranged in the time direction.
  • The second method combines dead reckoning, which uses any one or more of the gyro sensor 23, the acceleration sensor 24, and the geomagnetic sensor 25, with the image taken by the camera and the sensor data detected by the distance sensor 21. Dead reckoning is a method of obtaining the movement amount using the gyro sensor 23, the acceleration sensor 24, the geomagnetic sensor 25, or the like: the moving speed and the moving direction are obtained by integrating the acceleration and the angular velocity, and the moving distance is obtained by integrating the speed.
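A toy one-dimensional sketch of the double integration described above; real dead reckoning also fuses heading from the gyro and geomagnetic sensors and must correct for gravity and drift:

```python
import numpy as np

def dead_reckoning_distance(accel, dt):
    """Integrate acceleration samples [m/s^2] into speed, then speed
    into moving distance, over a fixed sampling interval dt [s]."""
    speed = np.cumsum(accel) * dt              # first integration
    return float(np.sum(np.abs(speed)) * dt)   # second integration

dist = dead_reckoning_distance(np.array([0.2, 0.3, 0.1, -0.1]), dt=0.01)
```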
  • the third method is a method using pedestrian dead reckoning.
  • the third method is a method of obtaining the stride length and the number of steps from the sensor data of the gyro sensor 23 and the acceleration sensor 24, and obtaining the moving distance from these.
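In its simplest form, pedestrian dead reckoning is just stride length times step count; the stride value below is an assumed example, and in practice the step count would come from peaks in the acceleration signal while the stride is estimated from the gyro and accelerometer data:

```python
def pdr_distance(step_count, stride_m=0.7):
    """Pedestrian dead reckoning: distance = stride x number of steps."""
    return stride_m * step_count

print(pdr_distance(42))  # e.g. 29.4 m for an assumed 0.7 m stride
```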
  • As described above, the three-dimensional map creation device 102 according to the third embodiment creates one or more small-scale first three-dimensional maps using SLAM and arranges them on a floor map to create one large-scale second three-dimensional map (for example, a three-dimensional map of an entire indoor environment). Since the position error accumulated by SLAM when creating each first three-dimensional map is small, the position error in the second three-dimensional map is reduced. Therefore, the three-dimensional map creation device 102 according to the third embodiment can superimpose and display content at appropriate positions, as shown in FIG. 2(A). Further, the position and the posture can be estimated even at a place away from the first three-dimensional maps.
  • In other respects, the third embodiment is the same as the first embodiment. The relative movement distance estimation unit 180 can also be provided in the configuration of the second embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Geometry (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Processing Or Creating Images (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Instructional Devices (AREA)

Abstract

A three-dimensional map creation device (100) includes: a three-dimensional map generation unit (110) that, on the basis of sensor data detected by a sensor that moves on a floor, generates a first three-dimensional map for each region that includes an object of interest; and a map registration unit (120) that acquires a floor map (300) of the floor, and arranges one or more first three-dimensional maps generated by the three-dimensional map generation unit (110) on the floor map (300) to generate a second three-dimensional map that includes the one or more first three-dimensional maps.

Description

Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program

The present invention relates to a three-dimensional map creation device, a three-dimensional map creation method, and a three-dimensional map creation program.

A positioning system using WiFi or Beacon is known as a system for detecting a self-position in a large-scale indoor environment such as a factory or a building. However, for an autonomous mobile robot, which is a robot mounted on an Automated Guided Vehicle (AGV), there is a demand to detect the posture of an object to be inspected in addition to its position. Further, for a device that detects the position and posture of an object, there is a demand to eliminate, as far as possible, additional equipment for detecting the position and posture, from the viewpoint of introduction cost. As a method of satisfying these demands, the use of a three-dimensional map is known in applications such as autonomous mobile robots and augmented reality, in which content as virtual visual information is superimposed on an existing landscape.

As a method of creating a three-dimensional map, Simultaneous Localization and Mapping (SLAM) is known, which simultaneously estimates the self-position and creates a map based on sensor data acquired from sensors such as Laser Imaging Detection and Ranging (LiDAR) sensors and cameras.

Japanese Unexamined Patent Application Publication No. 2014-229020

However, since SLAM accumulates position errors according to the distance traveled, there is a problem that a highly accurate three-dimensional map cannot be created when SLAM is used in a large-scale indoor environment.

Here, FIGS. 1(A) and 1(B) respectively show a floor 201, which is the target area for creating a three-dimensional map by SLAM, and a scan result 202 showing the accumulation of position errors when the indoor environment is scanned while moving along the wall surface of the floor 201. Further, FIGS. 2(A) and 2(B) respectively show a tablet PC (Tablet Personal Computer) 210 that correctly superimposes content 212 on an image 211 of real objects (for example, devices) A1 to A3, and a tablet PC 210 that superimposes the content incorrectly.

The present invention has been made to solve the above problems, and an object thereof is to provide a three-dimensional map creation device, a three-dimensional map creation method, and a three-dimensional map creation program capable of generating a highly accurate three-dimensional map.

A three-dimensional map creation device according to one aspect of the present invention includes: a three-dimensional map generation unit that generates a first three-dimensional map for each area including an object, based on sensor data detected by a sensor moving on a floor; and a floor map registration unit that acquires a floor map of the floor and arranges one or more first three-dimensional maps generated by the three-dimensional map generation unit on the floor map, thereby generating a second three-dimensional map including the one or more first three-dimensional maps.

A three-dimensional map creation method according to another aspect of the present invention includes: a step of generating a first three-dimensional map for each area including an object, based on sensor data detected by a sensor moving on a floor; and a step of acquiring a floor map of the floor and arranging the one or more generated first three-dimensional maps on the floor map, thereby generating a second three-dimensional map including the one or more first three-dimensional maps.

According to the present invention, a highly accurate three-dimensional map can be generated.

FIG. 1(A) is a diagram showing one actual floor, which is a target area for creating a three-dimensional map by SLAM, and FIG. 1(B) is a diagram showing a scan result representing the accumulation of position errors when the indoor environment is scanned while moving along the wall surface of the floor.
FIG. 2(A) is a diagram showing a tablet PC that correctly superimposes content as virtual visual information on an image displaying a real object, and FIG. 2(B) is a diagram showing a tablet PC that superimposes the content incorrectly.
FIG. 3 is a diagram showing an example of the hardware configuration of the three-dimensional map creation device according to Embodiment 1 of the present invention.
FIG. 4 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 1.
FIG. 5(A) is a plan view showing an example of a floor map, and FIG. 5(B) is a perspective view showing an example of an object represented by a small-scale first three-dimensional map.
FIG. 6(A) is a perspective view showing the rotation of the first three-dimensional map around the x, y, and z axes, and FIG. 6(B) is a plan view showing the rotation of the first three-dimensional map around the normal of the ground.
FIG. 7 is a functional block diagram schematically showing the configuration of the floor map registration unit shown in FIG. 4.
FIG. 8 is a diagram showing an example of the floor map in the xyz orthogonal coordinate system.
FIG. 9 is a diagram showing the range division process executed by the range division unit shown in FIG. 4.
FIG. 10 is a functional block diagram schematically showing the configuration of the range division unit shown in FIG. 4.
FIG. 11 is a diagram showing a screen during a range designation operation using the range designation unit of the three-dimensional map creation device according to Embodiment 1.
FIG. 12(A) is a flowchart showing the three-dimensional map generation process, and FIG. 12(B) is a flowchart showing the position and posture estimation process.
FIG. 13 is a flowchart showing the floor map registration process executed by the floor map registration unit of the three-dimensional map creation device according to Embodiment 1.
FIG. 14 is a flowchart showing the range division process executed by the range division unit of the three-dimensional map creation device according to Embodiment 1.
FIG. 15 is a diagram for explaining the cases where the similarity is calculated and where it is not calculated.
FIG. 16 is a flowchart showing the similarity determination process executed by the range division unit of the three-dimensional map creation device according to Embodiment 1.
FIG. 17 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 2 of the present invention.
FIG. 18(A) is a flowchart showing the three-dimensional map generation process, and FIG. 18(B) is a flowchart showing the process for the content superimposed display.
FIG. 19 is a functional block diagram schematically showing the configuration of the three-dimensional map creation device according to Embodiment 3 of the present invention.
FIG. 20(A) is a flowchart showing the three-dimensional map generation process, and FIG. 20(B) is a flowchart showing the position and posture estimation process.

Hereinafter, a three-dimensional map creation device, a three-dimensional map creation method, and a three-dimensional map creation program according to embodiments of the present invention will be described with reference to the drawings. The following embodiments are merely examples, and various modifications are possible within the scope of the present invention.

The three-dimensional map creation device according to the embodiments is, for example, an autonomous mobile robot having a computer. The three-dimensional map creation device according to the embodiments may be a tablet PC that superimposes content, which is visual information, on an image displaying a real object. Alternatively, the three-dimensional map creation device according to the embodiments may be a personal computer or a smartphone that the user can carry around.

In the present application, to facilitate understanding of the invention, the figures showing the floor map or objects on the floor (for example, equipment and devices) show the coordinate axes of an xyz orthogonal coordinate system and the rotation directions around each coordinate axis. The floor is an example of a reference plane and is generally parallel to the ground. The x-axis and z-axis are coordinate axes parallel to the plane including the floor. The y-axis is a coordinate axis in the direction orthogonal to the plane including the floor. The +Rz direction is the clockwise direction when facing the +z-axis direction, and the -Rz direction is the counterclockwise direction, which is the opposite of the +Rz direction. The +Rx direction is the clockwise direction when facing the +x-axis direction, and the -Rx direction is the counterclockwise direction, which is the opposite of the +Rx direction. The +Ry direction is the clockwise direction when facing the +y-axis direction, and the -Ry direction is the counterclockwise direction, which is the opposite of the +Ry direction.

《1》 Embodiment 1
《1-1》 Configuration
〈Three-dimensional map creation device 100〉
FIG. 3 is a diagram showing an example of the hardware (HW) configuration of the three-dimensional map creation device 100 according to Embodiment 1. The three-dimensional map creation device 100 is a device capable of implementing the three-dimensional map creation method according to Embodiment 1. The three-dimensional map creation device 100 may be a tablet PC that superimposes content, which is visual information, on an image displaying a real object. The real object is, for example, a device or a piece of equipment.

As shown in FIG. 3, the three-dimensional map creation device 100 has a computer 10. The computer 10 has a memory 12 as a storage unit that can store programs as software, and a processor 11 as an information processing unit that can execute the programs stored in the memory 12. The programs include a three-dimensional map creation program that can cause the computer 10 to execute the three-dimensional map creation method according to Embodiment 1. The programs can be recorded on a recording medium readable by the computer 10. The recording medium is, for example, a magnetic disk, an optical disk, or a semiconductor memory.

The three-dimensional map creation device 100 has a three-dimensional map DB 40. The three-dimensional map DB 40 is a storage device storing a database (DB) used for managing three-dimensional maps. However, the three-dimensional map DB 40 may instead be provided in an external storage device or a server on a network communicably connected to the three-dimensional map creation device 100.

The three-dimensional map creation device 100 may have one or more of various sensors such as a distance sensor 21, a camera 22, a gyro sensor 23, an acceleration sensor 24, and a geomagnetic sensor 25. The distance sensor 21 is a sensor that measures distance using LiDAR, infrared rays, or the like. The camera 22 is a sensor that acquires images (for example, color images). The gyro sensor 23 is a sensor that acquires angular velocity. The acceleration sensor 24 is a sensor that acquires acceleration. The geomagnetic sensor 25 is a sensor that acquires orientation. The various sensors may be part of the three-dimensional map creation device 100. However, the various sensors may also be provided in an external device communicably connected to the three-dimensional map creation device 100. For example, the various sensors 21 to 25 shown in FIG. 3 may be provided on an AGV, and the three-dimensional map creation device 100 may be configured by a computer 10 placed somewhere other than on the AGV.

The three-dimensional map creation device 100 also has a display 30. The display 30 is a display device for displaying images. When the three-dimensional map creation device 100 is a tablet PC carried by the user, the display 30 displays an image of a real object and augmented reality content, as shown in FIG. 2(A). However, the three-dimensional map creation device 100 may be a device that does not include the display 30.

 例えば、拡張現実を用いて対象物の保守点検を行う場合、3次元地図作成装置100であるタブレットPCを持ったユーザが対象物の正面まで移動する。3次元地図作成装置100は、対象物までの移動中において及び対象物の正面位置において取得されたセンサデータに基づいて、対象物の位置及び姿勢の推定を行う。なお、タブレットPCの場合には、対象物の位置は、ユーザの位置である自己位置と同じであるものとみなす。拡張現実を用いて対象物の保守点検を行う場合、対象物及びその周辺(すなわち、近くの領域)の3次元地図が正確に作成されていれば、対象物から遠い位置の3次元地図が作成されていなくても或いは不正確であっても、対象物に関するコンテンツを、タブレットPCのディスプレイ30において、対象物の画像上又は対象物の画像の近くの適切な位置に表示することができる。 For example, when performing maintenance and inspection of an object using augmented reality, a user holding a tablet PC, which is a three-dimensional map creation device 100, moves to the front of the object. The three-dimensional map creating device 100 estimates the position and posture of the object based on the sensor data acquired while moving to the object and at the front position of the object. In the case of a tablet PC, the position of the object is considered to be the same as the self-position, which is the position of the user. When performing maintenance and inspection of an object using augmented reality, if a 3D map of the object and its surroundings (that is, a nearby area) is accurately created, a 3D map of a position far from the object is created. Content about the object, whether not done or inaccurate, can be displayed on the display 30 of the tablet PC at an appropriate position on or near the image of the object.

 また、3次元地図作成装置100である自律移動ロボットを用いて対象物の保守点検を行う場合、自律移動ロボットが対象物の正面まで移動する。自律移動ロボットは、対象物までの移動中における及び対象物の正面位置におけるセンサデータに基づいて、対象物の位置及び姿勢の推定を行う。自律移動ロボットが対象物の保守点検を行う場合、対象物及びその近くの3次元地図が正確に作成されていれば、対象物から遠い位置の3次元地図が作成されていなくても或いは不正確であっても、ロボットハンドなどによって対象物を操作することができる。つまり、自律移動ロボットは、対象物の周辺では、高い位置精度を持つ3次元地図を必要とするが、移動に使用される通路などでは、3次元地図の位置精度は低くても問題ない。 Further, when the maintenance and inspection of the object is performed using the autonomous mobile robot which is the three-dimensional map creation device 100, the autonomous mobile robot moves to the front of the object. The autonomous mobile robot estimates the position and posture of the object based on the sensor data while moving to the object and at the front position of the object. When an autonomous mobile robot performs maintenance and inspection of an object, if the 3D map of the object and its vicinity is created accurately, even if the 3D map of the position far from the object is not created or it is inaccurate. Even so, the object can be operated by a robot hand or the like. That is, the autonomous mobile robot requires a three-dimensional map having high position accuracy around the object, but there is no problem even if the position accuracy of the three-dimensional map is low in the passage used for movement.

 機器単体又は数メートル四方の領域などのような小規模な環境であれば、SLAMを用いることによって、高精度に3次元地図を作成することができる。これは、SLAMを用いて小規模な環境の3次元地図を作成する場合、誤差の蓄積が少なく、さらに、ループの検出が容易だからである。ループの検出は、例えば、SLAMで行われるループ閉じ込み(Loop Closure)と呼ばれる処理である。 In a small environment such as a single device or an area of several meters square, SLAM can be used to create a 3D map with high accuracy. This is because when creating a three-dimensional map of a small-scale environment using SLAM, the accumulation of errors is small, and loop detection is easy. Loop detection is, for example, a process called loop closure performed in SLAM.

 実施の形態1に係る3次元地図作成装置100は、対象物及びその周辺のみで高精度に位置及び姿勢の推定をすることを可能にする大規模な第2の3次元地図を作成する。具体的には、実施の形態1に係る3次元地図作成装置100は、点検対象機器などの対象物の位置を含むレイアウトが描かれたフロアマップ300上に1つ以上の第1の3次元地図(すなわち、小規模な3次元地図)を配置することによって、1つの第2の3次元地図(すなわち、大規模な3次元地図)を作成する。 The three-dimensional map creating device 100 according to the first embodiment creates a large-scale second three-dimensional map that enables highly accurate estimation of position and posture only in the object and its surroundings. Specifically, the three-dimensional map creating device 100 according to the first embodiment has one or more first three-dimensional maps on a floor map 300 on which a layout including the positions of objects such as equipment to be inspected is drawn. By arranging (ie, a small 3D map), one second 3D map (ie, a large 3D map) is created.

 3次元地図作成装置100がタブレットPCである場合、タブレットPCを携帯したユーザが対象物の正面まで移動すると、3次元地図作成装置100は、フロアマップの1つ以上の領域に1つ以上の第1の3次元地図をそれぞれ登録することによって、図2(A)に示されるように、対象物A1~A3が表示された画像211における適切な位置にコンテンツ212を重畳表示する。 When the 3D map creation device 100 is a tablet PC, when the user carrying the tablet PC moves to the front of the object, the 3D map creation device 100 has one or more thirds in one or more areas of the floor map. By registering each of the three-dimensional maps of No. 1, as shown in FIG. 2A, the content 212 is superimposed and displayed at an appropriate position in the image 211 in which the objects A1 to A3 are displayed.

 図4は、実施の形態1に係る3次元地図作成装置100の構成を概略的に示す機能ブロック図である。図4に示されるように、3次元地図作成装置100は、フロア上を移動するセンサによって検出されたセンサデータに基づいて、対象物を含む領域毎に第1の3次元地図を生成する3次元地図生成部110と、フロアのフロアマップを取得し、3次元地図生成部110によって生成された1つ以上の第1の3次元地図をフロアマップ上に配置することによって、1つ以上の第1の3次元地図を含む大規模な第2の3次元地図を生成するフロアマップ登録部120と、第2の3次元地図を複数の範囲に分割する範囲分割部130とを有する。また、3次元地図作成装置100は、範囲指定部140と、位置姿勢推定部150とを有してもよい。 FIG. 4 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 100 according to the first embodiment. As shown in FIG. 4, the three-dimensional map creating device 100 generates a first three-dimensional map for each area including an object based on the sensor data detected by the sensor moving on the floor. By acquiring the map generation unit 110 and the floor map of the floor and arranging one or more first three-dimensional maps generated by the three-dimensional map generation unit 110 on the floor map, one or more first ones. It has a floor map registration unit 120 that generates a large-scale second three-dimensional map including the three-dimensional map of the above, and a range division unit 130 that divides the second three-dimensional map into a plurality of ranges. Further, the three-dimensional map creating device 100 may have a range designation unit 140 and a position / orientation estimation unit 150.

〈3次元地図生成部110〉
 図5(A)は、フロアマップ300の例を示す平面図であり、図5(B)は、小規模な第1の3次元地図400で表された対象物の例を示す斜視図である。3次元地図生成部110は、例えば、SLAMなどを用いて1つ以上の第1の3次元地図を生成する。第1の3次元地図400は、フロア上の対象物及びその周辺の小規模な領域の3次元地図である。フロアマップ300には、対象物の配置場所である対応領域301~305が描かれている。例えば、第1の3次元地図400は、回転、平行移動、及びスケール調整のいずれか1つ以上が実行され、対応領域303に当てはめられる。
<3D map generator 110>
FIG. 5A is a plan view showing an example of the floor map 300, and FIG. 5B is a perspective view showing an example of an object represented by a small-scale first three-dimensional map 400. .. The three-dimensional map generation unit 110 generates one or more first three-dimensional maps by using, for example, SLAM or the like. The first three-dimensional map 400 is a three-dimensional map of an object on the floor and a small area around it. Corresponding areas 301 to 305, which are locations for arranging objects, are drawn on the floor map 300. For example, the first 3D map 400 is fitted to the corresponding area 303 by performing any one or more of rotation, translation, and scale adjustment.

〈フロアマップ登録部120〉
 図6(A)は、第1の3次元地図400のxyz軸周りの回転を示す斜視図であり、図6(B)は、第1の3次元地図400の地面に平行なフロアマップ300の法線周りの回転(すなわち、y軸周りの±Ry方向の回転)を示す平面図である。実施の形態1では、フロアマップ登録部120は、図6(B)に示されるように、対応領域301~305が描かれているフロアマップ300を、例えば、外部の記憶装置から取得する。フロアマップ登録部120は、対応領域301~305が描かれているフロアマップ300を予め記憶する記憶部を有してもよい。或いは、フロアマップ登録部120は、対応領域301~305が描かれているフロアマップ300を、操作入力部からのユーザ入力操作に従って、外部の記憶装置又はネットワーク上のサーバから取得してもよい。
<Floor map registration unit 120>
FIG. 6A is a perspective view showing the rotation of the first 3D map 400 around the xyz axis, and FIG. 6B is a floor map 300 parallel to the ground of the first 3D map 400. It is a top view which shows the rotation around a normal line (that is, the rotation around the y-axis in the ± Ry direction). In the first embodiment, as shown in FIG. 6B, the floor map registration unit 120 acquires the floor map 300 on which the corresponding areas 301 to 305 are drawn, for example, from an external storage device. The floor map registration unit 120 may have a storage unit that stores in advance the floor map 300 on which the corresponding areas 301 to 305 are drawn. Alternatively, the floor map registration unit 120 may acquire the floor map 300 on which the corresponding areas 301 to 305 are drawn from an external storage device or a server on the network according to a user input operation from the operation input unit.

 例えば、フロアマップ300の中から、小規模な第1の3次元地図として再構成された対応領域303が、ユーザ操作によって指定される。この際、図6(A)に示されるように、xyz軸周りの3つの回転調整は、ユーザにとって煩雑な作業であるため、フロアマップ登録部120は、図6(B)に示されるように、フロアマップ登録に必要な1つの回転軸周りの調整(例えば、y軸周りの±Ry方向の回転)を自動的に選択して、不要な回転軸の調整(例えば、x軸周りの±Rx方向の回転及びz軸周りの±Rz方向の回転)を無効化する処理を行ってもよい。この処理を加えることで、3次元的な回転を調整することなく、ユーザは、フロアマップ300上の指定の対応領域(例えば、303)に第1の3次元地図(例えば、400)を登録することが可能である。例えば、ユーザ操作によって第1の3次元地図の地面の法線周りの回転角度と平行移動量とを指定することで、フロアマップ登録部120は、1つ以上の第1の3次元地図をフロアマップ300に登録する。 For example, from the floor map 300, the corresponding area 303 reconstructed as a small-scale first three-dimensional map is designated by a user operation. At this time, as shown in FIG. 6 (A), since the three rotation adjustments around the xyz axis are complicated tasks for the user, the floor map registration unit 120 is as shown in FIG. 6 (B). , Automatically select the adjustment around one axis of rotation required for floor map registration (eg, rotation around the y-axis in the ± Ry direction) and make unnecessary adjustments around the axis of rotation (eg, ± Rx around the x-axis). The process of invalidating the rotation in the direction and the rotation in the ± Rz direction around the z-axis may be performed. By adding this process, the user registers the first three-dimensional map (for example, 400) in the designated corresponding area (for example, 303) on the floor map 300 without adjusting the three-dimensional rotation. It is possible. For example, by specifying the rotation angle and the amount of translation of the first three-dimensional map around the normal of the ground by user operation, the floor map registration unit 120 floors one or more first three-dimensional maps. Register in map 300.

 図7は、図4に示されるフロアマップ登録部120の構成を概略的に示す機能ブロック図である。フロアマップ登録部120は、地面検出部121と、外部パラメータ計算部122と、外部パラメータ入力部123と、外部パラメータ作用部124とを有する。 FIG. 7 is a functional block diagram schematically showing the configuration of the floor map registration unit 120 shown in FIG. The floor map registration unit 120 includes a ground detection unit 121, an external parameter calculation unit 122, an external parameter input unit 123, and an external parameter action unit 124.

 地面検出部121は、3次元地図に基づいて地面を検出する。地面検出方法の例として、ロバスト推定のアルゴリズムの1つであるRandom Sample Consensus(RANSAC)などを用いる方法がある。平面の係数(a b c d)と、平面上の位置(x y z)の関係は、以下の式(1)で表される。 The ground detection unit 121 detects the ground based on a three-dimensional map. As an example of the ground detection method, there is a method using Random Sample Consensus (RANSAC), which is one of the algorithms for robust estimation. The relationship between the coefficient (ab c d) T of the plane and the position (x y z) T on the plane is expressed by the following equation (1).

Figure JPOXMLDOC01-appb-M000001
Figure JPOXMLDOC01-appb-M000001

 式(1)において、「・」は、内積を示す。RANSACを用いて求められた平面は、無限平面であるため、地面検出部121は、凸包(Convex hull)などを用いて平面の範囲を計算する。地面検出部121は、範囲が最も大きい平面を地面として検出する。 In equation (1), "・" indicates the inner product. Since the plane obtained by using RANSAC is an infinite plane, the ground detection unit 121 calculates the range of the plane using a convex hull or the like. The ground detection unit 121 detects the plane having the largest range as the ground.

 範囲が最も大きい平面が地面とは、限らないケースもあるため、地面検出部121は、以下の方法で地面を検出してもよい。例えば、地面検出部121は、ユーザ入力操作に基づいて、複数の平面の中から地面(すなわち、水平面)を検出してもよい。或いは、地面検出部121は、慣性計測装置(Inertial Measurement Unit:IMU)によって計測された加速度から重力方向を求め、法線が重力に近い平面を地面と判断してもよい。或いは、地面検出部121は、IMUによって計測された重力方向と、平面の面積の大きさを用いて地面を検出してもよい。 Since there are cases where the plane having the largest range is not necessarily the ground, the ground detection unit 121 may detect the ground by the following method. For example, the ground detection unit 121 may detect the ground (that is, a horizontal plane) from a plurality of planes based on a user input operation. Alternatively, the ground detection unit 121 may determine the direction of gravity from the acceleration measured by the inertial measurement unit (IMU), and may determine the plane whose normal line is close to gravity as the ground. Alternatively, the ground detection unit 121 may detect the ground using the direction of gravity measured by the IMU and the size of the area of the plane.

 外部パラメータ計算部122は、地面検出部121によって検出された地面とフロアマップ300との関係性から外部パラメータTを計算する。まず、外部パラメータ計算部122は、検出した地面の法線nと、フロアマップ300の法線nとから回転Rを求める。nとnは、以下の式(2)及び(3)で表される。 The external parameter calculation unit 122 calculates the external parameter T 1 from the relationship between the ground and the floor map 300 detected by the ground detection unit 121. First, the external parameter calculation unit 122 obtains the rotation R 1 from the detected normal line ng of the ground and the normal line n f of the floor map 300. ng and n f are represented by the following equations (2) and (3).

Figure JPOXMLDOC01-appb-M000002
Figure JPOXMLDOC01-appb-M000002

 フロアマップ300は、図8に示されるようなxyz直交座標系で示されると仮定する。外部パラメータ計算部122は、回転Rを、式(2)に示されるnのx方向に対する偏角θと式(3)に示されるnのz方向に対する偏角θとを用いて計算する。偏角θ、θを用いて求めた回転行列をそれぞれR、Rとすると、これらは、式(4)で示され、回転Rは式(5)で示される。 It is assumed that the floor map 300 is shown in the xyz Cartesian coordinate system as shown in FIG. External parameter calculation unit 122, a rotation R 1, using the polarization angle theta z for z-direction of n f shown in declination theta x and equation (3) with respect to the x-direction of n g represented by the formula (2) To calculate. Assuming that the rotation matrices obtained by using the declinations θ x and θ z are R x and R z , respectively, these are represented by the equation (4), and the rotation R 1 is represented by the equation (5).

Figure JPOXMLDOC01-appb-M000003
Figure JPOXMLDOC01-appb-M000003

 外部パラメータ計算部122は、式(5)で回転Rを求めた後、平面上の一点のベクトルxから

Figure JPOXMLDOC01-appb-M000004
を計算する。 The external parameter calculation unit 122 obtains the rotation R 1 by the equation (5), and then uses the vector x p of one point on the plane.
Figure JPOXMLDOC01-appb-M000004
To calculate.

 高さ方向に相当するy座標であるhを平行移動量とすると、外部パラメータTは、回転R及びベクトルtから、以下の式(6)によって表される。

Figure JPOXMLDOC01-appb-M000005
When the h y and translation amount is y-coordinate, corresponding to the height direction, the external parameter T 1 from the rotation R 1 and vector t 1, represented by the following equation (6).
Figure JPOXMLDOC01-appb-M000005

 式(6)において、ベクトルtは、回転R及びベクトルxの高さ方向のみを取り出したベクトルであり、以下の式(7)のように表される。

Figure JPOXMLDOC01-appb-M000006
In the equation (6), the vector t 1 is a vector obtained by extracting only the rotation R 1 and the vector x p in the height direction, and is expressed by the following equation (7).
Figure JPOXMLDOC01-appb-M000006

 外部パラメータ入力部123は、ユーザ入力を受け付ける。外部パラメータ入力部123におけるユーザ入力により、一部の外部パラメータが入力される。外部パラメータ入力部123に入力される外部パラメータは、例えば、図8に示されるフロアマップ300の座標系においてy軸周りの回転Rと、xz平面における平行移動量tである。これらから、外部パラメータ入力部123は、外部パラメータTを以下の式(8)によって取得する。また、ユーザ入力を回転の角度θとすると、回転Rは以下の式(9)によって表される。 The external parameter input unit 123 accepts user input. Some external parameters are input by user input in the external parameter input unit 123. The external parameters input to the external parameter input unit 123 are, for example, the rotation R 2 around the y-axis in the coordinate system of the floor map 300 shown in FIG. 8 and the translation amount t 2 in the xz plane. From these, the external parameter input unit 123 acquires the external parameter T 2 by the following equation (8). Further, assuming that the user input is the rotation angle θ y , the rotation R 2 is expressed by the following equation (9).

Figure JPOXMLDOC01-appb-M000007
Figure JPOXMLDOC01-appb-M000007

 なお、図8において、tは、フロアマップ300の左下の点を基準(すなわち、原点)とした平行移動量である。 In FIG. 8, t 2 is a translation amount with reference to the lower left point of the floor map 300 (that is, the origin).

 y軸周りの回転の角度θは、地磁気センサ25を用いて取得してもよい。角度θの値は、地磁気センサ25から自動的に入力される。或いは、角度θは、地磁気センサ25の検出値を初期値としてユーザが入力してもよい。 The rotation angle θ around the y-axis may be acquired by using the geomagnetic sensor 25. The value of the angle θ is automatically input from the geomagnetic sensor 25. Alternatively, the angle θ may be input by the user with the detection value of the geomagnetic sensor 25 as the initial value.

 角度θ及び平行移動量tの入力方法は、Graphics User Interface(GUI)の操作、キーボードなどからの数値入力のいずれであってもよい。また、平行移動量tの原点は、図8に示されるフロアマップの左下の点と異なる点であってもよい。 The input method of the angle θ and the translation amount t 2 may be either the operation of the Graphics User Interface (GUI) or the numerical input from the keyboard or the like. Further, the origin of the translation amount t 2 may be a point different from the lower left point of the floor map shown in FIG.

 外部パラメータ作用部124は、外部パラメータ計算部122で計算された外部パラメータ及び外部パラメータ入力部123から入力された外部パラメータを3次元地図に対して作用させる。3次元地図で管理している点群に対して行う外部パラメータ作用は、以下の式(10)に示される。ここで、ベクトルpは、作用前の点群の一点、ベクトルp′は、作用後の点を示す。 The external parameter action unit 124 causes the external parameter calculated by the external parameter calculation unit 122 and the external parameter input from the external parameter input unit 123 to act on the three-dimensional map. The external parameter action performed on the point cloud managed by the three-dimensional map is shown in the following equation (10). Here, the vector p indicates one point in the point cloud before the action, and the vector p'indicates the point after the action.

Figure JPOXMLDOC01-appb-M000008
Figure JPOXMLDOC01-appb-M000008

〈範囲分割部130〉
 フロアマップに、互いに同じ形状で同じ模様の複数の対象物が並んでいる場合、目標の対象物の位置及び姿勢の推定が失敗する可能性がある。例えば、同じ形状で同じ模様の複数の対象物B1~B6が並んでいる環境では、対象物B1をスキャンして位置及び姿勢を推定する場合、間違って対象物B2~B6のいずれかの位置及び姿勢が出力される可能性がある。誤推定を避けるために、位置及び姿勢の推定時には、まず、ユーザがいる範囲を指定することが望ましい。範囲を指定することで、範囲内に含まれている対象物の3次元地図400を元に、詳細な位置及び姿勢を推定することが可能になる。
<Range division 130>
When a plurality of objects having the same shape and the same pattern are lined up on the floor map, the estimation of the position and orientation of the target objects may fail. For example, in an environment where a plurality of objects B1 to B6 having the same shape and the same pattern are lined up, when the position and posture are estimated by scanning the object B1, the position and position of any of the objects B2 to B6 and Posture may be output. In order to avoid erroneous estimation, it is desirable to first specify the range in which the user is when estimating the position and posture. By designating the range, it is possible to estimate the detailed position and posture based on the three-dimensional map 400 of the object included in the range.

 図9は、図4に示される範囲分割部130によって実行される範囲分割処理を示す図である。範囲分割部130は、例えば、フロアマップ300で表される1つのフロアを複数の範囲(例えば、範囲#1~#4)に分割する。範囲分割部130は、例えば、ビルの1つのフロアを複数の範囲#1~#4に静的に分割し、複数の範囲#1~#4の各々における3次元地図を管理する。範囲分割部130は、例えば、図9に示されるように、1つのフロアマップ300を4つの範囲#1~#4に分割する。このように、1つのフロアマップ300を複数の範囲に分割し、同じ模様及び同じ形状を持つ複数の対象物が互いに異なる範囲に属するようにして3次元地図を管理すれば、目標の対象物を、これと同じ形状で同じ模様の他の対象物であると誤って誤推定する可能性は低下する。しかし、この場合には、誤推定が発生する可能性を十分に低減できない。 FIG. 9 is a diagram showing a range division process executed by the range division unit 130 shown in FIG. The range dividing unit 130 divides one floor represented by the floor map 300 into a plurality of ranges (for example, ranges # 1 to # 4). For example, the range dividing unit 130 statically divides one floor of a building into a plurality of ranges # 1 to # 4, and manages a three-dimensional map in each of the plurality of ranges # 1 to # 4. The range dividing unit 130 divides one floor map 300 into four ranges # 1 to # 4, for example, as shown in FIG. In this way, if one floor map 300 is divided into a plurality of ranges and the three-dimensional map is managed so that a plurality of objects having the same pattern and the same shape belong to different ranges, the target object can be obtained. , The possibility of falsely presuming that it is another object with the same shape and the same pattern is reduced. However, in this case, the possibility of erroneous estimation cannot be sufficiently reduced.

 このような理由から、範囲分割部130は、動的な範囲分割を行うことが望ましい。図10は、図4に示される範囲分割部130の構成を概略的に示す機能ブロック図である。範囲分割部130は、画像特徴算出部131と、クラスタリング部132とを有する。 For this reason, it is desirable that the range dividing unit 130 dynamically divides the range. FIG. 10 is a functional block diagram schematically showing the configuration of the range dividing portion 130 shown in FIG. The range dividing unit 130 includes an image feature calculation unit 131 and a clustering unit 132.

 画像特徴算出部131は、3次元地図を生成するときにカメラ撮影した画像の類似度に基づいて画像の特徴を算出する。画像特徴算出部131は、例えば、Bag of Words(BoW)を用いて範囲分割のための処理を行う。画像特徴算出部131は、対象物の第1の3次元地図の各々のカメラ撮影画像に対してBoWを使って画像をベクトル化する。画像特徴算出部131は、ベクトル同士の距離が近ければ「類似している」と判断する。 The image feature calculation unit 131 calculates the features of the image based on the similarity of the images taken by the camera when generating the three-dimensional map. The image feature calculation unit 131 performs a process for range division using, for example, Bag of Words (BoW). The image feature calculation unit 131 vectorizes an image taken by each camera of the first three-dimensional map of the object using BoW. The image feature calculation unit 131 determines that the vectors are "similar" if they are close to each other.

 クラスタリング部132は、画像特徴算出部131の結果を元にクラスタリングを行い、範囲分割する。つまり、クラスタリング部132は、これらのベクトルが互いに異なる範囲に属するように、範囲分割処理を行う。 The clustering unit 132 performs clustering based on the result of the image feature calculation unit 131 and divides the range. That is, the clustering unit 132 performs the range division process so that these vectors belong to different ranges.

〈範囲指定部140〉
 図11は、実施の形態1に係る3次元地図作成装置100の範囲指定部140を用いた範囲指定操作時の画面を示す図である。範囲指定部140は、範囲分割部130によって決定され複数の範囲の中から、1つの範囲を指定する。範囲指定部140は、例えば、ユーザ操作部からのユーザ操作によって選択された1つの範囲を指定する。ユーザ操作部は、範囲指定部140に接続された装置又は範囲指定部140の一部である。ユーザ操作部は、例えば、図11に示されるようなタブレットPCのタッチパネルである。ユーザが、タッチパネルに表示された複数の範囲の中から、タップ操作などよって1つの範囲を選択すると、範囲指定部140は、選択された範囲を指定する。
<Range specification unit 140>
FIG. 11 is a diagram showing a screen at the time of a range designation operation using the range designation unit 140 of the three-dimensional map creation device 100 according to the first embodiment. The range designation unit 140 is determined by the range division unit 130 and designates one range from a plurality of ranges. The range designation unit 140 specifies, for example, one range selected by a user operation from the user operation unit. The user operation unit is a device connected to the range designation unit 140 or a part of the range designation unit 140. The user operation unit is, for example, a touch panel of a tablet PC as shown in FIG. When the user selects one range from the plurality of ranges displayed on the touch panel by tap operation or the like, the range designation unit 140 specifies the selected range.

〈位置姿勢推定部150〉
 位置姿勢推定部150は、範囲指定部140によって指定された範囲に含まれる3次元地図を用いて、対象物の位置及び姿勢を推定する。位置及び姿勢を推定する技術の例として、BoWとPerspective N Points(PnP)とを組み合わせる方法がある。
<Position and orientation estimation unit 150>
The position / orientation estimation unit 150 estimates the position and orientation of the object using the three-dimensional map included in the range designated by the range designation unit 140. As an example of the technique for estimating the position and the posture, there is a method of combining BoW and Perceptive N Points (PnP).

《1-2》動作
 図12(A)は、3次元地図生成処理を示すフローチャートであり、図12(B)は、位置姿勢推定処理を示すフローチャートである。まず、図12(A)に示されるように、3次元地図生成部110は、対象物毎に小規模な第1の3次元地図を生成する(ステップS11、S12)。次に、フロアマップ登録部120は、第1の3次元地図をフロアマップ300に登録する(ステップS13)。ステップS11~S13の処理は、対象物の個数分、繰り返され、その結果、大規模の第2の3次元地図が生成される。全ての対象物の登録が終了した後、範囲分割部130は、第2の3次元地図を複数の範囲に分割する(ステップS14)。
<< 1-2 >> Operation FIG. 12A is a flowchart showing a three-dimensional map generation process, and FIG. 12B is a flowchart showing a position / orientation estimation process. First, as shown in FIG. 12A, the three-dimensional map generation unit 110 generates a small-scale first three-dimensional map for each object (steps S11 and S12). Next, the floor map registration unit 120 registers the first three-dimensional map in the floor map 300 (step S13). The processes of steps S11 to S13 are repeated for the number of objects, and as a result, a large-scale second three-dimensional map is generated. After the registration of all the objects is completed, the range dividing unit 130 divides the second three-dimensional map into a plurality of ranges (step S14).

 次に、図12(B)に示されるように、3次元地図を用いた位置及び姿勢の推定が行われる。まず、範囲指定部140は、ユーザ操作に基づいて、ユーザがいるおおよその範囲である指定範囲を選択する(ステップS21)。その後、位置姿勢推定部150は、指定範囲内の3次元地図を使って位置及び姿勢の推定を行う(ステップS22)。 Next, as shown in FIG. 12B, the position and posture are estimated using the three-dimensional map. First, the range designation unit 140 selects a designated range, which is an approximate range in which the user is present, based on the user operation (step S21). After that, the position / orientation estimation unit 150 estimates the position and orientation using the three-dimensional map within the designated range (step S22).

 図13は、3次元地図作成装置100のフロアマップ登録部120によって実行されるフロアマップ登録処理(ステップS13)を示すフローチャートである。まず、地面検出部121は、地面の検出を行う(ステップS131~S135)。例えば、地面検出部121は、複数の平面を検出した後(ステップS131~S133)、その中から地面とみなすことができる最大の平面を求めることで地面を検出する(ステップS134、S135)。その後、外部パラメータ計算部122は、地面とフロアマップ300の関係を使った移動量を算出し、外部パラメータ入力部123はユーザ操作で入力された平行移動量と地面法線周りの回転量とから移動量を求める(ステップS136、S137)。次に、外部パラメータ作用部124は、これら2つの移動量である外部パラメータを3次元地図に作用させる(ステップS138)。 FIG. 13 is a flowchart showing a floor map registration process (step S13) executed by the floor map registration unit 120 of the three-dimensional map creation device 100. First, the ground detection unit 121 detects the ground (steps S131 to S135). For example, the ground detection unit 121 detects the ground by detecting a plurality of planes (steps S131 to S133) and then finding the maximum plane that can be regarded as the ground (steps S134, S135). After that, the external parameter calculation unit 122 calculates the movement amount using the relationship between the ground and the floor map 300, and the external parameter input unit 123 is based on the parallel movement amount input by the user operation and the rotation amount around the ground normal. The amount of movement is obtained (steps S136 and S137). Next, the external parameter action unit 124 causes the external parameters, which are the two movement amounts, to act on the three-dimensional map (step S138).

 図14は、3次元地図作成装置100の範囲分割部130によって実行される範囲分割処理を示すフローチャートである。まず、画像特徴算出部131は、BoWなどを用いて画像特徴を算出し、画像特徴が互いに類似する場合には、この情報をルールに追加する。3次元地図の最初のループ(ステップS141)のインデックスをi(ここで、iは0以上N以下の整数)、2番目のループ(ステップS143)のインデックスをj(ここで、jは0以上N以下の整数)とすると、画像特徴算出部131は、i<jのときに3次元地図の類似度を計算する(ステップS141~S147)。図15は、類似度を計算する場合と計算しない場合とを説明するための図である。つまり、図15の白色のときに類似度を計算し、斜線範囲のときに類似度を計算しない。 FIG. 14 is a flowchart showing a range division process executed by the range division unit 130 of the three-dimensional map creation device 100. First, the image feature calculation unit 131 calculates the image features using BoW or the like, and when the image features are similar to each other, adds this information to the rule. The index of the first loop (step S141) of the three-dimensional map is i (where i is an integer of 0 or more and N or less), and the index of the second loop (step S143) is j (where j is 0 or more and N). (The following integers), the image feature calculation unit 131 calculates the similarity of the three-dimensional map when i <j (steps S141 to S147). FIG. 15 is a diagram for explaining a case where the similarity is calculated and a case where the similarity is not calculated. That is, the similarity is calculated when it is white in FIG. 15, and the similarity is not calculated when it is in the shaded area.

 図16は、3次元地図作成装置100の範囲分割部130によって実行される類似判定処理を示すフローチャートである。図16に示されるように、範囲分割部130の画像特徴算出部131は、まず、2つの画像特徴の距離を計算する。ここでの画像特徴を、ベクトルv、ベクトルvとすると、範囲分割部130は、例えば、以下の式(11)で表されるユークリッド距離を、2つの画像特徴の距離として用いる。 FIG. 16 is a flowchart showing a similarity determination process executed by the range dividing unit 130 of the three-dimensional map creating device 100. As shown in FIG. 16, the image feature calculation unit 131 of the range dividing section 130 first calculates the distance between the two image features. Assuming that the image features here are vector v 1 and vector v 2 , the range dividing unit 130 uses, for example, the Euclidean distance represented by the following equation (11) as the distance between the two image features.

Figure JPOXMLDOC01-appb-M000009
Figure JPOXMLDOC01-appb-M000009

 画像特徴算出部131は、2つの画像特徴の距離の計算を計算し(ステップS1403)、最小距離を求める処理(ステップS1404、S1405)を3次元地図の数であるN(j)回繰り返し(ステップST1401)、さらに、ステップS1401~S1404の処理を3次元地図の数であるN(i)回繰り返す(ステップST1400)。画像特徴算出部131は、画像特徴の最小距離を求めた後、最小距離がα未満の場合(ステップS1405においてtrue)、類似と判定して処理を終了し、最小距離がα以上の場合(ステップS1405においてfalse)、非類似と判定して処理を終了する。ここでのαは、予め設定された値であり、例えば、開発者又はユーザなどが設定するパラメータである。 The image feature calculation unit 131 calculates the calculation of the distance between the two image features (step S1403), and repeats the process of obtaining the minimum distance (steps S1404 and S1405) N (j) times, which is the number of three-dimensional maps (step). ST1401), and further, the processing of steps S1401 to S1404 is repeated N (i) times, which is the number of three-dimensional maps (step ST1400). After obtaining the minimum distance of the image feature, the image feature calculation unit 131 determines that the minimum distance is less than α (true in step S1405), determines that it is similar, and ends the process, and if the minimum distance is α or more (step). In S1405, false), it is determined that they are dissimilar, and the process is terminated. Here, α is a preset value, and is, for example, a parameter set by a developer, a user, or the like.

 上記の処理でルールが決定した後、クラスタリング部132は、ルールに基づいてクラスタリングを実行する(図14におけるステップS148~S151)。図14は、階層型クラスタリングである分割型クラスタリング(Divisive Clustering)を用いてクラスタリングを行う例を示す。まず、クラスタリング部132は、分割対象の3次元地図を範囲分割する。ここで分割する3次元地図をC1,C2とする。クラスタリング部132は、3次元地図C1の中心位置に近い他の3次元地図は、C1のクラスに属し、3次元地図C2の中心位置に近い他の3次元地図は、C2のクラスに属するように、範囲分割を行う。クラスタリング部132は、このような3次元地図の範囲分割を繰り返して、最終的な範囲分割を行う。 After the rule is determined by the above process, the clustering unit 132 executes clustering based on the rule (steps S148 to S151 in FIG. 14). FIG. 14 shows an example in which clustering is performed using divided clustering, which is hierarchical clustering. First, the clustering unit 132 divides the range of the three-dimensional map to be divided. Here, the three-dimensional maps to be divided are C1 and C2. In the clustering unit 132, other 3D maps near the center position of the 3D map C1 belong to the C1 class, and other 3D maps near the center position of the 3D map C2 belong to the C2 class. , Perform range division. The clustering unit 132 repeats such a range division of the three-dimensional map to perform the final range division.

《1-3》効果
 以上に説明したように、実施の形態1に係る3次元地図作成装置100は、小規模な1つ以上の第1の3次元地図をSLAMを用いて作成し、フロアマップ300上に1つ以上の第1の3次元地図を配置することによって大規模な第2の3次元地図(例えば、屋内環境全体の3次元地図)を作成している。第1の3次元地図を作成するときに、SLAMによって蓄積する位置誤差は小さいので、第2の3次元地図における位置誤差は抑制される。したがって、実施の形態1に係る3次元地図作成装置100は、位置誤差の小さい大規模な3次元地図を作製することができる。
<< 1-3 >> Effect As described above, the three-dimensional map creating device 100 according to the first embodiment creates one or more small-scale first three-dimensional maps using SLAM, and creates a floor map. By arranging one or more first three-dimensional maps on the 300, a large-scale second three-dimensional map (for example, a three-dimensional map of the entire indoor environment) is created. Since the position error accumulated by SLAM when creating the first three-dimensional map is small, the position error in the second three-dimensional map is suppressed. Therefore, the three-dimensional map creating device 100 according to the first embodiment can produce a large-scale three-dimensional map having a small position error.

《2》実施の形態2
《2-1》構成
 図17は、実施の形態2に係る3次元地図作成装置101の構成を概略的に示す機能ブロック図である。図17において、図4に示される構成要素と同一又は対応する構成要素には、図4に示される符号と同じ符号が付されている。実施の形態2に係る3次元地図作成装置101は、コンテンツ登録部160及びコンテンツ重畳表示部170を有する点において、実施の形態1に係る3次元地図作成装置100と異なる。なお、3次元地図作成装置101のHW構成は、図3に示されるものと同じである。
<< 2 >> Embodiment 2
<< 2-1 >> Configuration FIG. 17 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 101 according to the second embodiment. In FIG. 17, components that are the same as or correspond to the components shown in FIG. 4 are designated by the same reference numerals as those shown in FIG. The three-dimensional map creation device 101 according to the second embodiment is different from the three-dimensional map creation device 100 according to the first embodiment in that it has a content registration unit 160 and a content superimposition display unit 170. The HW configuration of the three-dimensional map creating device 101 is the same as that shown in FIG.

 コンテンツ登録部160は、1つ以上の対象物の各々の第1の3次元地図を用いて、各対象物を表示する画像に重ねて表示するコンテンツの位置を登録する。コンテンツ重畳表示部170は、カメラ22で撮影した画像に、登録されているコンテンツを重畳してディスプレイ30に表示させる。コンテンツは、例えば、線若しくは平面などの図形と対象物を示す情報、立方体若しくは球などの3次元の図形と対象物を示す情報、又はこれらの組み合わせなどを含むことができる。 The content registration unit 160 registers the position of the content to be displayed on the image displaying each object by using the first three-dimensional map of each of the one or more objects. The content superimposition display unit 170 superimposes the registered content on the image taken by the camera 22 and displays it on the display 30. The content can include, for example, information indicating a figure such as a line or a plane and an object, information indicating a three-dimensional figure such as a cube or a sphere and an object, or a combination thereof.

《2-2》動作
 図18(A)は、3次元地図作成装置101が行う3次元地図生成処理を示すフローチャートであり、図18(B)は、3次元地図作成装置101がコンテンツ重畳表示のために行う処理を示すフローチャートである。図18(A)及び(B)において、図12(A)及び(B)と同じ処理ステップには、図12(A)及び(B)の処理ステップにおける符号と同じ符号が付されている。図18(A)に示されるように、実施の形態2に係る3次元地図作成装置101は、各対象物を表示する画像に重ねて表示するコンテンツの位置を登録するコンテンツ登録処理(ステップS15)を行う点において、実施の形態1に係る3次元地図作成装置100と異なる。また、図18(B)に示されるように、実施の形態2に係る3次元地図作成装置101は、コンテンツ登録処理で登録されたコンテンツを各対象物を表示する画像に重ねて表示するコンテンツ表示(ステップS23)を行う点において、実施の形態1に係る3次元地図作成装置100と異なる。
<< 2-2 >> Operation FIG. 18 (A) is a flowchart showing a three-dimensional map generation process performed by the three-dimensional map creation device 101, and FIG. 18 (B) shows a content superimposed display by the three-dimensional map creation device 101. It is a flowchart which shows the process to perform for this. In FIGS. 18A and 18B, the same processing steps as those in FIGS. 12A and 12B are designated by the same reference numerals as those in the processing steps of FIGS. 12A and 12B. As shown in FIG. 18A, the three-dimensional map creation device 101 according to the second embodiment is a content registration process (step S15) for registering the position of the content to be displayed superimposed on the image displaying each object. This is different from the three-dimensional map creating device 100 according to the first embodiment. Further, as shown in FIG. 18B, the three-dimensional map creation device 101 according to the second embodiment displays the content registered in the content registration process by superimposing it on the image displaying each object. (Step S23) is different from the three-dimensional map creating device 100 according to the first embodiment.

《2-3》効果
 以上に説明したように、実施の形態2に係る3次元地図作成装置101は、小規模な1つ以上の第1の3次元地図をSLAMを用いて作成し、フロアマップ300上に1つ以上の第1の3次元地図を配置することによって大規模な第2の3次元地図(例えば、屋内環境全体の3次元地図)を作成している。第1の3次元地図を作成するときに、SLAMによって蓄積する位置誤差は小さいので、第2の3次元地図における位置誤差は抑制される。したがって、実施の形態2に係る3次元地図作成装置101は、図2(A)に示されるように、コンテンツを適切な位置に重畳表示することができる。
<< 2-3 >> Effect As described above, the three-dimensional map creating device 101 according to the second embodiment creates one or more small-scale first three-dimensional maps using SLAM, and creates a floor map. By arranging one or more first three-dimensional maps on the 300, a large-scale second three-dimensional map (for example, a three-dimensional map of the entire indoor environment) is created. Since the position error accumulated by SLAM when creating the first three-dimensional map is small, the position error in the second three-dimensional map is suppressed. Therefore, the three-dimensional map creating device 101 according to the second embodiment can superimpose and display the contents at appropriate positions as shown in FIG. 2 (A).

 上記以外の点に関し、実施の形態2は、実施の形態1と同じである。 Regarding points other than the above, the second embodiment is the same as the first embodiment.

《3》実施の形態3
《3-1》構成
 図19は、実施の形態3に係る3次元地図作成装置102の構成を概略的に示す機能ブロック図である。図19において、図4に示される構成要素と同一又は対応する構成要素には、図4に示される符号と同じ符号が付されている。実施の形態3に係る3次元地図作成装置102は、相対移動距離推定部180を有する点において、実施の形態1に係る3次元地図作成装置100と異なる。なお、3次元地図作成装置102のHW構成は、図3に示されるものと同じである。
<< 3 >> Embodiment 3
<< 3-1 >> Configuration FIG. 19 is a functional block diagram schematically showing the configuration of the three-dimensional map creating device 102 according to the third embodiment. In FIG. 19, components that are the same as or correspond to the components shown in FIG. 4 are designated by the same reference numerals as those shown in FIG. The three-dimensional map creation device 102 according to the third embodiment is different from the three-dimensional map creation device 100 according to the first embodiment in that it has a relative movement distance estimation unit 180. The HW configuration of the three-dimensional map creating device 102 is the same as that shown in FIG.

 実施の形態1において説明したように、位置姿勢推定部150は、対象物及びその周辺における小規模な第1の3次元地図に基づいて、対象物の位置及び姿勢を推定する。しかし、第1の3次元地図から離れた場所に存在する物体の位置及び姿勢を推定するためには、位置及び姿勢の推定が成功した場所からの相対的な移動距離を求める必要がある。実施の形態3に係る3次元地図作成装置102は、相対移動距離推定部180を設けることによって、相対的な移動距離を算出する機能を有している。 As described in the first embodiment, the position / orientation estimation unit 150 estimates the position and orientation of the object based on a small-scale first three-dimensional map in and around the object. However, in order to estimate the position and orientation of an object existing at a location away from the first three-dimensional map, it is necessary to obtain the relative movement distance from the location where the estimation of the position and orientation is successful. The three-dimensional map creation device 102 according to the third embodiment has a function of calculating the relative movement distance by providing the relative movement distance estimation unit 180.

《3-2》動作
 図20(A)は、3次元地図作成装置102が行う3次元地図生成処理を示すフローチャートであり、図20(B)は、3次元地図作成装置102が行う位置姿勢推定処理を示すフローチャートである。図20(A)及び(B)において、図12(A)及び(B)と同じ処理ステップには、図12(A)及び(B)の処理ステップにおける符号と同じ符号が付されている。図20(B)に示されるように、実施の形態3に係る3次元地図作成装置102は、相対的な移動距離を求める処理(ステップS24、S25)を行う点において、実施の形態1に係る3次元地図作成装置100と異なる。
<< 3-2 >> Operation FIG. 20A is a flowchart showing a three-dimensional map generation process performed by the three-dimensional map creation device 102, and FIG. 20B is a position / orientation estimation performed by the three-dimensional map creation device 102. It is a flowchart which shows the process. In FIGS. 20A and 20B, the same processing steps as those in FIGS. 12A and 12B are designated by the same reference numerals as those in the processing steps of FIGS. 12A and 12B. As shown in FIG. 20B, the three-dimensional map creating device 102 according to the third embodiment relates to the first embodiment in that it performs a process (steps S24, S25) for obtaining a relative moving distance. It is different from the three-dimensional map creating device 100.

 相対的な移動距離の推定には、例えば、以下の方法が使用可能である。第1の方法は、SLAMを用いる方法である。これは、カメラで撮影した画像、又は距離センサ21で検出されたセンサデータ、又はこれらの両方を使って移動量を求める方法である。これは、例えば、時間方向に並ぶ2つのフレーム間の移動量を、順次積算することによって移動量を求める方法である。 For example, the following method can be used to estimate the relative travel distance. The first method is a method using SLAM. This is a method of obtaining the movement amount by using the image taken by the camera, the sensor data detected by the distance sensor 21, or both of them. This is, for example, a method of obtaining the movement amount by sequentially integrating the movement amounts between two frames arranged in the time direction.

 第2の方法は、カメラで撮影した画像及び距離センサ21で検出されたセンサデータに、ジャイロセンサ23、加速度センサ24、地磁気センサ25のいずれか1つ以上を使ったデッドレコニング(Dead Reckoning)を組み合わせた方法である。Dead Reckoningは、ジャイロセンサ23、加速度センサ24、地磁気センサ25などを使って移動量を求める方法である。この方法は、加速度の積分とジャイロセンサ23により移動速度と移動方向を求め、速度を積分することで移動距離を求める方法である。 The second method is dead reckoning using any one or more of the gyro sensor 23, the acceleration sensor 24, and the geomagnetic sensor 25 on the image taken by the camera and the sensor data detected by the distance sensor 21. It is a combined method. Dead reckoning is a method of obtaining a movement amount using a gyro sensor 23, an acceleration sensor 24, a geomagnetic sensor 25, or the like. In this method, the moving speed and the moving direction are obtained by integrating the acceleration and the gyro sensor 23, and the moving distance is obtained by integrating the speeds.

 第3の方法は、歩行者デッドレコニング(Pedestrian Dead Reckoning)を用いる方法である。第3の方法は、ジャイロセンサ23及び加速度センサ24のセンサデータから歩幅及び歩数を求め、これらから移動距離を求める方法である。 The third method is a method using pedestrian dead reckoning. The third method is a method of obtaining the stride length and the number of steps from the sensor data of the gyro sensor 23 and the acceleration sensor 24, and obtaining the moving distance from these.

《3-3》効果
 以上に説明したように、実施の形態3に係る3次元地図作成装置102は、小規模な1つ以上の第1の3次元地図をSLAMを用いて作成し、フロアマップ300上に1つ以上の第1の3次元地図を配置することによって大規模な1つの第2の3次元地図(例えば、屋内環境全体の3次元地図)を作成している。第1の3次元地図を作成するときにSLAMによって蓄積する位置誤差は小さいので、第2の3次元地図における位置誤差は低減される。したがって、実施の形態3に係る3次元地図作成装置102は、図2(A)に示されるように、コンテンツを適切な位置に重畳表示することができる。さらに、第1の3次元地図から離れた場所でも位置及び姿勢の推定ができる。
<< 3-3 >> Effect As described above, the three-dimensional map creating device 102 according to the third embodiment creates one or more small-scale first three-dimensional maps using SLAM, and creates a floor map. By arranging one or more first three-dimensional maps on the 300, one large-scale second three-dimensional map (for example, a three-dimensional map of the entire indoor environment) is created. Since the position error accumulated by SLAM when creating the first three-dimensional map is small, the position error in the second three-dimensional map is reduced. Therefore, the three-dimensional map creating device 102 according to the third embodiment can superimpose and display the contents at appropriate positions as shown in FIG. 2 (A). Further, the position and the posture can be estimated even at a place away from the first three-dimensional map.

 上記以外の点に関し、実施の形態3は、実施の形態1と同じである。また、相対移動距離推定部180を実施の形態2の構成に備えることも可能である。 Regarding points other than the above, the third embodiment is the same as the first embodiment. It is also possible to provide the relative movement distance estimation unit 180 in the configuration of the second embodiment.

 10 コンピュータ、 11 プロセッサ、 12 メモリ、 21 距離センサ、 22 カメラ、 23 ジャイロセンサ、 24 加速度センサ、 25 地磁気センサ、 30 ディスプレイ、 40 3次元地図DB、 100、101、102 3次元地図作成装置、 110 3次元地図生成部、 120 フロアマップ登録部、 121 地面検出部、 122 外部パラメータ計算部、 123 外部パラメータ入力部、 124 外部パラメータ作用部、 130 範囲分割部、 140 範囲指定部、 150 位置姿勢推定部、 160 コンテンツ登録部、 170 コンテンツ重畳表示部、 180 相対移動距離推定部、 300 フロアマップ。 10 computers, 11 processors, 12 memories, 21 distance sensors, 22 cameras, 23 gyro sensors, 24 acceleration sensors, 25 geomagnetic sensors, 30 displays, 40 3D map DB, 100, 101, 102 3D map creation devices, 110 3 Dimension map generation unit, 120 floor map registration unit, 121 ground detection unit, 122 external parameter calculation unit, 123 external parameter input unit, 124 external parameter action unit, 130 range division unit, 140 range specification unit, 150 position / orientation estimation unit, 160 content registration unit, 170 content superimposition display unit, 180 relative movement distance estimation unit, 300 floor map.

Claims (9)

 フロア上を移動するセンサによって検出されたセンサデータに基づいて、対象物を含む領域毎に第1の3次元地図を生成する3次元地図生成部と、
 前記フロアのフロアマップを取得し、前記3次元地図生成部によって生成された1つ以上の第1の3次元地図を前記フロアマップ上に配置することによって、前記1つ以上の第1の3次元地図を含む第2の3次元地図を生成するフロアマップ登録部と、
 を有することを特徴とする3次元地図作成装置。
A 3D map generator that generates a first 3D map for each area including an object based on sensor data detected by a sensor moving on the floor.
By acquiring the floor map of the floor and arranging the one or more first three-dimensional maps generated by the three-dimensional map generation unit on the floor map, the one or more first three dimensions are obtained. A floor map registration unit that generates a second 3D map including a map,
A three-dimensional map making device characterized by having.
 前記第2の3次元地図を複数の範囲に分割する範囲分割部をさらに有することを特徴とする請求項1に記載の3次元地図作成装置。 The three-dimensional map creating device according to claim 1, further comprising a range dividing portion for dividing the second three-dimensional map into a plurality of ranges.  前記複数の範囲のうちのいずれかを指定範囲として指定する範囲指定部と、
 前記1つ以上の第1の3次元地図のうちの前記指定範囲内に配置された第1の3次元地図を用いて、前記指定範囲内に存在する対象物の位置及び姿勢を推定する位置姿勢推定部と、
 をさらに有することを特徴とする請求項2に記載の3次元地図作成装置。
A range specification unit that specifies any of the plurality of ranges as a specified range, and
Position and orientation for estimating the position and orientation of an object existing in the designated range using the first three-dimensional map arranged within the designated range of the one or more first three-dimensional maps. Estimator and
The three-dimensional map creating apparatus according to claim 2, further comprising.
 前記フロアマップ登録部は、
 複数の平面を検出し、前記複数の平面のうちから前記フロアに対応する地面を検出する地面検出部と、
 前記地面を用いて前記センサの移動量に基づく第1の外部パラメータを算出する外部パラメータ計算部と、
 ユーザ入力によって取得された移動量に基づく第2の外部パラメータを生成する外部パラメータ入力部と、
 前記第1の外部パラメータ及び前記第2の外部パラメータに基づいて、前記第2の3次元地図を補正する外部パラメータ作用部と
 を有することを特徴とする請求項1から3のいずれか1項に記載の3次元地図作成装置。
The floor map registration department
A ground detection unit that detects a plurality of planes and detects the ground corresponding to the floor from the plurality of planes.
An external parameter calculation unit that calculates a first external parameter based on the amount of movement of the sensor using the ground.
An external parameter input unit that generates a second external parameter based on the movement amount acquired by user input,
The invention according to any one of claims 1 to 3, further comprising an external parameter action unit that corrects the second three-dimensional map based on the first external parameter and the second external parameter. The described three-dimensional map making device.
 前記範囲分割部は、
 前記第1の3次元地図の特徴の類似度に基づいて画像の特徴を算出する画像特徴算出部と、
 前記類似度に基づいて、前記範囲の分割を行うクラスタリング部と、
 を有することを特徴とする請求項2又は3に記載の3次元地図作成装置。
The range dividing portion is
An image feature calculation unit that calculates image features based on the similarity of features of the first three-dimensional map, and an image feature calculation unit.
A clustering unit that divides the range based on the similarity, and
The three-dimensional map making apparatus according to claim 2 or 3, wherein the three-dimensional map making apparatus is characterized by having.
 前記対象物を含む領域毎の第1の3次元地図を用いて、各対象物を表示する画像に重ねて表示するコンテンツの位置を登録するコンテンツ登録部と、
 カメラで撮影した画像に前記コンテンツを重畳してディスプレイに表示させコンテンツ重畳表示部と、
 をさらに有することを特徴とする請求項1から5のいずれか1項に記載の3次元地図作成装置。
Using the first three-dimensional map for each area including the object, a content registration unit that registers the position of the content to be displayed overlaid on the image displaying each object, and the content registration unit.
The content is superimposed on the image taken by the camera and displayed on the display.
The three-dimensional map creating apparatus according to any one of claims 1 to 5, further comprising.
 前記第1の3次元地図から離れた場所に存在する物体までの相対的な移動距離を算出する相対移動距離推定部をさらに有することを特徴とする請求項1から6のいずれか1項に記載の3次元地図作成装置。 The invention according to any one of claims 1 to 6, further comprising a relative movement distance estimation unit that calculates a relative movement distance to an object existing at a location away from the first three-dimensional map. 3D mapping device.  フロア上を移動するセンサによって検出されたセンサデータに基づいて、対象物を含む領域毎に第1の3次元地図を生成するステップと、
 前記フロアのフロアマップを取得し、生成された1つ以上の第1の3次元地図を前記フロアマップ上に配置することによって、前記1つ以上の第1の3次元地図を含む第2の3次元地図を生成するステップと、
 を有することを特徴とする3次元地図作成方法。
Based on the sensor data detected by the sensor moving on the floor, the step of generating the first 3D map for each area including the object, and
A second 3 that includes the one or more first 3D maps by acquiring the floor map of the floor and arranging the generated one or more first 3D maps on the floor map. Steps to generate a dimensional map and
A three-dimensional map creation method characterized by having.
 コンピュータに、
 フロア上を移動するセンサによって検出されたセンサデータに基づいて、対象物を含む領域毎に第1の3次元地図を生成する処理と、
 前記フロアのフロアマップを取得し、生成された1つ以上の第1の3次元地図を前記フロアマップ上に配置することによって、前記1つ以上の第1の3次元地図を含む第2の3次元地図を生成する処理と、
 を実行させることを特徴とする3次元地図作成プログラム。
On the computer
A process to generate a first 3D map for each area including an object based on the sensor data detected by the sensor moving on the floor.
A second 3 that includes the one or more first 3D maps by acquiring the floor map of the floor and arranging the generated one or more first 3D maps on the floor map. The process of generating a dimensional map and
A three-dimensional map creation program characterized by executing.
PCT/JP2019/047794 2019-12-06 2019-12-06 Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program Ceased WO2021111613A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020519464A JPWO2021111613A1 (en) 2019-12-06 2019-12-06 3D map creation device, 3D map creation method, and 3D map creation program
PCT/JP2019/047794 WO2021111613A1 (en) 2019-12-06 2019-12-06 Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program
TW109116874A TW202123157A (en) 2019-12-06 2020-05-21 Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/047794 WO2021111613A1 (en) 2019-12-06 2019-12-06 Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program

Publications (1)

Publication Number Publication Date
WO2021111613A1 true WO2021111613A1 (en) 2021-06-10

Family

ID=76221169

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/047794 Ceased WO2021111613A1 (en) 2019-12-06 2019-12-06 Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program

Country Status (3)

Country Link
JP (1) JPWO2021111613A1 (en)
TW (1) TW202123157A (en)
WO (1) WO2021111613A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023016267A (en) * 2021-07-21 2023-02-02 株式会社日立製作所 Maintenance support system and maintenance support device
WO2024075379A1 (en) 2022-10-03 2024-04-11 フィブイントラロジスティクス株式会社 Automated guided vehicle traveling system
JP2024532299A (en) * 2021-08-24 2024-09-05 グーグル エルエルシー System and method for generating a three-dimensional map of an indoor space - Patents.com

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI833122B (en) * 2021-10-22 2024-02-21 中光電智能機器人股份有限公司 Method and system for building a spatial static map

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004276168A (en) * 2003-03-14 2004-10-07 Japan Science & Technology Agency Map creation system for mobile robots
JP2010191066A (en) * 2009-02-17 2010-09-02 Mitsubishi Electric Corp Three-dimensional map correcting device and three-dimensional map correction program
JP2014146267A (en) * 2013-01-30 2014-08-14 Toyota Motor Corp Pedestrian detection device and driving support device
JP2014229020A (en) * 2013-05-21 2014-12-08 三菱電機ビルテクノサービス株式会社 Information providing device and information providing system
WO2018189770A1 (en) * 2017-04-10 2018-10-18 三菱電機株式会社 Map management device and autonomous mobile body control device
US20190129444A1 (en) * 2017-10-31 2019-05-02 Savioke, Inc. Computer system and method for automated indoor surveying by robots
JP2019174920A (en) * 2018-03-27 2019-10-10 株式会社日立ソリューションズ Article management system and article management program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6025557B2 (en) * 2012-12-27 2016-11-16 キヤノン株式会社 Image recognition apparatus, control method thereof, and program

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004276168A (en) * 2003-03-14 2004-10-07 Japan Science & Technology Agency Map creation system for mobile robots
JP2010191066A (en) * 2009-02-17 2010-09-02 Mitsubishi Electric Corp Three-dimensional map correcting device and three-dimensional map correction program
JP2014146267A (en) * 2013-01-30 2014-08-14 Toyota Motor Corp Pedestrian detection device and driving support device
JP2014229020A (en) * 2013-05-21 2014-12-08 三菱電機ビルテクノサービス株式会社 Information providing device and information providing system
WO2018189770A1 (en) * 2017-04-10 2018-10-18 三菱電機株式会社 Map management device and autonomous mobile body control device
US20190129444A1 (en) * 2017-10-31 2019-05-02 Savioke, Inc. Computer system and method for automated indoor surveying by robots
JP2019174920A (en) * 2018-03-27 2019-10-10 株式会社日立ソリューションズ Article management system and article management program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
IIJIMA, HIROKI ET AL.: "Path Control for an Electric Wheelchair Using Intersection Maps", SYSTEMS, CONTROL AND INFORMATION, vol. 59, no. 1, 15 January 2015 (2015-01-15), pages 12 - 21 *
NAGAI, GENKI ET AL: "3D Map Generation Based on Visual SLAM using a Drone", Lecture proceedingS", JSME ANNUAL CONFERENCE ON ROBOTICS AND MECHATRONICS), 8 June 2016 (2016-06-08) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023016267A (en) * 2021-07-21 2023-02-02 株式会社日立製作所 Maintenance support system and maintenance support device
JP2024532299A (en) * 2021-08-24 2024-09-05 グーグル エルエルシー System and method for generating a three-dimensional map of an indoor space - Patents.com
JP7713588B2 (en) 2021-08-24 2025-07-25 グーグル エルエルシー System and method for generating a three-dimensional map of an indoor space - Patents.com
WO2024075379A1 (en) 2022-10-03 2024-04-11 フィブイントラロジスティクス株式会社 Automated guided vehicle traveling system

Also Published As

Publication number Publication date
JPWO2021111613A1 (en) 2021-12-09
TW202123157A (en) 2021-06-16

Similar Documents

Publication Publication Date Title
US9953461B2 (en) Navigation system applying augmented reality
US10636168B2 (en) Image processing apparatus, method, and program
WO2020037492A1 (en) Distance measuring method and device
JP5746477B2 (en) Model generation device, three-dimensional measurement device, control method thereof, and program
US7529387B2 (en) Placement information estimating method and information processing device
US20150235367A1 (en) Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image
US10930008B2 (en) Information processing apparatus, information processing method, and program for deriving a position orientation of an image pickup apparatus using features detected from an image
WO2013111229A1 (en) Camera calibration device, camera calibration method, and camera calibration program
KR102016636B1 (en) Calibration apparatus and method of camera and rader
WO2021111613A1 (en) Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
JP2016057108A (en) Arithmetic apparatus, arithmetic system, arithmetic method and program
JP2008070267A (en) Position and orientation measurement method and apparatus
WO2020195875A1 (en) Information processing device, information processing method, and program
CN114726978A (en) Information processing apparatus, information processing method, and program
CN113610702B (en) Picture construction method and device, electronic equipment and storage medium
Choi et al. Position-based augmented reality platform for aiding construction and inspection of offshore plants
JP2021047516A (en) Information processing device, coordinate conversion system, coordinate conversion method, and coordinate conversion program
Zhu et al. Wii remote–based low-cost motion capture for automated assembly simulation
Shmatko et al. Estimation of rotation measurement error of objects using computer simulation
Hasler et al. Implementation and first evaluation of an indoor mapping application using smartphones and frameworks
Li et al. A combined vision-inertial fusion approach for 6-DoF object pose estimation
Ji et al. Localization on a-priori information of plane extraction
Volkov et al. Stereo-based visual localization without triangulation for unmanned robotics platform
CN112750205A (en) Plane dynamic detection system and detection method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020519464

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19954857

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19954857

Country of ref document: EP

Kind code of ref document: A1