CN111009036B - Grid map correction method and device in synchronous positioning and map construction - Google Patents
- Publication number
- CN111009036B (application CN201911256857.9A)
- Authority
- CN
- China
- Prior art keywords
- grid map
- cad
- map
- mapping
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/38—Registration of image sequences
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a grid map correction method and device for synchronous positioning and map construction. The method comprises the following steps: when it is judged that the current scene at least partly comprises an artificial environment, starting correction and obtaining a grid map of the current scene and a CAD (computer aided design) drawing of the current scene; aligning the grid map with the CAD drawing by a feature-point-based image matching method; identifying connected domains in the grid map and determining, according to the mapping relation, the mapping domains corresponding to the connected domains in the aligned CAD drawing; extracting high-level features indicating certain geometric shapes from the CAD drawing according to the mapping domains; and mapping the extracted high-level features into the grid map and correcting the grid map. This correction scheme is more accurate and efficient than manual correction, and thereby improves the accuracy of the acquired grid map and the reliability of positioning and navigation for robots, unmanned vehicles and the like.
Description
Technical Field
The invention relates to the technical field of synchronous positioning and map construction, in particular to a grid map correction method, device, electronic equipment and readable storage medium in synchronous positioning and map construction.
Background
SLAM (simultaneous localization and mapping, also rendered as synchronous positioning and map construction) refers to the process of simultaneously estimating the position of a moving body and constructing a map of the environment from sensor information; it addresses the localization and mapping problems faced by robots, unmanned vehicles and the like when moving in an unknown environment. Because sensor types and installation modes differ, SLAM implementations differ in approach and difficulty. Laser SLAM works on collected point clouds and, unlike VSLAM, does not need to recover three-dimensional depth information, so its demands on computing performance are far lower than those of VSLAM and its algorithms are simpler; for these reasons laser SLAM is widely applied in the field of unmanned aerial vehicle control.
However, owing to limitations of the laser radar's type and performance, the map constructed from the laser data differs in places from the actual scene: for example, certain areas cannot be captured because the laser radar's frequency is too low, and high laser radar noise can bend the walls in the constructed map. Such abnormalities seriously affect the robot's positioning and navigation functions. At present, the occupancy grid map constructed by laser SLAM is usually corrected manually; this corrects the map to some extent, but it depends heavily on human judgment, is prone to manual error, and is inefficient.
Disclosure of Invention
The present invention has been made in view of the above problems, and it is an object of the present invention to provide a method, an apparatus, an electronic device, and a readable storage medium for correcting a grid map in synchronous positioning and map construction, which overcome or at least partially solve the above problems.
According to one aspect of the present invention, there is provided a method for correcting a grid map in synchronous positioning and map construction, the method comprising:
when judging that the current scene at least comprises a part of artificial environment, starting correction, and obtaining a grid map of the current scene and a CAD (computer aided design) image of the current scene;
aligning the grid map with the CAD drawing based on a feature-point image matching method;
identifying connected domains in the grid map, and determining a mapping domain corresponding to the connected domains in the aligned CAD graph according to the mapping relation;
extracting high-level features indicating a certain geometric shape from the CAD graph according to the mapping domain;
the extracted high-level features are mapped to the grid map, and the grid map is corrected.
Optionally, aligning the grid map with the CAD drawing based on the feature-point image matching method includes:
extracting ORB feature points from the CAD drawing and the grid map respectively, which includes extracting key pixel points and computing their descriptors;
determining matched feature point pairs through the descriptors;
determining the scale and direction change values of the CAD drawing by least squares fitting according to the feature point pairs;
and aligning the CAD drawing with the grid map according to the scale and direction change values.
Optionally, determining the matched feature point pairs through the descriptors includes:
traversing the key pixel points in the grid map and the CAD drawing, calculating the Euclidean distance between key pixel point descriptors of the grid map and the CAD drawing by a bitwise AND operation, and determining the key pixel point pair with the smallest Euclidean distance value as a matched feature point pair.
Optionally, determining the scale and direction change values of the CAD drawing from the feature point pairs by least squares fitting includes:
acquiring the feature point pairs;
connecting the two corresponding feature points in the CAD drawing and in the grid map respectively to determine connecting lines;
calculating the ratio of each pair of corresponding connecting lines;
and fitting the ratios by the least squares method to determine the scale change value and direction change value suitable for adjusting the CAD drawing.
Optionally, mapping the extracted high-level features in the grid map includes:
and mapping the high-level features in the CAD graph into the grid map according to the corresponding position and direction and the coordinate transformation relation.
Optionally, modifying the grid map includes:
determining a mathematical expression of the high-level feature according to the pixel information of the feature points in the high-level feature and a linear regression method;
and adjusting the scale and the direction of the high-level features according to the mathematical expression to obtain the optimized high-level features.
Optionally, correcting the grid map further includes:
adding pixel points in a preset neighborhood of the high-level features to realize rendering of the high-level features;
and fusing each rendered high-level feature with the feature adjacent to the high-level feature, so as to update the grid map.
According to another aspect of the present invention, there is provided a correction device for grid map in synchronous positioning and map construction, the device comprising:
the image acquisition unit is suitable for starting correction when judging that the current scene at least comprises a part of artificial environment, and acquiring a grid map of the current scene and a Computer Aided Design (CAD) image of the current scene;
an image alignment unit adapted to align the grid map with the CAD image based on an image matching method of the feature points;
the mapping unit is suitable for identifying the connected domain in the grid map, and determining the mapping domain corresponding to the connected domain in the aligned CAD graph according to the mapping relation;
the feature extraction unit is suitable for extracting high-level features with certain geometric shapes in the CAD graph according to the mapping domain;
and the map correction unit is suitable for mapping the extracted high-level features into the grid map and correcting the grid map.
Optionally, the image alignment unit is adapted to:
extract ORB feature points from the CAD drawing and the grid map respectively, which includes extracting key pixel points and computing their descriptors;
determine matched feature point pairs through the descriptors;
determine the scale and direction change values of the CAD drawing by least squares fitting according to the feature point pairs;
and align the CAD drawing with the grid map according to the scale and direction change values.
Optionally, the high-level features include any one or more of the following: straight lines, broken lines, polygons, curves, ellipses, circles.
According to still another aspect of the present invention, there is provided an electronic apparatus including: a processor; and a memory arranged to store computer executable instructions that, when executed, cause the processor to perform a method as any of the above.
According to a further aspect of the present invention there is provided a computer readable storage medium storing one or more programs which when executed by a processor implement a method as any of the above.
From the above, the correction scheme disclosed by the invention is more accurate and efficient than manual correction, thereby improving the accuracy of the acquired grid map and the reliability of positioning and navigation for robots, unmanned vehicles and the like.
The foregoing is merely an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented in accordance with the contents of the specification, and in order to make the above and other objects, features and advantages of the present invention more readily apparent, specific embodiments of the invention are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 shows a flow diagram of a method of correcting a grid map according to one embodiment of the invention;
fig. 2 is a schematic diagram showing the structure of a correction device of a grid map according to an embodiment of the present invention;
FIG. 3 shows a schematic diagram of an electronic device according to one embodiment of the invention;
FIG. 4 illustrates a schematic diagram of a computer-readable storage medium according to one embodiment of the invention;
FIG. 5 shows a schematic diagram of classification of application scenarios faced by laser mapping according to one embodiment of the invention;
FIG. 6 illustrates a flow diagram of CAD graph and grid map alignment according to one embodiment of the present invention;
FIG. 7 illustrates an exemplary diagram of grid map and CAD graph feature point correspondence, according to one embodiment of the present invention;
FIG. 8 shows a flow diagram of connected domain analysis and mapping according to one embodiment of the invention;
FIG. 9 illustrates an exemplary diagram of a flow for mapping high-level features according to one embodiment of the invention;
FIG. 10 illustrates an example diagram of a high-level feature map according to one embodiment of the invention;
FIG. 11 illustrates a flow diagram of high-level feature set linear fitting and fusion in a grid map according to one embodiment of the invention;
FIG. 12 illustrates an example diagram of fitting and rendering high-level features according to one embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
FIG. 1 shows a flow diagram of a method of correcting a grid map according to one embodiment of the invention; the method comprises the following steps:
step S110, when judging that the current scene at least comprises a part of artificial environment, starting correction, and obtaining a grid map of the current scene and a CAD (computer aided design) drawing of the current scene.
One of the maps constructed by laser SLAM is the occupancy grid map; such a map is built by continuously updating the occupancy state of each grid cell. As shown on the left side of fig. 7, gray denotes an unknown area, white a non-obstacle area and black an obstacle area, a darker color representing a greater probability of being an obstacle. In the prior art the map is not built with the aid of higher-level features, so it is strongly affected by sensor noise.
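As an illustrative sketch only (not part of the original disclosure), the Python snippet below shows one way such an occupancy grid could be thresholded into the three categories just described; the probability encoding and the threshold values are assumptions.

```python
# Illustrative sketch, not from the patent: thresholding an occupancy-probability
# grid into the three display categories described above. The encoding
# (probabilities in [0, 1]) and the two thresholds are assumptions.
import numpy as np

def classify_occupancy(prob_grid: np.ndarray,
                       free_thresh: float = 0.25,
                       occ_thresh: float = 0.65) -> np.ndarray:
    """Return an int8 grid: 0 = free (white), 1 = unknown (gray), 2 = obstacle (black)."""
    labels = np.full(prob_grid.shape, 1, dtype=np.int8)  # default: unknown
    labels[prob_grid < free_thresh] = 0                   # low occupancy probability -> free
    labels[prob_grid > occ_thresh] = 2                    # high occupancy probability -> obstacle
    return labels

# Example: a small patch of occupancy probabilities
patch = np.array([[0.10, 0.50, 0.90],
                  [0.20, 0.50, 0.80],
                  [0.10, 0.50, 0.70]])
print(classify_occupancy(patch))
```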
Referring to fig. 5, the environments faced by SLAM can be roughly classified, according to their geometric characteristics, into natural environments and artificial environments. In natural environments the geometry is dominated by random features, whereas in artificial environments the geometric structures, being designed by people, usually show obvious regularity: they contain a large number of geometric elements such as straight lines, curves and circular arcs, and in particular a laser grid map built indoors contains many regular geometric figures. On the other hand, many scene CAD drawings share these features. Therefore, the invention acquires the grid map built from laser sensor data together with the corresponding CAD drawing, and corrects the grid map with the help of the CAD drawing in indoor scenes; the aim is to optimize the artificial-environment part of the grid map by reference to the CAD drawing.
The scheme of this embodiment is applicable when the scene is entirely an artificial environment, such as mapping an indoor scene, and also when part of the scene is an artificial environment and part is a natural environment; in the latter case the scene can first be detected and the artificial-environment part identified.
Step S120, the grid map is aligned with the CAD map based on the image matching method of the feature points.
The CAD drawing of the indoor environment and the grid map constructed by laser SLAM may be inconsistent in scale and direction, so they need to be aligned by an image matching method based on pixel-level feature points. The purpose of alignment is to make the scale and direction of the CAD drawing consistent with those of the grid map.
The specific alignment steps are shown in fig. 6 and include obtaining feature points in the CAD drawing and the grid map respectively and matching them so that their scales and directions are the same or correspond, thereby achieving scale and direction alignment and obtaining the aligned CAD drawing.
And step S130, identifying the connected domain in the grid map, and determining the mapping domain corresponding to the connected domain in the aligned CAD graph according to the mapping relation.
Connected-domain analysis performs connected-component analysis on the grid map to obtain the region the robot can reach, such as the white non-obstacle region in the left part of fig. 7; this region is the connected domain.
Fig. 8 shows the process of obtaining the mapping domain (the corresponding connected domain) in the CAD drawing from the connected domain in the grid map. Connected-domain mapping relies on the fact that, after graphic alignment, the CAD drawing and the grid map already have the same or corresponding size and direction; the result of the mapping is that the connected domain of the grid map is mapped to a mapping domain in the CAD drawing.
Step S140, extracting high-level features indicating a certain geometric shape from the CAD graph according to the mapping domain.
There are a large number of high-level features in indoor scenes, such as straight lines, curves and circles. The embodiment of the invention extracts and maps high-level features from the mapped CAD drawing so that the grid map can subsequently be corrected.
And step S150, the extracted high-level features are mapped in the grid map, and the grid map is corrected.
Mapping the high-level features onto the grid map comprises determining the corresponding high-level features in the grid map and then fitting, correcting, rendering and fusing them, thereby updating the grid map.
In summary, the grid map correction scheme disclosed in this embodiment is more accurate and efficient than manual correction, thereby ensuring the accuracy of synchronous positioning and map construction and improving the reliability of robot positioning and navigation.
In one embodiment, referring to FIG. 6, specific steps for aligning a grid map with a CAD graph are shown.
First, feature points are extracted from the CAD drawing and the grid map. For feature point matching, this embodiment takes ORB (Oriented FAST and Rotated BRIEF) feature points and descriptors as an example.
Extracting ORB feature points from the CAD drawing and the grid map respectively includes extracting key pixel points and computing their descriptors.
In fig. 7 the left side is the grid map and the right side is the CAD drawing, and the connecting lines indicate two pairs of corresponding feature points; feature points are extracted from the CAD drawing and the grid map respectively. Extracting an ORB feature includes extracting a key point and computing its descriptor; for example, the descriptor may be a 256-bit binary value consisting of 0s and 1s.
Feature point matching is performed using the ORB feature point descriptors: the Euclidean distance between descriptors is calculated, for example by a bitwise AND operation. The smaller the resulting value, the closer the two key points are, so the similarity of feature points can be measured; the closer the computed distance, the higher the similarity, and the pair of feature points with the smallest distance value is taken as a matched feature point pair.
In implementation, all key pixel points in the CAD drawing and the grid map may be traversed. Once n feature point pairs (more than a certain threshold) have been found, the scale and direction changes can be obtained by the least squares method.
The scale and direction change values of the CAD drawing are then determined by least squares fitting from the feature point pairs. For example, for the two pairs of feature points marked in fig. 7, let the points in the left grid map be A(Xa, Ya) and B(Xb, Yb) and the corresponding points in the right CAD drawing be A'(Xa', Ya') and B'(Xb', Yb'); the ratio a1 = AB / A'B' is computed. If a matched feature point C is added in the same way, ratios a2 and a3 can be obtained from feature point A and feature point C, and from feature point B and feature point C. Fitting a1, a2, a3 and so on by the least squares method yields the optimal scale change value and direction change value suitable for the whole image. Alternatively, a subset of the feature points may be used to obtain optimal scale and direction change values for part of the image.
The CAD drawing is then adjusted, according to the scale change value and the direction change value, into an image consistent with the grid map, giving the aligned CAD drawing.
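As a non-authoritative sketch of this alignment step, under the assumptions that OpenCV is available and both inputs are grayscale images, the code below extracts ORB key points and descriptors, matches them with a brute-force Hamming matcher, estimates the scale and direction change from the ratios and angle differences of corresponding point pairs by a simple least-squares fit, and warps the CAD drawing accordingly. The parameter values, the Hamming metric and the simplifications (translation and image-coordinate sign conventions are not handled) are assumptions, not taken from the patent.

```python
# Illustrative sketch of the alignment step (ORB matching + least-squares
# scale/direction fit). Library calls, parameters and simplifications are assumptions.
import cv2
import numpy as np
from itertools import combinations

def align_cad_to_grid(cad_img: np.ndarray, grid_img: np.ndarray) -> np.ndarray:
    orb = cv2.ORB_create(nfeatures=1000)
    kp_cad, des_cad = orb.detectAndCompute(cad_img, None)
    kp_grid, des_grid = orb.detectAndCompute(grid_img, None)

    # ORB descriptors are 256-bit binary strings compared bitwise; the Hamming
    # norm used here is a common choice, assumed rather than quoted from the patent.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_cad, des_grid), key=lambda m: m.distance)[:50]

    cad_pts = np.float32([kp_cad[m.queryIdx].pt for m in matches])
    grid_pts = np.float32([kp_grid[m.trainIdx].pt for m in matches])

    # Scale: ratios |AB| / |A'B'| over pairs of matched points; for a constant
    # model the least-squares estimate is the mean ratio. Direction: mean angle
    # difference of the corresponding segments.
    ratios, angle_diffs = [], []
    for i, j in combinations(range(len(matches)), 2):
        v_grid = grid_pts[j] - grid_pts[i]
        v_cad = cad_pts[j] - cad_pts[i]
        if np.linalg.norm(v_cad) < 1e-6:
            continue
        ratios.append(np.linalg.norm(v_grid) / np.linalg.norm(v_cad))
        angle_diffs.append(np.arctan2(v_grid[1], v_grid[0]) - np.arctan2(v_cad[1], v_cad[0]))
    scale = float(np.mean(ratios))
    angle = float(np.arctan2(np.mean(np.sin(angle_diffs)), np.mean(np.cos(angle_diffs))))

    # Apply the scale and direction change so the CAD drawing matches the grid map
    # (translation is ignored in this sketch).
    h, w = grid_img.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), np.degrees(angle), scale)
    return cv2.warpAffine(cad_img, M, (w, h), flags=cv2.INTER_NEAREST, borderValue=255)
```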
In one embodiment, mapping the extracted high-level features in the grid map includes: and mapping the high-level features in the CAD graph into the grid map according to the corresponding position and direction and the coordinate transformation relation.
The robot positions itself and navigates according to the grid map, within whose connected domain it can move freely, so map correction is mainly performed on the connected domain. It is therefore necessary to extract the connected domain and map it into the CAD drawing. The specific process is shown in fig. 8: the connected domain of the grid map is analyzed, and the mapping domain corresponding to that connected domain is then determined in the aligned CAD drawing, thereby determining the mapping domain in the mapped CAD drawing.
After the mapping domain of the CAD drawing is obtained, its boundary is extracted; such a boundary is generally composed of high-level features in the shape of straight lines or curves.
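A minimal sketch of the connected-domain analysis, mapping and boundary extraction is given below, assuming OpenCV and the label encoding from the earlier sketch (0 = free, 1 = unknown, 2 = obstacle); cv2.connectedComponents and cv2.findContours are one plausible implementation choice, not the patent's prescribed routines.

```python
# Illustrative sketch: connected-domain analysis on the grid map and mapping
# into the aligned CAD drawing. Label encoding and routines are assumptions.
import cv2
import numpy as np

def reachable_region_and_boundary(grid_labels: np.ndarray, aligned_cad: np.ndarray):
    # Free (non-obstacle) cells of the grid map.
    free_mask = (grid_labels == 0).astype(np.uint8)

    # Connected-domain analysis: keep the largest free region as the area
    # the robot can actually reach.
    num, comp = cv2.connectedComponents(free_mask)
    sizes = [(comp == k).sum() for k in range(1, num)]
    reachable = (comp == (1 + int(np.argmax(sizes)))).astype(np.uint8)

    # After alignment the CAD drawing and the grid map share a coordinate frame,
    # so the same mask picks out the mapping domain in the CAD drawing.
    cad_mapping_domain = cv2.bitwise_and(aligned_cad, aligned_cad, mask=reachable)

    # Boundary of the mapping domain, typically composed of line/curve features.
    contours, _ = cv2.findContours(reachable, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return cad_mapping_domain, contours
```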
Fig. 9 illustrates the process of mapping the corresponding features into the grid map according to the line features of the CAD drawing: high-level features are extracted according to the CAD mapping domain, the extracted high-level features are classified, and the high-level features are then mapped into the grid map, yielding the high-level feature set in the grid map.
Feature classification mainly divides the high-level features extracted from the CAD drawing into linear features, circular features and the like. Feature mapping maps the positions of the high-level features extracted from the CAD drawing into the grid map, generally after a transformation based on the coordinate relationship between the two drawings; concretely, a high-level feature in the CAD drawing can be drawn at the corresponding position of the grid map. Fig. 10 shows an example of mapping a straight-line high-level feature from the CAD drawing into the grid map; the straight line (indicated by the arrow in the figure) is enlarged for convenience of illustration.
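The feature extraction, classification and mapping just described might be sketched as follows; the use of Canny edges with Hough transforms to obtain line and circle features, and every parameter value, are assumptions made purely for illustration.

```python
# Illustrative sketch: extract line/circle high-level features from the CAD
# mapping domain and draw them into the (already aligned) grid map.
import cv2
import numpy as np

def extract_and_map_features(cad_mapping_domain: np.ndarray, grid_img: np.ndarray):
    edges = cv2.Canny(cad_mapping_domain, 50, 150)

    # Feature classification: straight-line features ...
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                            minLineLength=20, maxLineGap=5)
    # ... and circular features.
    circles = cv2.HoughCircles(cad_mapping_domain, cv2.HOUGH_GRADIENT,
                               dp=1, minDist=20, param1=120, param2=40)

    # Feature mapping: the drawings share a coordinate frame after alignment,
    # so the features can be drawn at the same positions (obstacles as black).
    mapped = grid_img.copy()
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(mapped, (int(x1), int(y1)), (int(x2), int(y2)), 0, 1)
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            cv2.circle(mapped, (int(x), int(y)), int(r), 0, 1)
    return mapped, lines, circles
```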
In one embodiment, step S150 includes: determining a mathematical expression of the high-level feature according to the pixel information of the feature points in the high-level feature and a linear regression method; and adjusting the scale and the direction of the high-level features according to the mathematical expression to obtain the optimized high-level features.
Correcting the grid map further includes: adding pixel points in a preset neighborhood of the high-level features to realize rendering of the high-level features; and fusing each rendered high-level feature with the feature adjacent to the high-level feature, so as to update the grid map.
Fig. 11 shows the flow of correcting and optimizing the grid map, which specifically includes performing linear regression fitting on the linear features of the high-level feature set in the grid map and then fusing the fitted features with the other features to obtain the updated map. That is, two kinds of processing are applied to the high-level features in the grid map: first, linear regression fitting; second, rendering and fusing the fitted features.
Linear regression fitting means that, for each high-level feature, a mathematical function is obtained by linear regression from the feature points of that feature and several feature points in its neighborhood, and the scale and direction of the feature are then adjusted a second time according to the expression of that function, thereby correcting errors or deviations in the high-level feature.
Determining the mathematical expression of a high-level feature by linear regression thus specifically includes fitting the feature to a mathematical expression. Taking the leftmost vertical straight line in fig. 10 as an example, suppose it is expressed as x = 5 in the abscissa direction (with 5 < y < 9 in the ordinate direction); the mathematical expression is then converted back into pixel information at the corresponding positions. Errors or deviations in the linear feature itself are thereby avoided.
Rendering and fusing the linear features specifically comprises: adding suitable pixel points around the linear feature according to a preset threshold to widen it, and fusing the feature with nearby features so that it meets the requirements of the grid map, thereby correcting and updating the grid map.
Referring to fig. 12, the left image is the original grid map, the middle image shows the high-level features extracted from the CAD drawing, and the right image shows the high-level features after rendering.
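A sketch of the fitting, rendering and fusion of a single line feature is given below, assuming NumPy/OpenCV; the white-background/black-obstacle convention, the neighborhood width and the fusion rule (keeping the darker, more obstacle-like pixel) are illustrative assumptions.

```python
# Illustrative sketch: linear-regression fit of one line feature, rendering it
# with a preset neighborhood width, and fusing it into the grid map.
import cv2
import numpy as np

def fit_render_fuse(grid_img: np.ndarray, feature_pixels: np.ndarray,
                    neighborhood: int = 3) -> np.ndarray:
    """feature_pixels: (N, 2) array of (x, y) pixels belonging to one line feature."""
    xs, ys = feature_pixels[:, 0], feature_pixels[:, 1]

    layer = np.full_like(grid_img, 255)  # blank layer: white = free
    if np.ptp(xs) >= np.ptp(ys):
        # Fit y = a*x + b by linear regression (least squares) and redraw the line.
        a, b = np.polyfit(xs, ys, 1)
        x0, x1 = int(xs.min()), int(xs.max())
        cv2.line(layer, (x0, int(a * x0 + b)), (x1, int(a * x1 + b)), 0, 1)
    else:
        # Near-vertical features (e.g. x = 5 in the text above) are fitted as x = a*y + b.
        a, b = np.polyfit(ys, xs, 1)
        y0, y1 = int(ys.min()), int(ys.max())
        cv2.line(layer, (int(a * y0 + b), y0), (int(a * y1 + b), y1), 0, 1)

    # Rendering: widen the fitted line by adding pixels in a preset neighborhood
    # (eroding a white image thickens the black line).
    kernel = np.ones((neighborhood, neighborhood), np.uint8)
    layer = cv2.erode(layer, kernel)

    # Fusion: merge the rendered feature with the existing grid map,
    # keeping the darker (more obstacle-like) value at each pixel.
    return np.minimum(grid_img, layer)
```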
Fig. 2 is a schematic diagram showing the structure of a correction device of a grid map according to an embodiment of the present invention; the device comprises:
the image obtaining unit 210 is adapted to initiate correction when it is determined that the current scene at least includes a part of the artificial environment, and obtain the grid map of the current scene and the CAD drawing of the current scene.
The embodiment of the invention acquires the grid map formed from laser sensor data and the corresponding CAD drawing, and corrects the grid map with the help of the CAD drawing in indoor scenes; the aim is to optimize the artificial-environment part of the grid map by reference to the CAD drawing.
The image alignment unit 220 is adapted to align the grid map with the CAD image based on the image matching method of the feature points.
The CAD drawing of the indoor environment and the grid map constructed by the laser SLAM may have inconsistent dimensions and directions, and at this time, alignment needs to be performed by a method of image matching based on pixel-level feature points.
The mapping unit 230 is adapted to identify connected domains in the grid map, and determine, according to the mapping relationship, a mapping domain corresponding to the connected domain in the aligned CAD drawing.
Connected-domain analysis analyzes the connected domains of the grid map to determine the area reachable by the robot. For connected-domain mapping, after graphic alignment the CAD drawing and the grid map already have corresponding sizes and directions, so the connected domain can be mapped to the corresponding region in the CAD drawing.
The feature extraction unit 240 is adapted to extract high-level features in the CAD drawing indicative of a certain geometry from the mapping domain.
There are a large number of high-level features in indoor scenes, such as straight lines, curves and circles. The invention extracts and maps high-level features from the mapped CAD drawing so that the grid map can subsequently be corrected.
The map correction unit 250 is adapted to map the extracted high-level features on the grid map and correct the grid map.
The high-level features in the grid map are fitted and corrected: first, linear regression is performed on the geometry corresponding to the feature set to obtain a geometric figure function. Because the geometric shapes in the grid map have a certain width, the corresponding width then needs to be obtained by rendering and similar operations in order to update the grid map.
In one embodiment, the image alignment unit 220 is adapted to: extract ORB feature points from the CAD drawing and the grid map respectively, which includes extracting key pixel points and computing their descriptors; determine matched feature point pairs through the descriptors; determine the scale and direction change values of the CAD drawing by least squares fitting according to the feature point pairs; and align the CAD drawing with the grid map according to the scale and direction change values.
In one embodiment, the high-level features include any one or more of the following: straight lines, broken lines, polygons, curves, ellipses, circles.
In one embodiment, the map correction unit 250 is adapted to: map the high-level features in the CAD drawing into the grid map according to the corresponding position and direction and the coordinate transformation relation.
In one embodiment, the map correction unit 250 is adapted to: determine a mathematical expression of a high-level feature according to the pixel information of the feature points in the high-level feature and a linear regression method; and adjust the scale and direction of the high-level feature according to the mathematical expression to obtain the optimized high-level feature.
Correcting the grid map further includes: adding pixel points in a preset neighborhood of the high-level features to realize rendering of the high-level features; and fusing each rendered high-level feature with its adjacent features, thereby updating the grid map.
In summary, the grid map correction technical scheme disclosed by the invention comprises: when it is judged that the current scene at least partly comprises an artificial environment, starting correction and obtaining a grid map of the current scene and a CAD (computer aided design) drawing of the current scene; aligning the grid map with the CAD drawing by a feature-point-based image matching method; identifying connected domains in the grid map and determining, according to the mapping relation, the mapping domains corresponding to the connected domains in the aligned CAD drawing; extracting high-level features indicating certain geometric shapes from the CAD drawing according to the mapping domains; and mapping the extracted high-level features into the grid map and correcting the grid map. This correction scheme is more accurate and efficient than manual correction, thereby improving the accuracy of the acquired grid map and the reliability of positioning and navigation for robots, unmanned vehicles and the like.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may also be used with the teachings herein. The required structure for the construction of such devices is apparent from the description above. In addition, the present invention is not directed to any particular programming language. It will be appreciated that the teachings of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided for disclosure of enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in the correction device for a grid map according to an embodiment of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
For example, fig. 3 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 300 comprises a processor 310 and a memory 320 arranged to store computer executable instructions (computer readable program code). The memory 320 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. The memory 320 has a memory space 330 storing computer readable program code 331 for performing any of the method steps described above. For example, the memory space 330 for storing computer readable program code may include respective computer readable program code 331 for implementing the respective steps in the above method, respectively. The computer readable program code 331 can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card or a floppy disk. Such a computer program product is typically a computer readable storage medium, such as in fig. 4. Fig. 4 illustrates a schematic structure of a computer-readable storage medium according to an embodiment of the present invention. The computer readable storage medium 400 stores computer readable program code 331 for performing the steps of the method according to the invention, which may be read by the processor 310 of the electronic device 300, which computer readable program code 331, when executed by the electronic device 300, causes the electronic device 300 to perform the steps of the method described above, in particular the computer readable program code 331 stored by the computer readable storage medium may perform the method shown in any of the embodiments described above. The computer readable program code 331 may be compressed in a suitable form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
Claims (10)
1. The method for correcting the grid map in synchronous positioning and map construction is characterized by comprising the following steps of:
when judging that the current scene at least comprises a part of artificial environment, starting correction, and obtaining a grid map of the current scene and a CAD (computer aided design) image of the current scene;
aligning the grid map with the CAD graph based on an image matching method of the feature points;
identifying connected domains in the grid map, and determining a mapping domain corresponding to the connected domains in the aligned CAD graph according to the mapping relation;
extracting high-level features indicating a certain geometric shape from the CAD graph according to the mapping domain;
and mapping the extracted high-level features in the grid map, determining a corresponding high-level feature set in the grid map, performing linear regression fit on the high-level feature set in the grid map, and correcting the grid map.
2. The method of claim 1, wherein the feature-point-based image matching method aligns the grid map with a CAD drawing comprises:
respectively extracting ORB feature points from the CAD graph and the grid map, wherein the ORB feature points comprise descriptors for extracting key pixel points and calculating the key pixel points;
determining matched characteristic point pairs through the descriptors;
determining the change value of the scale and the direction of the CAD graph through least square fitting according to the characteristic point pairs;
and aligning the CAD graph with the grid map according to the change value of the scale and the direction.
3. The method of claim 2, wherein said determining matched pairs of feature points by said descriptors comprises:
traversing the grid map and the key pixel points in the CAD graph, calculating Euclidean distances of the key pixel point descriptors in the grid map and the CAD graph according to bitwise AND operation, and determining the key pixel point pair with the minimum Euclidean distance value as a matched characteristic point pair.
4. The method of claim 2, wherein the determining the change values of the scale and direction of the CAD drawing from the feature point pairs by least squares fitting comprises:
acquiring the characteristic point pairs;
connecting two corresponding characteristic points in the CAD graph and the grid map respectively to determine connecting lines;
calculating the ratio of the corresponding connecting lines;
and fitting each ratio by a least square method to determine a scale change value and a direction change value which are suitable for the adjustment of the CAD graph.
5. The method of claim 1, wherein the mapping the extracted high-level features in the grid map comprises:
and mapping the high-level features in the CAD graph into the grid map according to the corresponding position and direction and the coordinate transformation relation.
6. The method of claim 1, wherein the modifying the grid map comprises:
determining a mathematical expression of the high-level feature according to the pixel information of the feature points in the high-level feature in the CAD graph and a linear regression method;
and adjusting the scale and the direction of the high-level features in the CAD graph according to the mathematical expression to obtain the optimized high-level features in the CAD graph.
7. The method of claim 1 or 6, wherein the modifying the grid map further comprises:
adding pixel points in a preset neighborhood of the high-level features in the CAD graph to realize rendering of the high-level features in the CAD graph;
and fusing each high-level feature in the rendered CAD graph with a feature adjacent to the high-level feature, so as to update the grid map.
8. A device for correcting a grid map in synchronous positioning and map construction, the device comprising:
the image acquisition unit is suitable for starting correction when judging that the current scene at least comprises a part of artificial environment, and acquiring a grid map of the current scene and a Computer Aided Design (CAD) image of the current scene;
an image alignment unit adapted to align the grid map with the CAD image based on an image matching method of the feature points;
the mapping unit is suitable for identifying the connected domain in the grid map, and determining the mapping domain corresponding to the connected domain in the aligned CAD graph according to the mapping relation;
the feature extraction unit is suitable for extracting high-level features which indicate a certain geometric shape in the CAD graph according to the mapping domain;
and the map correction unit is suitable for mapping the extracted high-level features in the grid map, determining a corresponding high-level feature set in the grid map, performing linear regression fit on the high-level feature set in the grid map, and correcting the grid map.
9. The apparatus of claim 8, wherein the image alignment unit is adapted to:
respectively extracting ORB feature points from the CAD graph and the grid map, wherein the ORB feature points comprise descriptors for extracting key pixel points and calculating the key pixel points;
determining matched characteristic point pairs through the descriptors;
determining the change value of the scale and the direction of the CAD graph through least square fitting according to the characteristic point pairs;
and aligning the CAD graph with the grid map according to the change value of the scale and the direction.
10. The apparatus of claim 8, wherein the high-level features comprise any one or more of: straight lines, broken lines, polygons, curves, ellipses, circles.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911256857.9A CN111009036B (en) | 2019-12-10 | 2019-12-10 | Grid map correction method and device in synchronous positioning and map construction |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911256857.9A CN111009036B (en) | 2019-12-10 | 2019-12-10 | Grid map correction method and device in synchronous positioning and map construction |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN111009036A CN111009036A (en) | 2020-04-14 |
| CN111009036B true CN111009036B (en) | 2023-11-21 |
Family
ID=70114435
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201911256857.9A Active CN111009036B (en) | 2019-12-10 | 2019-12-10 | Grid map correction method and device in synchronous positioning and map construction |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111009036B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113269767B (en) * | 2021-06-07 | 2023-07-18 | 中电科机器人有限公司 | Batch part feature detection method, system, medium and equipment based on machine vision |
| CN120495124B (en) * | 2025-07-16 | 2025-09-19 | 杭州宇树科技股份有限公司 | Feature enhanced map generation method and device for pipe gallery scene |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103369466A (en) * | 2013-07-10 | 2013-10-23 | 哈尔滨工业大学 | Map matching-assistant indoor positioning method |
| CN103717995A (en) * | 2011-08-29 | 2014-04-09 | 株式会社日立制作所 | Monitoring device, monitoring system and monitoring method |
| CN104755880A (en) * | 2012-10-30 | 2015-07-01 | 高通股份有限公司 | Processing and managing multiple maps at a location context identifier (lci) |
| CN107958118A (en) * | 2017-11-29 | 2018-04-24 | 元力云网络有限公司 | A kind of wireless signal acquiring method based on spatial relationship |
| WO2018140701A1 (en) * | 2017-01-27 | 2018-08-02 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
| CN109916397A (en) * | 2019-03-15 | 2019-06-21 | 斑马网络技术有限公司 | For tracking method, apparatus, electronic equipment and the storage medium of inspection track |
| WO2019122939A1 (en) * | 2017-12-21 | 2019-06-27 | University of Zagreb, Faculty of Electrical Engineering and Computing | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map |
| CN110188151A (en) * | 2019-05-17 | 2019-08-30 | 深圳来电科技有限公司 | A kind of method and electronic device generating indoor map based on CAD and GIS |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9721157B2 (en) * | 2006-08-04 | 2017-08-01 | Nokia Technologies Oy | Systems and methods for obtaining and using information from map images |
| US7668621B2 (en) * | 2006-07-05 | 2010-02-23 | The United States Of America As Represented By The United States Department Of Energy | Robotic guarded motion system and method |
| US10222215B2 (en) * | 2017-04-21 | 2019-03-05 | X Development Llc | Methods and systems for map generation and alignment |
- 2019-12-10 CN CN201911256857.9A patent/CN111009036B/en active Active
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103717995A (en) * | 2011-08-29 | 2014-04-09 | 株式会社日立制作所 | Monitoring device, monitoring system and monitoring method |
| CN104755880A (en) * | 2012-10-30 | 2015-07-01 | 高通股份有限公司 | Processing and managing multiple maps at a location context identifier (lci) |
| CN103369466A (en) * | 2013-07-10 | 2013-10-23 | 哈尔滨工业大学 | Map matching-assistant indoor positioning method |
| WO2018140701A1 (en) * | 2017-01-27 | 2018-08-02 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
| CN107958118A (en) * | 2017-11-29 | 2018-04-24 | 元力云网络有限公司 | A kind of wireless signal acquiring method based on spatial relationship |
| WO2019122939A1 (en) * | 2017-12-21 | 2019-06-27 | University of Zagreb, Faculty of Electrical Engineering and Computing | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map |
| CN109916397A (en) * | 2019-03-15 | 2019-06-21 | 斑马网络技术有限公司 | For tracking method, apparatus, electronic equipment and the storage medium of inspection track |
| CN110188151A (en) * | 2019-05-17 | 2019-08-30 | 深圳来电科技有限公司 | A kind of method and electronic device generating indoor map based on CAD and GIS |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111009036A (en) | 2020-04-14 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant | | |