
CN119359775A - Low-overlap bullet point cloud registration method based on line feature detection - Google Patents

Low-overlap bullet point cloud registration method based on line feature detection

Info

Publication number
CN119359775A
CN119359775A CN202411674257.5A CN202411674257A
Authority
CN
China
Prior art keywords
point cloud
source
target point
source point
registration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202411674257.5A
Other languages
Chinese (zh)
Other versions
CN119359775B (en)
Inventor
穆治亚
张琦雯
吕游
何丁龙
何昕
魏仲慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN202411674257.5A priority Critical patent/CN119359775B/en
Publication of CN119359775A publication Critical patent/CN119359775A/en
Application granted granted Critical
Publication of CN119359775B publication Critical patent/CN119359775B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/48 Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20061 Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract


The present invention relates to the field of point cloud registration, and in particular to a low-overlap bullet point cloud registration method based on line feature detection, comprising: S1, performing voxel downsampling on a source point cloud and a target point cloud; S2, performing central axis fitting on the voxel-downsampled source point cloud and target point cloud respectively, and registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud; S3, extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud using the Hough transform method; S4, finding the best rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud; S5, rotating the source point cloud by the best rotation angle to align it with the target point cloud, completing the precise registration of the source point cloud and the target point cloud. The present invention can achieve precise registration of bullet mark point cloud data even when the source point cloud and the target point cloud have a low overlap rate.

Description

Low-overlap bullet point cloud registration method based on line feature detection
Technical Field
The invention belongs to the technical field of point cloud registration, and particularly relates to a low-overlap bullet point cloud registration method based on line feature detection.
Background
Firearms are strictly controlled in China, yet the number of gun-related cases rises year by year. Bullet mark information is of great significance for investigating such cases, because the marks left on bullets and cartridge cases correspond strongly to the firearm that fired them. These marks are extremely fine, typically at the micron level, so traditional bullet mark examination relies on a technician operating microscope equipment and applying image processing techniques to observe and compare mark features. Such comparison lacks objective quantitative analysis, depends heavily on the technician's subjective judgment, and therefore yields results that are neither objective nor comprehensive enough. With the development and popularization of three-dimensional scanning technologies, bullet mark examination is gradually shifting from 2D to 3D measurement, comparison and identification: the bullet is scanned from different angles to obtain its surface morphology, and the resulting partial point clouds are registered into a common coordinate system, forming a complete three-dimensional point cloud model of the bullet.
For bullet point cloud data with a low overlap rate (the overlap rate between the source point cloud and the target point cloud is below 60%), the features that can be extracted are limited because the overlapping region contains few points, and registering the raw source point cloud directly tends to produce many outlier subsets, making the registration time-consuming and prone to mismatches. Since the quality of the three-dimensional point cloud model depends on the registration technique, low-overlap point cloud registration is a key difficulty in bullet mark examination.
Disclosure of Invention
In view of the above, the invention aims to provide a low-overlap bullet point cloud registration method based on line feature detection, so as to solve the problem of poor registration accuracy for bullet point cloud data with a low overlap rate.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
a low-overlap bullet point cloud registration method based on line feature detection comprises the following steps:
S1: performing voxel downsampling on a source point cloud and a target point cloud;
S2: performing central axis fitting on the voxel-downsampled source point cloud and target point cloud respectively, and registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud;
S3: extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud using a Hough transform method;
S4: searching for an optimal rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud;
S5: rotating the source point cloud by the optimal rotation angle to align it with the target point cloud, completing fine registration of the source point cloud and the target point cloud.
Further, in step S1, the specific process of voxel downsampling the source point cloud and the target point cloud includes:
constructing a three-dimensional voxel grid by a cyclic voxel filtering method, gridding the source point cloud and the target point cloud so that each cube after gridding is a voxel, and replacing all points in each voxel by their center of gravity, thereby completing the voxel downsampling.
Further, in step S2, the specific process of performing central axis fitting on the voxel-downsampled source point cloud and target point cloud includes:
S21: layering the source point cloud and the target point cloud along the Y axis (the vertical direction), then arbitrarily selecting four non-coplanar points p_1, p_2, p_3, p_4 in each layer and calculating the center coordinates O = (a, b, c) of their circumscribed sphere, the calculation equation being:
(x_j - a)^2 + (y_j - b)^2 + (z_j - c)^2 = r^2,  j = 1, 2, 3, 4;
wherein (x_j, y_j, z_j) are the coordinates of point p_j and r denotes the radius of the circumscribed sphere;
S22: fitting the sphere centers calculated for all layers into a straight line, and centering all point coordinates of the source point cloud and the target point cloud on the fitted line to form a new point set Q, such that each point of Q is q_i = p_i - p̄, where p_i denotes the coordinates of the i-th point and p̄ denotes the center point of the point cloud;
S23: computing the covariance matrix C = (1/n) Σ_{i=1}^{n} q_i q_i^T of the point set Q, where n is the number of points in Q, and finding from its eigenvalues and eigenvectors a direction vector v, the first principal component, along which the variance of the projections of Q is maximized; after finding v, computing the projection t_i = q_i · v of each centered point onto v, uniformly selecting a series of t values between the maximum and minimum projections, and evaluating the fitted straight line at these t values to obtain the point coordinates of the central axis.
Further, in step S2, the specific process of registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud includes:
S24: determining the direction vectors and the center points of the two central axes;
S25: calculating a rotation matrix from the direction vectors of the two central axes;
S26: rotating the central axis of the source point cloud, and obtaining a translation vector from the difference between the rotated center point of the source point cloud's central axis and the center point of the target point cloud's central axis;
S27: combining the rotation matrix and the translation vector into a transformation matrix to achieve coarse registration of the source point cloud and the target point cloud.
Further, in step S3, the specific process of extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud by the Hough transform method includes:
S31: for each point p_i of the voxel-downsampled source point cloud and target point cloud, computing a sharpness value from the eigenvalues of the covariance matrix of its k-neighborhood;
if the computed value is greater than or equal to a set threshold τ, identifying the point p_i as a sharp feature point and assigning it to the target point cloud or the source point cloud;
S32: discretizing the parameter space of the Hough transform into a regular icosahedron;
S33: mapping the voting result of each sharp feature point of the target point cloud or the source point cloud onto the faces of the regular icosahedron to form a voting matrix of the parameter space;
S34: performing linear bullet mark feature detection using a modified Hough transform algorithm.
Further, in step S34, the specific process of performing linear bullet mark feature detection using the modified Hough transform algorithm includes:
S341: representing a straight line in the voting matrix by the parameters (ρ, θ, φ), where ρ is the shortest distance from a point to the straight line and θ and φ are direction angles;
S342: establishing an accumulator, letting each point of the source point cloud and the target point cloud vote in the accumulator according to its normal vector or local point cloud features, and finding, by peak detection, the parameter combinations (ρ, θ, φ) whose vote counts exceed a set threshold, thereby obtaining the linear or curved bullet mark features.
Further, in the process of detecting the linear bullet mark features with the modified Hough transform algorithm, the estimation of the line parameters is optimized iteratively; in each iteration, a least-squares error function is used to fit the line and correct the current line parameters.
Further, in step S4, the specific process of finding the optimal rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud includes:
S41: rotating the source point cloud by a three-dimensional rotation matrix R(θ) to obtain a rotated source point cloud, where θ denotes the rotation angle;
S42: calculating a fitness function F(θ) from the target point cloud and the rotated source point cloud, wherein the fitness sums, over pairs of extracted line features, a Gaussian term whose exponent is the square of the difference of the absolute dot products of the normalized feature vectors belonging to the target point cloud and the source point cloud respectively, together with the feature descriptors of the corresponding point coordinates, the Gaussian parameters regulating the form of the function;
S43: repeating steps S41 and S42, and taking the rotation angle θ corresponding to the maximum fitness function F(θ) found within the iteration limit as the optimal rotation angle.
Compared with the prior art, the invention has the following beneficial effects:
the method filters the source point cloud and the target point cloud, achieves coarse registration by fitting and aligning their central axes, detects linear bullet mark features with a Hough transform method, finds an optimal rotation angle based on the detected features, and completes fine registration using that angle; on this basis, accurate registration of bullet mark point cloud data can be achieved when the overlap rate between the source point cloud and the target point cloud is only 20%-60%.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute an undue limitation on the invention. In the drawings:
FIG. 1 is a flow chart of the low-overlap bullet point cloud registration method based on line feature detection according to an embodiment of the invention;
FIG. 2 is a schematic diagram of the low-overlap bullet point cloud registration method based on line feature detection according to an embodiment of the invention;
FIG. 3 is a schematic diagram of finding the center of the circumscribed sphere according to an embodiment of the invention;
FIG. 4 is a schematic flow chart of finding the optimal rotation angle according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not to be construed as limiting the invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
In the description of the invention, it should be understood that the terms "center," "longitudinal," "transverse," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships that are based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the invention and simplify the description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be configured and operate in a particular orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", etc. may explicitly or implicitly include one or more such feature. In the description of the invention, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected, mechanically connected, electrically connected, directly connected, indirectly connected via an intervening medium, or in communication between two elements. The specific meaning of the above terms in the creation of the present invention can be understood by those of ordinary skill in the art in a specific case.
The invention will be described in detail below with reference to fig. 1-4 in connection with embodiments.
As shown in Fig. 1 and Fig. 2, the low-overlap bullet point cloud registration method based on line feature detection provided by an embodiment of the invention includes the following steps:
S1: performing voxel downsampling on the source point cloud and the target point cloud.
The point cloud data collected from the bullet surface is large and contains noise, which degrades data quality and greatly increases running time. To reduce the amount of data and filter out noise, a cyclic voxel filtering method is used to construct a three-dimensional voxel grid and to grid the source point cloud and the target point cloud, each cube after gridding being one voxel. The maximum and minimum values of the source and target point clouds along the X, Y and Z axes are calculated (a conventional three-dimensional point cloud coordinate system, in which the Y axis is the vertical axis and the X and Z axes are the two horizontal axes), a three-dimensional bounding box is built from these extremes, the bounding box is divided into cubes of the specified voxel size, and all points inside each cube are replaced by the cube's center of gravity, thereby completing the voxel downsampling of the source point cloud and the target point cloud.
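The gridding-and-centroid step above maps directly onto a few lines of array code. The following is a minimal numpy sketch of voxel downsampling, not the patent's implementation: the "cyclic" refinement of the filtering is omitted, and the function name and the voxel_size parameter are illustrative.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace all points falling into one voxel by their center of gravity."""
    min_bound = points.min(axis=0)                        # corner of the bounding box
    idx = np.floor((points - min_bound) / voxel_size).astype(np.int64)
    # Group points by voxel index; 'inverse' maps each point to its voxel.
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]                         # one centroid per occupied voxel
```

Calling voxel_downsample(cloud, 0.05) on an (N, 3) array returns one centroid per occupied voxel; the voxel size controls the trade-off between data reduction and preserved detail.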
A large number of isolated noise points occupies the computation and storage resources of the data processor; filtering these noise points from the bullet point cloud data greatly reduces the amount of computation, prevents data congestion, and improves the efficiency of subsequent point cloud processing.
S2: performing central axis fitting on the voxel-downsampled source point cloud and target point cloud respectively, and registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud.
As shown in Fig. 3, the specific process of performing central axis fitting on the voxel-downsampled source point cloud and target point cloud includes:
S21: layering the source point cloud and the target point cloud along the Y axis, then arbitrarily selecting four non-coplanar points p_1, p_2, p_3, p_4 in each layer and calculating the center coordinates O = (a, b, c) of their circumscribed sphere, the calculation equation being:
(x_j - a)^2 + (y_j - b)^2 + (z_j - c)^2 = r^2,  j = 1, 2, 3, 4;
wherein (x_j, y_j, z_j) are the coordinates of point p_j and r denotes the radius of the circumscribed sphere.
By constructing a linear system of equations from the sphere equations, the optimal sphere center coordinates in the least-squares sense are found by SVD.
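As a concrete illustration of this least-squares solve, the sketch below (an assumption about the implementation, not the patent's code) linearizes the sphere equations by subtracting the equation of the first point from the others and solves the resulting system with numpy's SVD-based lstsq.

```python
import numpy as np

def sphere_center(pts: np.ndarray) -> np.ndarray:
    """Center of the sphere through >= 4 non-coplanar points, in the least-squares sense."""
    p0 = pts[0]
    # |p_j - O|^2 = r^2 minus |p_0 - O|^2 = r^2 gives 2(p_j - p0) . O = |p_j|^2 - |p_0|^2.
    A = 2.0 * (pts[1:] - p0)
    b = np.sum(pts[1:] ** 2, axis=1) - np.sum(p0 ** 2)
    center, *_ = np.linalg.lstsq(A, b, rcond=None)        # SVD-based least squares
    return center
```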
S22: fitting the sphere centers calculated for all layers into a straight line, and centering all point coordinates of the source point cloud and the target point cloud on the fitted line to form a new point set Q, such that each point of Q is q_i = p_i - p̄, where p_i denotes the coordinates of the i-th point and p̄ denotes the center point of the point cloud.
Taking an end point of the fitted straight line as the origin (0, 0, 0), all point coordinates of the source point cloud and the target point cloud are centered; in effect, the distribution of all points is moved into a coordinate system centered at the origin (0, 0, 0), forming the new point set Q.
S23: computing the covariance matrix C = (1/n) Σ_{i=1}^{n} q_i q_i^T of the point set Q, where n is the number of points in Q (i.e. Q contains n point coordinates), and finding from its eigenvalues and eigenvectors a direction vector v, the first principal component, along which the variance of the projections of Q is maximized. After the direction v is found, the projection t_i = q_i · v of each centered point onto v is computed; a series of t values is uniformly selected between the maximum and minimum projections, and the fitted straight line is evaluated at these t values to obtain the point coordinates of the central axis.
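A short sketch of step S23, under the assumption that the fitted per-layer sphere centers (or the centered point set Q) are supplied as an (n, 3) array: the first principal component of the centered points gives the axis direction, and the axis is traced by sampling projections between their extremes. Names such as fit_axis and n_samples are illustrative.

```python
import numpy as np

def fit_axis(points: np.ndarray, n_samples: int = 50):
    """Fit a central axis to an (n, 3) array of points via the first principal component."""
    mean = points.mean(axis=0)
    q = points - mean                                     # centered point set Q
    cov = q.T @ q / len(q)                                # 3x3 covariance matrix C
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = eigvecs[:, np.argmax(eigvals)]                    # direction of maximum variance
    t = q @ v                                             # projections t_i = q_i . v
    ts = np.linspace(t.min(), t.max(), n_samples)         # uniform samples between extremes
    axis_points = mean + ts[:, None] * v                  # points on the fitted axis
    return v, mean, axis_points
```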
Finally, the central axis fitted from the source point cloud is registered with the central axis fitted from the target point cloud, giving a coarse, coaxial registration position of the low-overlap source and target point clouds. The specific process is as follows:
S24: determining the direction vectors and the center points of the two central axes.
S25: calculating a rotation matrix from the direction vectors of the two central axes.
S26: rotating the central axis of the source point cloud, and obtaining a translation vector from the difference between the rotated center point of the source point cloud's central axis and the center point of the target point cloud's central axis.
S27: combining the rotation matrix and the translation vector into a transformation matrix to achieve coarse registration of the source point cloud and the target point cloud.
S3: extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud using a Hough transform method.
The specific process of step S3 includes:
S31: for each point p_i of the voxel-downsampled source point cloud and target point cloud, a sharpness value is computed from the eigenvalues of the covariance matrix of its k-neighborhood.
If the computed value is greater than or equal to a set threshold τ, the point p_i is identified as a sharp feature point and assigned to the target point cloud or the source point cloud; if the computed value is smaller than the threshold τ, the point p_i is identified as a smooth feature point and is not assigned to the target point cloud or the source point cloud. In this way, the sharp features of the bullet mark point cloud are enhanced.
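Since the exact sharpness formula is not reproduced in this text, the sketch below uses the ratio of the smallest eigenvalue to the eigenvalue sum of the k-neighborhood covariance matrix, a common surface-variation measure, as an assumed stand-in; the threshold tau and the neighborhood size k are illustrative parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def sharp_feature_points(points: np.ndarray, k: int = 20, tau: float = 0.05) -> np.ndarray:
    """Keep points whose k-neighborhood surface variation reaches the threshold tau."""
    tree = cKDTree(points)
    _, nbrs = tree.query(points, k=k)                     # indices of k nearest neighbors
    keep = np.zeros(len(points), dtype=bool)
    for i, idx in enumerate(nbrs):
        q = points[idx] - points[idx].mean(axis=0)        # centered neighborhood
        eigvals = np.linalg.eigvalsh(q.T @ q / k)         # ascending eigenvalues
        sharpness = eigvals[0] / max(eigvals.sum(), 1e-12)
        keep[i] = sharpness >= tau
    return points[keep]
```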
S32: discretizing the parameter space of the Hough transform into a regular icosahedron.
This step makes it possible to detect linear bullet mark features effectively.
S33: mapping the voting result of each sharp feature point of the target point cloud or the source point cloud onto the faces of the regular icosahedron to form a voting matrix of the parameter space.
S34: performing linear bullet mark feature detection using a modified Hough transform algorithm.
In step S34, the specific process of performing linear bullet mark feature detection using the modified Hough transform algorithm includes:
S341: representing a straight line in the voting matrix by the parameters (ρ, θ, φ), where ρ is the shortest distance from a point to the straight line and θ and φ are direction angles.
S342: establishing an accumulator, letting each point of the source point cloud and the target point cloud vote in the accumulator according to its normal vector or local point cloud features, finding by peak detection the parameter combinations (ρ, θ, φ) whose vote counts exceed a set threshold to obtain the linear or curved bullet mark features, and mapping these features back into three-dimensional space to complete the detection of the linear bullet mark features.
In the process of detecting the linear bullet mark features with the modified Hough transform algorithm, the estimation of the line parameters is optimized iteratively; in each iteration, a least-squares error function is used to fit the line, correcting the current line parameters and improving the detection accuracy.
The fine bullet mark features on the bullet surface can thus be identified accurately with the Hough transform method.
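The following is a simplified, hedged sketch of the Hough-style 3D line detection of steps S32-S34, not the patent's algorithm: the 12 vertex directions of a regular icosahedron stand in for the 20 face directions used to discretize the parameter space, votes are accumulated over a 2D grid orthogonal to each candidate direction, and the winning line is refined by a least-squares fit to its inliers, mirroring the iterative refinement described above. The cell and inlier_dist parameters are illustrative.

```python
import numpy as np

def icosahedron_directions() -> np.ndarray:
    """12 unit vertex directions of a regular icosahedron, used as direction samples."""
    phi = (1.0 + np.sqrt(5.0)) / 2.0
    verts = np.array([[-1, phi, 0], [1, phi, 0], [-1, -phi, 0], [1, -phi, 0],
                      [0, -1, phi], [0, 1, phi], [0, -1, -phi], [0, 1, -phi],
                      [phi, 0, -1], [phi, 0, 1], [-phi, 0, -1], [-phi, 0, 1]], dtype=float)
    return verts / np.linalg.norm(verts, axis=1, keepdims=True)

def detect_line(points: np.ndarray, cell: float = 0.5, inlier_dist: float = 1.0):
    """Vote for the dominant 3D line over discretized directions, then refine it."""
    best = (0, None, None)                                # (votes, direction, point on line)
    for d in icosahedron_directions():
        u = np.cross(d, [1.0, 0.0, 0.0])                  # basis of the plane orthogonal to d
        if np.linalg.norm(u) < 1e-6:
            u = np.cross(d, [0.0, 1.0, 0.0])
        u /= np.linalg.norm(u)
        w = np.cross(d, u)
        uv = np.stack([points @ u, points @ w], axis=1)   # 2D footprint of every point
        bins = np.floor(uv / cell).astype(np.int64)
        cells, counts = np.unique(bins, axis=0, return_counts=True)
        j = counts.argmax()                               # peak of the vote accumulator
        if counts[j] > best[0]:
            anchor = (cells[j] + 0.5) * cell
            best = (counts[j], d, anchor[0] * u + anchor[1] * w)
    _, d, p0 = best
    # Least-squares refinement: re-fit the direction from the inliers of the winning line.
    rel = points - p0
    dist = np.linalg.norm(rel - np.outer(rel @ d, d), axis=1)
    inliers = points[dist < inlier_dist]
    center = inliers.mean(axis=0)
    _, _, vt = np.linalg.svd(inliers - center)
    return center, vt[0], inliers                         # point on line, refined direction, inliers
```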
S4: searching for the optimal rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud.
The specific process of step S4 includes:
S41: rotating the source point cloud by a three-dimensional rotation matrix R(θ) to obtain a rotated source point cloud, where θ denotes the rotation angle.
S42: calculating a fitness function F(θ) from the target point cloud and the rotated source point cloud.
In the fitness function, the absolute value of the dot product is taken between pairs of normalized line-feature vectors belonging to the target point cloud and the source point cloud respectively, together with the feature descriptors of the corresponding point coordinates in the two clouds; the remaining symbols are Gaussian-function parameters that regulate the form of the function. The exponent of the Gaussian term is the square of the difference of the vector dot products, so well matched vector pairs receive a certain "reward" while poorly matched pairs contribute almost nothing.
S43: repeating steps S41 and S42, and taking the rotation angle θ corresponding to the maximum fitness function F(θ) found within the iteration limit as the optimal rotation angle.
Following the flow shown in Fig. 4, the maximum fitness function F(θ) is found within the allowed number of iterations, and the corresponding angle θ is taken as the optimal rotation angle.
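The angle search of steps S41-S43 can be sketched as a brute-force sweep about the aligned central axis. Because the exact fitness function is not given in this text, the code below substitutes a simplified Gaussian reward on the absolute dot products of matched, normalized line-feature vectors; sigma, n_angles and the assumption of pre-matched feature pairs are all illustrative.

```python
import numpy as np

def rotation_about_axis(axis, theta: float) -> np.ndarray:
    """Rotation matrix for angle theta about a unit axis (Rodrigues' formula)."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def best_rotation_angle(src_feats, tgt_feats, axis, n_angles: int = 360, sigma: float = 0.1):
    """Sweep the rotation angle about 'axis' and keep the angle with the highest fitness."""
    src_feats = np.asarray(src_feats, float)              # (m, 3) matched, normalized vectors
    tgt_feats = np.asarray(tgt_feats, float)
    best_theta, best_fit = 0.0, -np.inf
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        R = rotation_about_axis(axis, theta)
        rotated = src_feats @ R.T                         # rotate the source feature vectors
        dots = np.abs(np.sum(rotated * tgt_feats, axis=1))   # |a_i . b_i| per matched pair
        fitness = np.exp(-((1.0 - dots) ** 2) / (2.0 * sigma ** 2)).sum()
        if fitness > best_fit:                            # keep the angle with maximum fitness
            best_theta, best_fit = theta, fitness
    return best_theta
```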
S5: rotating the source point cloud by the optimal rotation angle to align it with the target point cloud, completing the fine registration of the source point cloud and the target point cloud.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (8)

1. A low-overlap bullet point cloud registration method based on line feature detection, characterized by comprising the following steps:
S1: performing voxel downsampling on a source point cloud and a target point cloud;
S2: performing central axis fitting on the voxel-downsampled source point cloud and target point cloud respectively, and registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud;
S3: extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud using a Hough transform method;
S4: searching for an optimal rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud;
S5: rotating the source point cloud by the optimal rotation angle to align it with the target point cloud, completing fine registration of the source point cloud and the target point cloud.
2. The low-overlap bullet point cloud registration method based on line feature detection according to claim 1, wherein in step S1 the specific process of voxel downsampling the source point cloud and the target point cloud comprises:
constructing a three-dimensional voxel grid by a cyclic voxel filtering method, gridding the source point cloud and the target point cloud so that each cube after gridding is a voxel, and replacing all points in each voxel by their center of gravity, thereby completing the voxel downsampling.
3. The low-overlap bullet point cloud registration method based on line feature detection according to claim 1, wherein in step S2 the specific process of performing central axis fitting on the voxel-downsampled source point cloud and target point cloud respectively comprises:
S21: layering the source point cloud and the target point cloud along the Y axis (the vertical direction), then arbitrarily selecting four non-coplanar points p_1, p_2, p_3, p_4 in each layer and calculating the center coordinates O = (a, b, c) of their circumscribed sphere, the calculation equation being:
(x_j - a)^2 + (y_j - b)^2 + (z_j - c)^2 = r^2,  j = 1, 2, 3, 4;
wherein (x_j, y_j, z_j) are the coordinates of point p_j and r denotes the radius of the circumscribed sphere;
S22: fitting the sphere centers calculated for all layers into a straight line, and centering all point coordinates of the source point cloud and the target point cloud on the fitted line to form a new point set Q, such that each point of Q is q_i = p_i - p̄, where p_i denotes the coordinates of the i-th point and p̄ denotes the center point of the point cloud;
S23: computing the covariance matrix C = (1/n) Σ_{i=1}^{n} q_i q_i^T of the point set Q, where n is the number of points in Q, and finding from its eigenvalues and eigenvectors a direction vector v, the first principal component, along which the variance of the projections of Q is maximized; after finding v, computing the projection t_i = q_i · v of each centered point onto v, uniformly selecting a series of t values between the maximum and minimum projections, and evaluating the fitted straight line at these t values to obtain the point coordinates of the central axis.
4. The low-overlap bullet point cloud registration method based on line feature detection according to claim 1 or 3, wherein in step S2 the specific process of registering the two fitted central axes to achieve coarse registration of the source point cloud and the target point cloud comprises:
S24: determining the direction vectors and the center points of the two central axes;
S25: calculating a rotation matrix from the direction vectors of the two central axes;
S26: rotating the central axis of the source point cloud, and obtaining a translation vector from the difference between the rotated center point of the source point cloud's central axis and the center point of the target point cloud's central axis;
S27: combining the rotation matrix and the translation vector into a transformation matrix to achieve coarse registration of the source point cloud and the target point cloud.
5. The low-overlap bullet point cloud registration method based on line feature detection according to claim 1, wherein in step S3 the specific process of extracting linear bullet mark features from the coarsely registered source point cloud and target point cloud by the Hough transform method comprises:
S31: for each point p_i of the voxel-downsampled source point cloud and target point cloud, computing a sharpness value from the eigenvalues of the covariance matrix of its k-neighborhood;
if the computed value is greater than or equal to a set threshold τ, identifying the point p_i as a sharp feature point and assigning it to the target point cloud or the source point cloud;
S32: discretizing the parameter space of the Hough transform into a regular icosahedron;
S33: mapping the voting result of each sharp feature point of the target point cloud or the source point cloud onto the faces of the regular icosahedron to form a voting matrix of the parameter space;
S34: performing linear bullet mark feature detection using a modified Hough transform algorithm.
6. The low-overlap bullet point cloud registration method based on line feature detection according to claim 5, wherein in step S34 the specific process of performing linear bullet mark feature detection using the modified Hough transform algorithm comprises:
S341: representing a straight line in the voting matrix by the parameters (ρ, θ, φ), where ρ is the shortest distance from a point to the straight line and θ and φ are direction angles;
S342: establishing an accumulator, letting each point of the source point cloud and the target point cloud vote in the accumulator according to its normal vector or local point cloud features, and finding by peak detection the parameter combinations (ρ, θ, φ) whose vote counts exceed a set threshold, thereby obtaining the linear or curved bullet mark features.
7. The low-overlap bullet point cloud registration method based on line feature detection according to claim 5, wherein, in the process of detecting the linear bullet mark features with the modified Hough transform algorithm, the estimation of the line parameters is optimized iteratively, and in each iteration a least-squares error function is used to fit the line and correct the current line parameters.
8. The low-overlap bullet point cloud registration method based on line feature detection according to claim 5, wherein in step S4 the specific process of finding the optimal rotation angle based on the linear bullet mark features extracted from the source point cloud and the target point cloud comprises:
S41: rotating the source point cloud by a three-dimensional rotation matrix R(θ) to obtain a rotated source point cloud, where θ denotes the rotation angle;
S42: calculating a fitness function F(θ) from the target point cloud and the rotated source point cloud, wherein the fitness sums, over pairs of extracted line features, a Gaussian term whose exponent is the square of the difference of the absolute dot products of the normalized feature vectors belonging to the target point cloud and the source point cloud respectively, together with the feature descriptors of the corresponding point coordinates, the Gaussian parameters regulating the form of the function;
S43: repeating steps S41 and S42, and taking the rotation angle θ corresponding to the maximum fitness function F(θ) found within the iteration limit as the optimal rotation angle.
CN202411674257.5A 2024-11-21 2024-11-21 Low-overlap warhead point cloud registration method based on line feature detection Active CN119359775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411674257.5A CN119359775B (en) 2024-11-21 2024-11-21 Low-overlap warhead point cloud registration method based on line feature detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411674257.5A CN119359775B (en) 2024-11-21 2024-11-21 Low-overlap warhead point cloud registration method based on line feature detection

Publications (2)

Publication Number Publication Date
CN119359775A true CN119359775A (en) 2025-01-24
CN119359775B CN119359775B (en) 2025-11-25

Family

ID=94312397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411674257.5A Active CN119359775B (en) 2024-11-21 2024-11-21 Low-overlap warhead point cloud registration method based on line feature detection

Country Status (1)

Country Link
CN (1) CN119359775B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200327418A1 (en) * 2019-04-12 2020-10-15 Ultrahaptics Ip Ltd Using Iterative 3D-Model Fitting for Domain Adaptation of a Hand-Pose-Estimation Neural Network
CN111415379A (en) * 2020-03-23 2020-07-14 南京大学 Three-dimensional point cloud data registration method based on cuckoo optimization
CN111768409A (en) * 2020-09-03 2020-10-13 腾讯科技(深圳)有限公司 Box-type structure detection method and device based on artificial intelligence
US20230316563A1 (en) * 2022-04-05 2023-10-05 Bluewrist Inc. Systems and methods for pose estimation via radial voting based keypoint localization
CN114677418A (en) * 2022-04-18 2022-06-28 南通大学 A registration method based on point cloud feature point extraction
CN116309733A (en) * 2022-12-05 2023-06-23 河北农业大学 Multi-view point cloud registration method, device, equipment and computer-readable storage medium
CN117572454A (en) * 2023-11-15 2024-02-20 武汉万曦智能科技有限公司 Method and system for measuring safety clearance of field vehicle storage battery
CN117788539A (en) * 2024-02-28 2024-03-29 菲特(天津)检测技术有限公司 Point cloud data registration method and system and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
QIWEN ZHANG: "Low-Overlap Bullet Point Cloud Registration Algorithm Based on Line Feature Detection", 《MDPI》, 12 July 2024 (2024-07-12), pages 1 - 20 *
QIWEN ZHANG: "Low-Overlap Bullet Point Cloud Registration Algorithm Based on Line Feature Detection", 《MDPI》, 12 June 2024 (2024-06-12), pages 1 - 20 *
CHU; WU Tong; WANG Lu: "Multi-plane detection algorithm for point clouds based on volume density change rate" (基于体密度变化率的点云多平面检测算法), Journal of Computer Applications (计算机应用), No. 05, 1 May 2013 (2013-05-01) *

Also Published As

Publication number Publication date
CN119359775B (en) 2025-11-25

Similar Documents

Publication Publication Date Title
CN109887015B (en) A Point Cloud Automatic Registration Method Based on Local Surface Feature Histogram
CN111553409B (en) A Point Cloud Recognition Method Based on Voxel Shape Descriptors
CN110807781B (en) A point cloud simplification method that preserves details and boundary features
CN107038717B (en) A Method for Automatically Analyzing 3D Point Cloud Registration Errors Based on Stereo Grid
CN112381862B (en) Full-automatic registration method and device for CAD (computer-aided design) model and triangular mesh
US8274508B2 (en) Method for representing objects with concentric ring signature descriptors for detecting 3D objects in range images
CN103400388B (en) Method for eliminating Brisk key point error matching point pair by using RANSAC
CN114648445B (en) A multi-view high-resolution point cloud stitching method based on feature point extraction and fine registration optimization
CN112116553B (en) Passive three-dimensional point cloud model defect identification method based on K-D tree
CN106250881A (en) A target recognition method and system based on 3D point cloud data
CN103955939A (en) Boundary feature point registering method for point cloud splicing in three-dimensional scanning system
CN107886529A (en) A kind of point cloud registration method for three-dimensional reconstruction
CN106529591A (en) Improved MSER image matching algorithm
CN114200477A (en) A method for processing point cloud data of ground targets of laser 3D imaging radar
CN109949350A (en) A Multitemporal Point Cloud Automatic Registration Method Based on Morphological Invariant Features
CN114663373B (en) A point cloud registration method and device for part surface quality inspection
CN108830888B (en) Coarse matching method based on improved multi-scale covariance matrix characteristic descriptor
CN114972459A (en) Point cloud registration method based on low-dimensional point cloud local feature descriptor
CN110211129B (en) Low-coverage point cloud registration algorithm based on region segmentation
CN116758126A (en) A fast point cloud registration method based on false matching elimination of similar triangles
CN119006543A (en) Point cloud registration method based on neighborhood normal vector and curvature
CN117274339A (en) Point cloud registration method based on improved ISS-3DSC characteristics combined with ICP
CN107490346A (en) A kind of RFID multi-tags Network Three-dimensional measurement modeling method of view-based access control model
CN115527048B (en) A robust 3D point cloud feature matching method based on progressive consistency voting
CN116630662A (en) Feature point mismatching eliminating method applied to visual SLAM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant