
CN116977677A - Image feature point matching screening method, device, equipment and medium based on clustering - Google Patents

Image feature point matching screening method, device, equipment and medium based on clustering

Info

Publication number
CN116977677A
CN116977677A (application CN202310839410.4A)
Authority
CN
China
Prior art keywords
feature point
image
temporary
feature
matching
Prior art date
Legal status
Granted
Application number
CN202310839410.4A
Other languages
Chinese (zh)
Other versions
CN116977677B (en)
Inventor
秦超
黄哲
王孝宇
Current Assignee
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd
Priority to CN202310839410.4A
Publication of CN116977677A
Application granted
Publication of CN116977677B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention relates to the technical field of image processing, and in particular to a clustering-based image feature point matching and screening method, device, equipment and medium. The method includes: extracting a first feature point set from a first image and a second feature point set from a second image; performing feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs; randomly selecting from the N groups of feature point matching pairs to obtain M temporary matching pairs; computing temporary rotation matrices and temporary translation vectors from the temporary matching pairs; clustering the M temporary rotation matrices and the M temporary translation vectors separately to obtain a reference rotation matrix and a reference translation vector; computing the projection error of each group of feature point matching pairs; and determining the feature point matching pairs that satisfy a preset condition as target point matching pairs. By screening the feature point matching pairs with the clustering results of the transformation parameters, the precision and accuracy of feature point matching are improved.

Description

Image feature point matching and screening method, device, equipment and medium based on clustering

Technical field

The present invention relates to the technical field of image processing, and in particular to a clustering-based image feature point matching and screening method, device, equipment and medium.

Background

At present, image matching technology is widely used in tasks such as image stitching, three-dimensional reconstruction, positioning and navigation, and image retrieval, in order to identify and align, at the pixel level, content or structures with the same or similar attributes in two images. Image matching is built on image feature point extraction. Existing feature point extraction techniques usually adopt operator-based methods such as the Scale-Invariant Feature Transform (SIFT) algorithm and the ORB (Oriented FAST and Rotated BRIEF) algorithm, or extraction methods based on deep learning models.

However, high-precision image matching requires an accurate feature point correspondence between the two images, and none of the above methods can ensure that the feature point matches are sufficiently accurate; that is, the accuracy of image feature point matching is poor. Therefore, how to improve the accuracy of image feature point matching, and thereby the accuracy of image matching, has become an urgent problem to be solved.

Summary of the invention

In view of this, embodiments of the present invention provide a clustering-based image feature point matching and screening method, device, equipment and medium to solve the problem of poor accuracy in image feature point matching.

In a first aspect, an embodiment of the present invention provides a clustering-based image feature point matching and screening method. The image feature point matching and screening method includes:

obtaining a first image and a second image to be matched, extracting feature points in the first image to obtain a first feature point set, and extracting feature points in the second image to obtain a second feature point set;

performing feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs, where each feature point matching pair includes one feature point from the first feature point set and one feature point from the second feature point set, and N is an integer greater than five;

randomly selecting M groups of temporary matching pairs from the N groups of feature point matching pairs, and calculating, from each group of temporary matching pairs, a temporary rotation matrix and a temporary translation vector between the first image and the second image, thereby obtaining M temporary rotation matrices and M temporary translation vectors, where M is an integer less than or equal to N;

clustering the M temporary rotation matrices to obtain a first cluster center point and determining the first cluster center point as a reference rotation matrix, and clustering the M temporary translation vectors to obtain a second cluster center point and determining the second cluster center point as a reference translation vector;

calculating the projection error of each group of feature point matching pairs according to the reference rotation matrix and the reference translation vector, and determining the feature point matching pairs whose projection errors fall within a preset range as target point matching pairs, where the target point matching pairs are used to calculate a target rotation matrix and a target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

In a second aspect, an embodiment of the present invention provides a clustering-based image feature point matching and screening device. The image feature point matching and screening device includes:

a feature point extraction module, configured to obtain a first image and a second image to be matched, extract feature points in the first image to obtain a first feature point set, and extract feature points in the second image to obtain a second feature point set;

a feature point matching module, configured to perform feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs, where each feature point matching pair includes one feature point from the first feature point set and one feature point from the second feature point set, and N is an integer greater than five;

a matching pair selection module, configured to randomly select M groups of temporary matching pairs from the N groups of feature point matching pairs, and calculate, from each group of temporary matching pairs, a temporary rotation matrix and a temporary translation vector between the first image and the second image, thereby obtaining M temporary rotation matrices and M temporary translation vectors, where M is an integer less than or equal to N;

a clustering processing module, configured to cluster the M temporary rotation matrices to obtain a first cluster center point and determine the first cluster center point as a reference rotation matrix, and to cluster the M temporary translation vectors to obtain a second cluster center point and determine the second cluster center point as a reference translation vector;

a matching pair determination module, configured to calculate the projection error of each group of feature point matching pairs according to the reference rotation matrix and the reference translation vector, and determine the feature point matching pairs whose projection errors fall within a preset range as target point matching pairs, where the target point matching pairs are used to calculate a target rotation matrix and a target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

In a third aspect, an embodiment of the present invention provides a computer device. The computer device includes a processor, a memory, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the image feature point matching and screening method according to the first aspect.

In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the image feature point matching and screening method according to the first aspect.

Compared with the prior art, the embodiments of the present invention have the following beneficial effects:

The first image and the second image to be matched are obtained; feature points in the first image are extracted to obtain a first feature point set, and feature points in the second image are extracted to obtain a second feature point set; feature point matching is performed on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs; with the selection count initialized to zero, M groups of temporary matching pairs are randomly selected from the N groups of feature point matching pairs; from each group of temporary matching pairs, a temporary rotation matrix and a temporary translation vector between the first image and the second image are calculated, yielding M temporary rotation matrices and M temporary translation vectors; the M temporary rotation matrices are clustered to obtain a first cluster center point, which is determined as the reference rotation matrix, and the M temporary translation vectors are clustered to obtain a second cluster center point, which is determined as the reference translation vector; the projection error of each group of feature point matching pairs is calculated according to the reference rotation matrix and the reference translation vector, and the feature point matching pairs whose projection errors fall within the preset range are determined as target point matching pairs. Screening the feature point matching pairs with the reference rotation matrix and reference translation vector obtained by clustering retains reliable target point matching pairs, which improves the precision and accuracy of feature point matching and thereby the accuracy of image matching.

Description of the drawings

In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings based on these drawings without creative effort.

Figure 1 is a schematic diagram of an application environment of a clustering-based image feature point matching and screening method provided in Embodiment 1 of the present invention;

Figure 2 is a schematic flow chart of a clustering-based image feature point matching and screening method provided in Embodiment 1 of the present invention;

Figure 3 is a schematic structural diagram of a clustering-based image feature point matching and screening device provided in Embodiment 2 of the present invention;

Figure 4 is a schematic structural diagram of a computer device provided in Embodiment 3 of the present invention.

Detailed description of the embodiments

In the following description, specific details such as particular system structures and technologies are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present invention. However, it should be clear to those skilled in the art that the present invention can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the present invention.

It should be understood that, when used in the description of the present invention and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or combinations thereof.

It should also be understood that the term "and/or" used in the description of the present invention and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.

As used in the description of the present invention and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected", or "in response to detecting [the described condition or event]".

In addition, in the description of the present invention and the appended claims, the terms "first", "second", "third", and so on are only used to distinguish the descriptions and shall not be understood as indicating or implying relative importance.

Reference in the description of the present invention to "one embodiment" or "some embodiments" and the like means that a particular feature, structure, or characteristic described in connection with that embodiment is included in one or more embodiments of the present invention. Therefore, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", and the like appearing in different places in this description do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "including", "comprising", "having", and their variations all mean "including but not limited to", unless otherwise specifically emphasized in the description of the present invention.

It should be understood that the sequence numbers of the steps in the following embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.

In order to illustrate the technical solutions of the present invention, specific embodiments are described below.

The clustering-based image feature point matching and screening method provided in Embodiment 1 of the present invention can be applied in the application environment shown in Figure 1, where a client communicates with a server. The client includes, but is not limited to, handheld computers, desktop computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, cloud terminal devices, personal digital assistants (PDA), and other computer devices. The server may be an independent server, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), and big data and artificial intelligence platforms.

The client can be connected to at least one image acquisition terminal to obtain the captured images to be matched. The image acquisition terminal includes, but is not limited to, cameras, video cameras, video recorders, handheld photographic equipment, and mobile devices with a photographing function. Since this embodiment involves image matching, when a single image acquisition terminal is used its pose must be movable; that is, the first image and the second image are captured by the single image acquisition terminal at different poses, for example with a hand-held mobile device with a photographing function, or a camera deployed on a movable platform such as a moving track. When multiple image acquisition terminals are used, the images captured by two image acquisition terminals serve as the first image and the second image, respectively. The client can be used in various application scenarios such as three-dimensional modeling, intelligent navigation, and pedestrian recognition, collecting the images to be matched through the connected image acquisition terminal. Taking the intelligent navigation scenario as an example, the image acquisition terminal may be a vision sensor mounted on a vehicle, used to collect road image information and transmit it to the client; the client then transmits the collected road image information to be matched to the in-vehicle server, built on high-computing-power chips, for image matching processing, thereby realizing specific application functions such as road scene modeling, lane line recognition, and pedestrian recognition.

Referring to Figure 2, a schematic flow chart of a clustering-based image feature point matching and screening method provided in Embodiment 1 of the present invention, the above image feature point matching and screening method can be applied to the server in Figure 1. The computer device corresponding to the server is connected to the client to obtain the first image and the second image to be matched, and is able to perform computing tasks such as projection matrix operations and matrix decomposition. As shown in Figure 2, the image feature point matching and screening method may include the following steps:

Step S201: obtain the first image and the second image to be matched, extract feature points in the first image to obtain a first feature point set, and extract feature points in the second image to obtain a second feature point set.

Here, the first image and the second image may refer to the two images to be matched. Given the nature of the image matching task, the first image and the second image should share some identical image information. Taking scene modeling as an example, the first image and the second image may be images of the scene area to be modeled captured from different orientations, with a reference object in the scene area appearing in both the first image and the second image; that is, the image information of the first image and the second image partially overlaps.

A feature point may refer to a point in its image that has distinctive characteristics and can effectively reflect the essential features of the image, or that can identify a target object in the image. Feature points typically include corner points, edge points, edge endpoints, extreme points, and so on. In general, a feature point has characteristics that distinguish it from other points in the image; for example, the grayscale variation around a feature point is sharper than around other points, or the curvature at a feature point changes more sharply than at other points.

The first feature point set contains at least five feature points extracted from the first image, and the second feature point set contains at least five feature points extracted from the second image.

Specifically, when the intrinsic parameters of the image acquisition device are known, the epipolar constraint principle requires at least five pairs of matching points to recover the transformation information between image acquisition devices; therefore, the first feature point set and the second feature point set are each required to contain at least five feature points. The transformation information between image acquisition devices may refer to the pose difference caused by the motion of a single image acquisition device, or to the difference in deployment poses between two image acquisition devices.

The intrinsic parameters of the image acquisition device may include the pixel focal length and the optical center coordinates. The origin of the pixel coordinate system usually does not coincide with the zero point of the image acquisition device, so the offset from the optical center to the zero point is denoted (cx, cy), namely the horizontal and vertical offsets of the optical center in the pixel coordinate system, and (cx, cy) is taken as the optical center coordinate information.

Accordingly, the relationship between a point (x, y) in the image coordinate system and a pixel (u, v) in the pixel coordinate system can be expressed as u = x/k + cx, v = y/l + cy, where the point (x, y) in the image coordinate system is obtained from a point (X, Y, Z) in the camera coordinate system of the image acquisition device as x = f·X/Z, y = f·Y/Z, and k, l are the horizontal and vertical sizes of each pixel. Letting fx = f/k and fy = f/l be the pixel focal lengths, the intrinsic parameters of the image acquisition device can be represented by the intrinsic matrix K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
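As a small numeric illustration of the pinhole relations above (the intrinsic values below are hypothetical, not taken from the patent), the following sketch builds K and projects a camera-frame point to pixel coordinates:

```python
import numpy as np

# Hypothetical intrinsics: fx, fy are pixel focal lengths, (cx, cy) the optical-center offsets.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# A point (X, Y, Z) in the camera coordinate system.
P_cam = np.array([0.5, -0.2, 4.0])

# Projection: [u, v, 1]^T is proportional to K @ [X/Z, Y/Z, 1]^T.
uv1 = K @ (P_cam / P_cam[2])
u, v = uv1[0], uv1[1]
print(u, v)  # pixel coordinates of the projected point
```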

It should be noted that if the number of feature points contained in the extracted first feature point set or second feature point set is less than five, it is necessary to reselect the first image and the second image used for image matching, or to reselect the feature point extraction method, until the condition that the first feature point set and the second feature point set each contain at least five feature points is satisfied.

Optionally, extracting feature points in the first image to obtain the first feature point set and extracting feature points in the second image to obtain the second feature point set includes:

searching within the first image with a feature point extraction operator to obtain Q first feature points in the first image and their corresponding first feature descriptors, where Q is an integer greater than N;

searching within the second image with the feature point extraction operator to obtain S second feature points in the second image and their corresponding second feature descriptors, where S is an integer greater than N;

forming the first feature point set of the first image from the Q first feature points in the first image, and forming the second feature point set of the second image from the S second feature points in the second image.

Here, the feature point extraction operator is used to extract feature points in an image. In this embodiment, the Harris operator may be used as the feature point extraction operator; implementers may also use feature extraction methods such as SIFT, Speeded-Up Robust Features (SURF), Features from Accelerated Segment Test (FAST), or ORB to extract feature points. The choice of feature point extraction operator is not limited here; any feature point extraction operator chosen by the implementer to extract feature points in the images falls within the protection scope of the present invention.
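As an illustration only, since the embodiment treats Harris, SIFT, SURF, FAST and ORB as interchangeable choices, the following sketch uses OpenCV's ORB detector to obtain the keypoints and descriptors of the two images; the file names are placeholders:

```python
import cv2

# Placeholder paths; any pair of overlapping images works.
img1 = cv2.imread("first_image.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("second_image.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)            # one possible feature point extraction operator
kp1, desc1 = orb.detectAndCompute(img1, None)   # Q first feature points and first feature descriptors
kp2, desc2 = orb.detectAndCompute(img2, None)   # S second feature points and second feature descriptors
```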

The first feature points may refer to the feature points extracted from the first image, and the first feature descriptors may refer to the description information of the corresponding first feature points and their neighboring pixels; the second feature points may refer to the feature points extracted from the second image, and the second feature descriptors may refer to the description information of the corresponding second feature points and their neighboring pixels.

Specifically, since N groups of feature point matching pairs are subsequently extracted, each feature point set should contain at least N feature points. The numbers of feature points extracted for different feature point sets may differ, so Q and S need not be equal; it is only necessary to ensure that both Q and S are greater than N.

In this embodiment, the feature points and feature descriptors in the images are extracted, which improves the accuracy of the subsequent feature point matching while ensuring that the subsequent screening of matched feature point pairs can be carried out normally, thereby improving the efficiency of the overall image feature point matching and screening process.

The above steps of obtaining the first image and the second image to be matched, extracting the feature points in the first image to obtain the first feature point set, and extracting the feature points in the second image to obtain the second feature point set provide the basis for subsequent feature point matching, so that the feature point matching and screening process can be carried out normally, thereby improving the efficiency of the feature point matching and screening process.

Step S202: perform feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs.

Here, a feature point matching pair includes one feature point from the first feature point set and one feature point from the second feature point set, and N is an integer greater than five.

Optionally, performing feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs includes:

for any first feature point in the first feature point set, computing the similarity between the first feature descriptor of that first feature point and the second feature descriptor of each second feature point in the second feature point set, obtaining a similarity value for each second feature point in the second feature point set, and determining that the second feature point corresponding to the maximum of all similarity values forms a group of feature point matching pairs with that first feature point;

traversing all first feature points in the first feature point set to obtain the N groups of feature point matching pairs.

Here, the similarity value can characterize the degree of difference between the first feature descriptor of the targeted first feature point and the second feature descriptor of the corresponding second feature point.

Specifically, feature descriptors can be represented as vectors, so the similarity between feature descriptors is easy to compute; distance metrics such as Euclidean distance, Hamming distance, or cosine similarity can be used.

Taking cosine similarity as an example, its value range is [-1, 1]. The closer the computed cosine similarity is to -1, the greater the difference between the feature descriptors used in the computation, that is, the less similar the corresponding feature points; the closer the computed cosine similarity is to 1, the smaller the difference between the feature descriptors, that is, the more similar the corresponding feature points.
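A minimal sketch of the "most similar descriptor" rule using cosine similarity, assuming desc1 (Q x D) and desc2 (S x D) are float descriptor arrays from the extraction step; a production system would more likely use cv2.BFMatcher or a nearest-neighbour index:

```python
import numpy as np

def match_by_cosine(desc1, desc2):
    """For every first feature point, pick the second feature point with the
    highest cosine similarity, forming one matching pair per first point."""
    a = desc1 / np.linalg.norm(desc1, axis=1, keepdims=True)
    b = desc2 / np.linalg.norm(desc2, axis=1, keepdims=True)
    sim = a @ b.T                        # (Q, S) similarity values in [-1, 1]
    best = sim.argmax(axis=1)            # index of the most similar second feature point
    return [(i, int(j)) for i, j in enumerate(best)]   # N matching pairs (index pairs)
```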

The above way of determining that the second feature point corresponding to the maximum of all similarity values forms a matching pair with the first feature point can be regarded as brute-force matching. When actually performing feature point matching, implementers may also use the cross-matching method, the K-Nearest Neighbor (KNN) matching method, the Random Sample Consensus (RANSAC) matching method, and so on.

The cross-matching method may refer to performing the matching twice: the first matching starts from a first feature point and yields a match between that first feature point and a second feature point; the second matching starts from that second feature point, and if it still yields a match between that second feature point and the first feature point, the first feature point and the second feature point are determined to be a correct match.

The KNN matching method may refer to selecting the several second feature points most similar to the targeted first feature point; if the differences among these second feature points are large, the most similar second feature point is regarded as the correct match for that first feature point.

The RANSAC matching method may refer to randomly selecting several feature point matching pairs to compute a projection matrix, using the computed projection matrix to compute the projection error of each feature point matching pair in the images, searching for the projection matrix that minimizes the global projection error, and determining the feature point matching pairs corresponding to that projection matrix as correct matching pairs. However, RANSAC usually needs to be repeated many times to ensure a high rate of correct matches among the screened pairs, and it cannot guarantee that the screened matching pairs are entirely correct.

It should be noted that the choice of feature point matching method is not limited here; any feature point matching method chosen by the implementer for image feature point matching falls within the protection scope of the present invention.

In this embodiment, feature points are characterized by feature descriptors, the similarity between feature points is then computed, and feature point matching is performed by a feature point matching algorithm, thereby obtaining the initial feature point matching pairs. This provides a basis for the subsequent screening of feature point matching pairs and helps improve the efficiency of the feature point matching and screening process.

The above step of performing feature point matching on the first feature point set and the second feature point set to obtain N groups of feature point matching pairs carries out feature point matching with a feature point matching algorithm to obtain the initial feature point matching pairs, which provides a basis for the subsequent screening of feature point matching pairs and helps improve the efficiency of the feature point matching and screening process.

Step S203: randomly select M groups of temporary matching pairs from the N groups of feature point matching pairs, and, from each group of temporary matching pairs, calculate a temporary rotation matrix and a temporary translation vector between the first image and the second image in the corresponding temporary matching pair, obtaining M temporary rotation matrices and M temporary translation vectors.

Here, a temporary matching pair may refer to a feature point matching pair used to compute a rotation matrix and a translation vector, and a temporary rotation matrix and a temporary translation vector may refer to camera extrinsic information computed from the temporary matching pairs. M is an integer less than or equal to N, and the temporary rotation matrix and temporary translation vector computed from one group of temporary matching pairs correspond to each other.

Specifically, in this embodiment the selection count is initialized to zero; a random selection, which may be equal-probability random sampling, is made from the N groups of feature point matching pairs to obtain the temporary matching pairs for the current selection count; the temporary rotation matrix and temporary translation vector between the first image and the second image, namely those corresponding to the current selection count, are computed from these temporary matching pairs; and the selection count is incremented by one to support the subsequent iterative execution.

In this embodiment, the value of M can be set to 50, that is, 50 temporary rotation matrices and 50 temporary translation vectors are obtained. It should be noted that, in order to avoid selecting the same feature point matching pair as a temporary matching pair in different random selections, random sampling without replacement can be used when selecting multiple times.
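A sketch of the selection step under these settings; how each selection is expanded into the at-least-five correspondences needed for an essential matrix is left open by the text, so only the without-replacement draw of M = 50 pair indices is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

def select_temporary_pairs(n_pairs, M=50):
    """Draw M of the N matching-pair indices without replacement, so the same
    matching pair is never reused across selections (M = 50 as in the embodiment)."""
    return rng.choice(n_pairs, size=M, replace=False)
```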

It should be noted that the process of randomly selecting M groups of temporary matching pairs from the N groups of feature point matching pairs is not limited to the above implementation; implementers may also choose other random selection methods such as probability sampling or random under-sampling, and any random sampling method chosen by the implementer to select the temporary matching pairs falls within the protection scope of the present invention.

Optionally, calculating, from each group of temporary matching pairs, the temporary rotation matrix and the temporary translation vector between the first image and the second image in the corresponding temporary matching pair includes:

for any temporary matching pair, computing the essential matrix between the first image and the second image from the epipolar constraint and the temporary matching pair;

performing matrix decomposition on the essential matrix to obtain the temporary rotation matrix and the temporary translation vector between the first image and the second image.

Here, the epipolar constraint can be used to determine the depth of a point in the world coordinate system, namely Z in the example (X, Y, Z) above. Let the two points contained in a temporary matching pair be p and p′; then the epipolar constraint can be written as p′ᵀFp = 0, where F is the fundamental matrix. The relationship between the essential matrix E and the fundamental matrix F can be written as F = K⁻ᵀEK⁻¹, where K is the camera intrinsic matrix; the relationship between the essential matrix E, the rotation matrix R, and the translation vector t can be written as E = t^R, with t^ the skew-symmetric matrix of t; and the matrix decomposition may be a singular value decomposition.
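A hedged OpenCV sketch of one such computation, assuming pts1 and pts2 hold the pixel coordinates of the (at least five) correspondences used for one temporary estimate and K is the intrinsic matrix; cv2.recoverPose performs the decomposition of E and the cheirality check internally:

```python
import cv2
import numpy as np

def temporary_pose(pts1, pts2, K):
    """pts1, pts2: (n, 2) float arrays of matched pixel coordinates, n >= 5."""
    E, _ = cv2.findEssentialMat(pts1, pts2, K)
    E = E[:3, :]  # with a minimal point set several candidate matrices may be stacked; keep the first
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)  # temporary rotation matrix R (3x3), translation t (3x1, unit norm)
    return R, t
```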

In this embodiment, the temporary rotation matrix and the temporary translation vector of each temporary matching pair are computed through the epipolar constraint, which requires less computation and is simpler than conventional approaches, thereby improving the efficiency of the overall image matching feature point screening process.

The above steps of randomly selecting M groups of temporary matching pairs from the N groups of feature point matching pairs and calculating, from each group of temporary matching pairs, the temporary rotation matrix and temporary translation vector between the first image and the second image to obtain M temporary rotation matrices and M temporary translation vectors obtain the temporary matching pairs through iterative execution and compute the corresponding temporary rotation matrices and temporary translation vectors on that basis, which provides the foundation for the subsequent clustering and improves the accuracy of the overall image feature point matching and screening.

Step S204: cluster the M temporary rotation matrices to obtain a first cluster center point and determine the first cluster center point as the reference rotation matrix; cluster the M temporary translation vectors to obtain a second cluster center point and determine the second cluster center point as the reference translation vector.

Here, the first cluster center point may refer to the center point of the clustering result of the M temporary rotation matrices, and the second cluster center point may refer to the center point of the clustering result of the M temporary translation vectors; the reference rotation matrix and the reference translation vector may refer to reference extrinsic information used for the subsequent screening of the feature point matching pairs.

Optionally, clustering the M temporary rotation matrices to obtain the first cluster center point and determining the first cluster center point as the reference rotation matrix includes:

converting each temporary rotation matrix into a temporary rotation vector, clustering all temporary rotation vectors according to the Euclidean distances between them to obtain the first cluster center point, and determining the first cluster center point as a reference rotation vector;

converting the reference rotation vector into the reference rotation matrix.

Here, each temporary rotation matrix can be converted into a temporary rotation vector according to the conversion relationship between the Lie group and the Lie algebra, and the reference rotation vector can be converted into the reference rotation matrix according to the same conversion relationship between the Lie group and the Lie algebra.

The conversion relationship between the Lie group and the Lie algebra guides the conversion of a temporary rotation matrix into the corresponding element of the Lie algebra, and an element of the Lie algebra can be represented by a three-dimensional vector; that is, the matrix representation is converted into a three-dimensional vector representation, so that the clustering can be performed in three-dimensional Euclidean space.
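A sketch of this matrix-to-vector conversion; cv2.Rodrigues implements the axis-angle (Lie algebra so(3)) representation in both directions:

```python
import cv2
import numpy as np

def rotations_to_vectors(R_list):
    """Map each 3x3 temporary rotation matrix to a 3D rotation vector (axis * angle)."""
    return np.stack([cv2.Rodrigues(R)[0].ravel() for R in R_list])

def vector_to_rotation(rvec):
    """Map the reference rotation vector (cluster center) back to a rotation matrix."""
    R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64))
    return R
```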

The Euclidean distance can represent the difference between temporary rotation vectors, and the clustering can use the DBSCAN clustering algorithm.

Specifically, the DBSCAN clustering algorithm requires a sample-point radius and a minimum number of points to be defined. For all points to be clustered, the local neighborhood of each point is determined according to the sample-point radius. If the number of other points contained in the local neighborhood of a point is greater than or equal to the minimum number of points, that point is taken as a core point, and the core point together with the other points in its local neighborhood forms a temporary cluster. For any temporary cluster, if it contains other core points, the temporary clusters corresponding to those core points are merged into the same cluster. All temporary clusters are traversed to obtain the final clustering result, namely multiple merged clusters serving as multiple cluster sets. It should be noted that DBSCAN is a density-based clustering method and does not require the number of cluster sets to be set in advance.
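A hedged sketch of clustering the M rotation vectors with scikit-learn's DBSCAN; eps and min_samples are illustrative values, and since DBSCAN does not itself output centroids, the mean of the largest cluster is used here as the first cluster center point:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def reference_rotation_vector(rot_vecs, eps=0.05, min_samples=5):
    """rot_vecs: (M, 3) temporary rotation vectors; assumes at least one dense cluster exists."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(rot_vecs).labels_
    main = np.bincount(labels[labels >= 0]).argmax()   # largest cluster; label -1 marks noise
    return rot_vecs[labels == main].mean(axis=0)       # taken as the first cluster center point
```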

It should be noted that the implementation of the distance metric and the clustering algorithm is not limited to the above process; implementers may also choose other distance metrics such as cosine distance, and other clustering methods such as K-means. Any distance metric or clustering method chosen by the implementer to obtain the reference rotation vector falls within the protection scope of the present invention.

In this embodiment, the temporary rotation matrices are converted into three-dimensional vectors for clustering, which makes the clustering process fast and efficient, improves the feasibility of the cluster analysis, and provides the reference rotation matrix determined from the clustering result as the basis for the subsequent screening of feature point matching pairs.

Optionally, clustering the M temporary translation vectors to obtain the second cluster center point and determining the second cluster center point as the reference translation vector includes:

mapping each temporary translation vector to a temporary translation point on the unit sphere, and clustering all temporary translation points according to the spherical distances between them to obtain the second cluster center point;

determining the second cluster center point as the reference translation vector.

Here, the unit sphere may refer to a sphere with a radius of 1, a temporary translation point may refer to the mapping of a temporary translation vector onto the unit sphere, and the spherical distance can characterize the distance between temporary translation points on the sphere.

Specifically, although a translation vector is already a three-dimensional vector, it is obtained by decomposing the essential matrix and lacks scale information; normally the norm of the translation vector is always 1, so it can be regarded as a point on the unit sphere. The clustering likewise uses the DBSCAN clustering algorithm. It should be noted that, when determining the second cluster center point, a normalization operation is required to ensure that its norm is still 1, that is, that it still lies on the unit sphere.
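A matching sketch for the translation side: the vectors are normalized onto the unit sphere, clustered with a precomputed spherical-distance matrix, and the cluster mean is re-normalized so the reference translation vector keeps unit norm; the thresholds are again illustrative:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def reference_translation(t_list, eps=0.1, min_samples=5):
    """t_list: M temporary translation vectors, shape (3,) or (3, 1); scale-free after E decomposition."""
    T = np.stack([np.ravel(t) for t in t_list])
    T /= np.linalg.norm(T, axis=1, keepdims=True)        # temporary translation points on the unit sphere
    dist = np.arccos(np.clip(T @ T.T, -1.0, 1.0))        # pairwise spherical (great-circle) distance
    labels = DBSCAN(eps=eps, min_samples=min_samples, metric="precomputed").fit(dist).labels_
    main = np.bincount(labels[labels >= 0]).argmax()     # largest cluster; label -1 marks noise
    center = T[labels == main].mean(axis=0)
    return center / np.linalg.norm(center)               # re-normalize so the center stays on the sphere
```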

In this embodiment, the temporary translation vectors are mapped onto the unit sphere for clustering, which makes the clustering process better fit the data characteristics of the translation vectors and improves the accuracy of the cluster analysis, providing the reference translation vector determined from the clustering result as the basis for the subsequent screening of feature point matching pairs and improving the accuracy of the overall image feature point matching and screening process.

The above steps of clustering the M temporary rotation matrices to obtain the first cluster center point and determining it as the reference rotation matrix, and clustering the M temporary translation vectors to obtain the second cluster center point and determining it as the reference translation vector, compute the reference rotation matrix and the reference translation vector by clustering, based on the prior knowledge that correct feature point matching pairs yield the same rotation matrix and translation vector. This provides the basis for the subsequent screening of feature point matching pairs and improves the efficiency and accuracy of that screening.

Step S205: calculate the projection error of each group of feature point matching pairs according to the reference rotation matrix and the reference translation vector, and determine the feature point matching pairs whose projection errors fall within the preset range as target point matching pairs.

Here, the projection error may refer to the pixel position deviation when a feature point matching pair is projected according to the reference rotation matrix and the reference translation vector; the preset range can be used to screen the feature point matching pairs according to the projection error; and the target point matching pairs are used to calculate a target rotation matrix and a target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

Optionally, determining the feature point matching pairs whose projection errors fall within the preset range as target point matching pairs includes:

calculating the mean and variance of all projection errors, multiplying the variance by a preset coefficient to obtain a product, taking the difference between the mean and the product as a first boundary point, and taking the sum of the mean and the product as a second boundary point;

forming the preset range from the first boundary point and the second boundary point, and determining the feature point matching pairs whose projection errors fall within the preset range as the target point matching pairs.

Here, the mean can represent the average of all projection errors, the variance can characterize the fluctuation of all projection errors, the preset coefficient can be used to adjust the preset range and is an adjustable parameter, the first boundary point may refer to the left boundary of the preset range, and the second boundary point may refer to the right boundary of the preset range.

Specifically, under the prior that the projection errors follow a normal distribution, the variance and mean of all projection errors are computed; matching pairs whose projection error lies near the mean are regarded as correct matches, and how close to the mean an error must be is determined by the preset range.
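
A minimal sketch of this screening rule, following the text literally (the variance, not the standard deviation, is multiplied by the preset coefficient); the default value of preset_coeff is illustrative only.

```python
import numpy as np

def filter_by_projection_error(matches, errors, preset_coeff=2.0):
    """matches: list of N feature point matching pairs; errors: (N,) projection errors."""
    errors = np.asarray(errors, dtype=float)
    mean = errors.mean()
    var = errors.var()
    spread = var * preset_coeff                  # product of variance and preset coefficient
    lower, upper = mean - spread, mean + spread  # first / second boundary points
    keep = (errors >= lower) & (errors <= upper) # preset range
    return [m for m, k in zip(matches, keep) if k]
```

Multiplying the standard deviation instead of the variance would give the more conventional mean plus or minus k sigma rule; the embodiment as written multiplies the variance.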

In this embodiment, screening the projection errors based on the normal distribution prior allows feature point matching pairs to be screened quickly and accurately; compared with commonly used screening methods such as RANSAC, this effectively improves the efficiency of the image feature point matching and screening process.

The above steps of calculating the projection error of each feature point matching pair according to the reference rotation matrix and the reference translation vector, and determining the feature point matching pairs whose projection error falls within the preset range as target point matching pairs, make it possible to reliably identify target point matching pairs from the projection errors, improving the precision and accuracy of matched feature point screening.

In this embodiment, the feature point matching pairs are screened with the reference rotation matrix and the reference translation vector obtained by clustering, so that reliable target point matching pairs are retained. This improves the precision and accuracy of feature point matching and, in turn, the accuracy of image matching.

Corresponding to the clustering-based image feature point matching and screening method of the above embodiment, Figure 3 shows a structural block diagram of the clustering-based image feature point matching and screening device provided in Embodiment 2 of the present invention. The device is applied to a server; the computer device corresponding to the server is connected to a client to obtain the first image and the second image to be matched, and is capable of performing computing tasks such as projection matrix operations and matrix decomposition. For ease of description, only the parts related to the embodiments of the present invention are shown.

Referring to Figure 3, the image feature point matching and screening device includes:

a feature point extraction module 31, configured to obtain a first image and a second image to be matched, extract feature points from the first image to obtain a first feature point set, and extract feature points from the second image to obtain a second feature point set;

a feature point matching module 32, configured to perform feature point matching on the first feature point set and the second feature point set to obtain N feature point matching pairs, each feature point matching pair including one feature point from the first feature point set and one feature point from the second feature point set, N being an integer greater than five;

a matching pair selection module 33, configured to randomly select M temporary matching pairs from the N feature point matching pairs and, for each temporary matching pair, compute the temporary rotation matrix and temporary translation vector between the first image and the second image for that pair, obtaining M temporary rotation matrices and M temporary translation vectors, M being an integer less than or equal to N;

a clustering processing module 34, configured to cluster the M temporary rotation matrices to obtain a first cluster center and determine the first cluster center as a reference rotation matrix, and to cluster the M temporary translation vectors to obtain a second cluster center and determine the second cluster center as a reference translation vector;

a matching pair determination module 35, configured to calculate the projection error of each feature point matching pair according to the reference rotation matrix and the reference translation vector, and determine the feature point matching pairs whose projection error falls within a preset range as target point matching pairs, the target point matching pairs being used to compute the target rotation matrix and the target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

Optionally, the feature point extraction module 31 includes:

a first feature point extraction unit, configured to search the first image with a feature point extraction operator to obtain Q first feature points in the first image and their corresponding first feature descriptors, Q being an integer greater than N;

a second feature point extraction unit, configured to search the second image with the feature point extraction operator to obtain S second feature points in the second image and their corresponding second feature descriptors, S being an integer greater than N;

a set forming unit, configured to form the first feature point set of the first image from the Q first feature points in the first image, and to form the second feature point set of the second image from the S second feature points in the second image.
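
By way of example, ORB from OpenCV can serve as the feature point extraction operator mentioned here; the operator choice and the max_points cap are assumptions, and SIFT or other detectors would fill the same role.

```python
import cv2

def detect_and_describe(image_gray, max_points=2000):
    # One possible feature point extraction operator: ORB.
    orb = cv2.ORB_create(nfeatures=max_points)
    keypoints, descriptors = orb.detectAndCompute(image_gray, None)
    return keypoints, descriptors  # the Q (or S) feature points and their descriptors
```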

Optionally, the feature point matching module 32 includes:

a similarity calculation unit, configured to, for any first feature point in the first feature point set, compute the similarity between the first feature descriptor of that first feature point and the second feature descriptor of each second feature point in the second feature point set, obtain a similarity value for each second feature point, and pair the second feature point with the largest similarity value with the first feature point to form a feature point matching pair;

a feature point traversal unit, configured to traverse all first feature points in the first feature point set to obtain the N feature point matching pairs.
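
A hedged sketch of the similarity-based matching performed by these two units, assuming binary descriptors such as ORB's, so that the largest similarity corresponds to the smallest Hamming distance; the function name match_descriptors is introduced for this example.

```python
import cv2

def match_descriptors(desc1, desc2):
    # For each descriptor of the first image, keep the single best match in
    # the second image (smallest distance, i.e. largest similarity).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=False)
    matches = matcher.match(desc1, desc2)
    # Each (queryIdx, trainIdx) tuple is one feature point matching pair.
    return [(m.queryIdx, m.trainIdx) for m in matches]
```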

Optionally, the matching pair selection module 33 includes:

an essential matrix calculation unit, configured to, for any temporary matching pair, compute the essential matrix between the first image and the second image for that pair according to the epipolar constraint and the temporary matching pair;

a matrix decomposition unit, configured to decompose the essential matrix to obtain the temporary rotation matrix and temporary translation vector between the first image and the second image.
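
One way to realise these two units with OpenCV is sketched below. Because the essential matrix requires at least five correspondences, the sketch treats each temporary selection as a small group of matches, which is an interpretive assumption; the five-point solver and robust-method flag used by cv2.findEssentialMat are likewise illustrative.

```python
import cv2

def estimate_pose_for_group(pts1, pts2, K):
    """pts1, pts2: (k, 2) pixel coordinates of one group of temporary matches,
    with k >= 5, consistent with N being an integer greater than five."""
    # Essential matrix from the epipolar constraint.
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    # Decompose the essential matrix into the temporary rotation matrix and
    # temporary translation vector (the translation is recovered up to scale).
    _, R_tmp, t_tmp, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R_tmp, t_tmp.ravel()
```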

Optionally, the clustering processing module 34 includes:

a matrix conversion unit, configured to convert each temporary rotation matrix into a temporary rotation vector, cluster all temporary rotation vectors according to the Euclidean distance between them to obtain the first cluster center, and determine the first cluster center as a reference rotation vector;

a vector conversion unit, configured to convert the reference rotation vector into the reference rotation matrix.

Optionally, the clustering processing module 34 includes:

a vector mapping unit, configured to map each temporary translation vector to a temporary translation point on the unit sphere, and cluster all temporary translation points according to the spherical distance between them to obtain the second cluster center;

a vector determination unit, configured to determine the second cluster center as the reference translation vector.

Optionally, the matching pair determination module 35 includes:

a boundary point calculation unit, configured to calculate the mean and variance of all projection errors, multiply the variance by a preset coefficient to obtain a product, take the difference between the mean and the product as a first boundary point, and take the sum of the mean and the product as a second boundary point;

a range determination unit, configured to form the preset range from the first boundary point and the second boundary point, and determine the feature point matching pairs whose projection error falls within the preset range as target point matching pairs.

It should be noted that, since the information exchange and execution processes between the above modules and units are based on the same concept as the method embodiments of the present invention, their specific functions and technical effects can be found in the method embodiment section and are not repeated here.

Figure 4 is a schematic structural diagram of a computer device provided in Embodiment 3 of the present invention. As shown in Figure 4, the computer device of this embodiment includes: at least one processor (only one is shown in Figure 4), a memory, and a computer program stored in the memory and executable on the at least one processor; when the processor executes the computer program, the steps of any of the above embodiments of the clustering-based image feature point matching and screening method are implemented.

The computer device may include, but is not limited to, a processor and a memory. Those skilled in the art will understand that Figure 4 is merely an example of a computer device and does not limit it; the computer device may include more or fewer components than shown, combine certain components, or use different components, and may, for example, also include a network interface, a display screen, an input apparatus, and the like.

The processor may be a CPU, or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.

The memory includes a readable storage medium, an internal memory, and the like, where the internal memory may be the memory of the computer device and provides an environment for running the operating system and computer-readable instructions stored on the readable storage medium. The readable storage medium may be the hard disk of the computer device, or, in other embodiments, an external storage device of the computer device, for example a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the computer device. Further, the memory may include both an internal storage unit of the computer device and an external storage device. The memory is used to store the operating system, application programs, a boot loader, data, and other programs such as the program code of the computer program, and may also be used to temporarily store data that has been output or is to be output.

Those skilled in the art will clearly understand that, for convenience and brevity of description, the division into the above functional units is only given as an example. In practical applications, the above functions may be allocated to different functional units as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, may exist physically as separate units, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only intended to distinguish them from one another and are not used to limit the protection scope of the present invention. For the specific working processes of the units and modules in the above device, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here. If an integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the processes in the methods of the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may at least include: any entity or apparatus capable of carrying computer program code, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc. In some jurisdictions, according to legislation and patent practice, computer-readable media may not include electrical carrier signals and telecommunications signals.

The present invention may also implement all or part of the processes in the above method embodiments through a computer program product; when the computer program product runs on a computer device, it causes the computer device to implement the steps of the above method embodiments.

In the above embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or described in one embodiment, reference may be made to the relevant descriptions of other embodiments.

Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled practitioners may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.

In the embodiments provided by the present invention, it should be understood that the disclosed device/computer device and method may be implemented in other ways. For example, the device/computer device embodiments described above are merely illustrative; the division into modules or units is only a division by logical function, and there may be other divisions in actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.

Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements for some of the technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all fall within the protection scope of the present invention.

Claims (10)

1. A clustering-based image feature point matching and screening method, characterized in that the image feature point matching and screening method comprises:
obtaining a first image and a second image to be matched, extracting feature points from the first image to obtain a first feature point set, and extracting feature points from the second image to obtain a second feature point set;
performing feature point matching on the first feature point set and the second feature point set to obtain N feature point matching pairs, each feature point matching pair comprising one feature point from the first feature point set and one feature point from the second feature point set, N being an integer greater than five;
randomly selecting M temporary matching pairs from the N feature point matching pairs and, for each temporary matching pair, computing the temporary rotation matrix and temporary translation vector between the first image and the second image for that pair, so as to obtain M temporary rotation matrices and M temporary translation vectors, M being an integer less than or equal to N;
clustering the M temporary rotation matrices to obtain a first cluster center and determining the first cluster center as a reference rotation matrix, and clustering the M temporary translation vectors to obtain a second cluster center and determining the second cluster center as a reference translation vector;
calculating the projection error of each feature point matching pair according to the reference rotation matrix and the reference translation vector, and determining the feature point matching pairs whose projection error falls within a preset range as target point matching pairs, the target point matching pairs being used to compute a target rotation matrix and a target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

2. The image feature point matching and screening method according to claim 1, characterized in that extracting feature points from the first image to obtain the first feature point set and extracting feature points from the second image to obtain the second feature point set comprises:
searching the first image with a feature point extraction operator to obtain Q first feature points in the first image and their corresponding first feature descriptors, Q being an integer greater than N;
searching the second image with the feature point extraction operator to obtain S second feature points in the second image and their corresponding second feature descriptors, S being an integer greater than N;
forming the first feature point set of the first image from the Q first feature points in the first image, and forming the second feature point set of the second image from the S second feature points in the second image.

3. The image feature point matching and screening method according to claim 2, characterized in that performing feature point matching on the first feature point set and the second feature point set to obtain the N feature point matching pairs comprises:
for any first feature point in the first feature point set, computing the similarity between the first feature descriptor of the first feature point and the second feature descriptor of each second feature point in the second feature point set to obtain a similarity value for each second feature point, and pairing the second feature point with the largest similarity value with the first feature point to form a feature point matching pair;
traversing all first feature points in the first feature point set to obtain the N feature point matching pairs.

4. The image feature point matching and screening method according to claim 1, characterized in that computing, for each temporary matching pair, the temporary rotation matrix and temporary translation vector between the first image and the second image for that pair comprises:
for any temporary matching pair, computing the essential matrix between the first image and the second image for that pair according to the epipolar constraint and the temporary matching pair;
decomposing the essential matrix to obtain the temporary rotation matrix and temporary translation vector between the first image and the second image.

5. The image feature point matching and screening method according to claim 1, characterized in that clustering the M temporary rotation matrices to obtain the first cluster center and determining the first cluster center as the reference rotation matrix comprises:
converting each temporary rotation matrix into a temporary rotation vector, clustering all temporary rotation vectors according to the Euclidean distance between them to obtain the first cluster center, and determining the first cluster center as a reference rotation vector;
converting the reference rotation vector into the reference rotation matrix.

6. The image feature point matching and screening method according to claim 1, characterized in that clustering the M temporary translation vectors to obtain the second cluster center and determining the second cluster center as the reference translation vector comprises:
mapping each temporary translation vector to a temporary translation point on the unit sphere, and clustering all temporary translation points according to the spherical distance between them to obtain the second cluster center;
determining the second cluster center as the reference translation vector.

7. The image feature point matching and screening method according to any one of claims 1 to 6, characterized in that determining the feature point matching pairs whose projection error falls within the preset range as target point matching pairs comprises:
calculating the mean and variance of all projection errors, multiplying the variance by a preset coefficient to obtain a product, taking the difference between the mean and the product as a first boundary point, and taking the sum of the mean and the product as a second boundary point;
forming the preset range from the first boundary point and the second boundary point, and determining the feature point matching pairs whose projection error falls within the preset range as the target point matching pairs.

8. A clustering-based image feature point matching and screening device, characterized in that the image feature point matching and screening device comprises:
a feature point extraction module, configured to obtain a first image and a second image to be matched, extract feature points from the first image to obtain a first feature point set, and extract feature points from the second image to obtain a second feature point set;
a feature point matching module, configured to perform feature point matching on the first feature point set and the second feature point set to obtain N feature point matching pairs, each feature point matching pair comprising one feature point from the first feature point set and one feature point from the second feature point set, N being an integer greater than five;
a matching pair selection module, configured to randomly select M temporary matching pairs from the N feature point matching pairs and, for each temporary matching pair, compute the temporary rotation matrix and temporary translation vector between the first image and the second image for that pair, so as to obtain M temporary rotation matrices and M temporary translation vectors, M being an integer less than or equal to N;
a clustering processing module, configured to cluster the M temporary rotation matrices to obtain a first cluster center and determine the first cluster center as a reference rotation matrix, and to cluster the M temporary translation vectors to obtain a second cluster center and determine the second cluster center as a reference translation vector;
a matching pair determination module, configured to calculate the projection error of each feature point matching pair according to the reference rotation matrix and the reference translation vector, and determine the feature point matching pairs whose projection error falls within a preset range as target point matching pairs, the target point matching pairs being used to compute a target rotation matrix and a target translation vector between the first image and the second image so as to guide image matching between the first image and the second image.

9. A computer device, characterized in that the computer device comprises a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the image feature point matching and screening method according to any one of claims 1 to 7.

10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the image feature point matching and screening method according to any one of claims 1 to 7.
CN202310839410.4A 2023-07-07 2023-07-07 Image feature point matching screening method, device, equipment and medium based on clustering Active CN116977677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310839410.4A CN116977677B (en) 2023-07-07 2023-07-07 Image feature point matching screening method, device, equipment and medium based on clustering

Publications (2)

Publication Number Publication Date
CN116977677A true CN116977677A (en) 2023-10-31
CN116977677B CN116977677B (en) 2024-10-25

Country Status (1)

Country Link
CN (1) CN116977677B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080187249A1 (en) * 2006-11-13 2008-08-07 Yohsuke Konishi Image processing method, image processing apparatus, image reading apparatus, image forming apparatus, and recording medium
US20150206029A1 (en) * 2014-01-22 2015-07-23 Fujitsu Limited Image matching method and image processing system
CN107507223A (en) * 2017-07-28 2017-12-22 武汉工程大学 Method for tracking target based on multi-characters clusterl matching under dynamic environment
CN109934298A (en) * 2019-03-19 2019-06-25 安徽大学 A method and device for progressive graph matching based on clustering deformation graph
US20220078385A1 (en) * 2019-08-29 2022-03-10 Iview Displays (Shenzhen) Company Ltd. Projection method based on augmented reality technology and projection equipment
US20220148302A1 (en) * 2019-08-30 2022-05-12 Zhejiang Sensetime Technology Development Co., Ltd. Method for visual localization and related apparatus
CN111340701A (en) * 2020-02-24 2020-06-26 南京航空航天大学 Circuit board image splicing method for screening matching points based on clustering method
CN113689535A (en) * 2021-08-20 2021-11-23 北京道达天际科技有限公司 Building model generation method and device based on unmanned aerial vehicle image
CN114118203A (en) * 2021-10-13 2022-03-01 北京旷视科技有限公司 Method, device and electronic device for image feature extraction and matching
KR102442093B1 (en) * 2022-04-27 2022-09-13 주식회사 비엠이코리아 Methods for improving surface registration in surgical navigation systems

Also Published As

Publication number Publication date
CN116977677B (en) 2024-10-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant