
CN118999559B - Positioning method, device, computer equipment and storage medium for inspection drone - Google Patents

Positioning method, device, computer equipment and storage medium for inspection drone

Info

Publication number
CN118999559B
CN118999559B (application CN202411041027.5A)
Authority
CN
China
Prior art keywords
point cloud; unmanned aerial vehicle; real-time; map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411041027.5A
Other languages
Chinese (zh)
Other versions
CN118999559A (en)
Inventor
戴澍
阳春秋
向巩
黄俊杰
杨文宇
宋玥
王雪迪
石韬
李静
姜卫星
向先波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Investigation Design and Research Institute Co Ltd SIDRI
Original Assignee
Shanghai Investigation Design and Research Institute Co Ltd SIDRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Investigation Design and Research Institute Co Ltd SIDRI
Priority to CN202411041027.5A
Publication of CN118999559A
Application granted
Publication of CN118999559B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/3804: Creation or updating of map data
    • G01C21/3841: Data obtained from two or more sources, e.g. probe vehicles
    • G01C22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract


The present invention relates to the field of unmanned aerial vehicle inspection technology and discloses a positioning method, device, computer equipment, and storage medium for an inspection drone. The method comprises: obtaining a real-time drone local map referenced to a preset laser radar coordinate system in the area to be inspected; mapping the point cloud data of the area to be inspected with a mapping algorithm to generate a three-dimensional point cloud global map; performing point cloud registration between the real-time drone local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix; updating the drone's odometer information into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the drone's real-time odometer pose; and inputting the real-time odometer pose into a preset drone flight control coordinate system to obtain the drone's real-time pose and position the drone. The present invention corrects the accumulated error of the laser-inertial odometer based on the first pose transformation matrix, so that the drone has high-precision repeatable positioning capability within the range of the three-dimensional point cloud global map.

Description

Positioning method, device, computer equipment, and storage medium for an inspection unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle inspection, and in particular to a positioning method, a positioning device, computer equipment, and a storage medium for an inspection unmanned aerial vehicle.
Background
Owing to the environmental complexity of inspection areas and to safety considerations, the positioning schemes currently applied to inspection unmanned aerial vehicles mainly rely on fused positioning of global navigation satellites and an inertial measurement unit (Inertial Measurement Unit, IMU for short), combined with the carrier-phase differential technique, which achieves centimeter-level positioning accuracy in open, unobstructed environments. Satellite-based positioning, however, is strongly affected by the environment: occlusion disturbs the transmission of satellite positioning signals, so positioning can be lost in scenes such as dense vegetation, buildings, and basements, and positioning information ultimately fails. In such conditions the positioning accuracy stays at sub-meter level and the update frequency at 1 Hz, which is too low, making these schemes difficult to apply to unmanned aerial vehicle positioning scenarios that switch between indoors and outdoors in complex environments.
Lidar-based unmanned aerial vehicle positioning schemes, benefiting from the laser radar's excellent sensing range and accuracy fused with precise IMU attitude estimation, can output a centimeter-level laser-inertial odometer over a limited time window. However, because of the IMU's zero-bias characteristics and registration errors between laser frames, the pose error of the laser-inertial odometer accumulates continuously, making long-duration high-precision positioning and repeatable positioning difficult. The above schemes therefore struggle with the high-precision, task-repeatable positioning demanded by inspection and with the environmental complexity caused by the inspection unmanned aerial vehicle's indoor-outdoor scene switching.
Disclosure of Invention
In view of the above, the invention provides a positioning method, a positioning device, a computer device, and a storage medium for an inspection unmanned aerial vehicle, so as to solve the problem that the continuously accumulating pose error of the laser-inertial odometer makes long-duration real-time positioning and repeatable positioning difficult.
In a first aspect, the present invention provides a positioning method for an inspection unmanned aerial vehicle, the method comprising:
acquiring a real-time unmanned aerial vehicle local map referenced to a preset laser radar coordinate system in the area to be inspected;
mapping the point cloud data of the area to be inspected based on a mapping algorithm to generate a three-dimensional point cloud global map;
performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map;
updating the odometer information in the real-time unmanned aerial vehicle local map into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map;
inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map;
and positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
According to this positioning method for the inspection unmanned aerial vehicle, a real-time unmanned aerial vehicle local map referenced to the preset laser radar coordinate system is acquired in the area to be inspected, and the whole area is mapped to generate a high-precision three-dimensional point cloud global map. Point cloud registration between the real-time local map and the global map yields a first pose transformation matrix of the local map relative to the global map, and the accumulated error of the laser-inertial odometer is corrected based on this matrix. The unmanned aerial vehicle thus gains high-precision repeatable positioning within the range of the three-dimensional point cloud global map, solving the problem that the continuously accumulating pose error of the laser-inertial odometer prevents long-duration real-time positioning and repeatable positioning.
In an alternative embodiment, acquiring a real-time unmanned aerial vehicle local map referenced to a preset laser radar coordinate system in the area to be inspected includes:
scanning an initial position of the unmanned aerial vehicle in the area to be inspected by using a laser radar to obtain current odometer information of the unmanned aerial vehicle;
and taking the initial point of the current odometer information as the origin of a preset laser radar coordinate system to generate a real-time unmanned aerial vehicle local map.
This embodiment scans the initial position of the unmanned aerial vehicle in the area to be inspected with the laser radar to obtain its current odometer information, and takes the starting point of that odometer as the origin of the preset laser radar coordinate system to generate the real-time unmanned aerial vehicle local map. The local map describes the pose state of the unmanned aerial vehicle in the preset laser radar coordinate system and provides the precondition for point cloud registration between the local and global maps.
In an optional implementation, mapping the point cloud data of the area to be inspected based on a mapping algorithm to generate the three-dimensional point cloud global map includes:
scanning by using a laser radar to obtain point cloud data of an area to be inspected;
and mapping the point cloud data based on a mapping algorithm to generate a three-dimensional point cloud global map.
This embodiment scans the area to be inspected with the laser radar to obtain its point cloud data and maps that data with a mapping algorithm, so that a high-precision three-dimensional point cloud global map is generated through integrated mapping of the area's point cloud data.
In an optional embodiment, performing point cloud registration on a real-time unmanned aerial vehicle local map and a three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map includes:
performing coarse registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain an initial pose transformation matrix;
performing iterative computation on the initial pose transformation matrix, and obtaining an optimal rotation matrix and an optimal translation matrix corresponding to the initial pose transformation matrix when the iteration termination condition is met;
and calculating the first pose transformation matrix based on the optimal rotation matrix and the optimal translation matrix.
This embodiment performs coarse registration between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain an initial pose transformation matrix, iterates on that matrix until the termination condition is met to obtain the corresponding optimal rotation matrix and optimal translation matrix, and computes the first pose transformation matrix from them. By registering the local map against the global map through iterative computation, the first pose transformation matrix relative to the global map can be output in real time, providing the basis for subsequently correcting the accumulated error of the laser odometer.
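The iterative computation above can be sketched in Python. The sketch below is a minimal point-to-point ICP: each iteration finds nearest-neighbour correspondences, solves for the optimal rotation and translation by SVD (the Kabsch method), and accumulates them into the pose transform until the error change falls below a tolerance. The function names, the brute-force correspondence search, and the termination tolerance are illustrative assumptions, not the patent's exact algorithm.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Optimal rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(local_pts, global_pts, T_init, max_iter=50, tol=1e-6):
    """Refine an initial 4x4 pose so the local map aligns with the global map."""
    T = T_init.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        src = (T[:3, :3] @ local_pts.T).T + T[:3, 3]
        # nearest-neighbour correspondences (brute force for clarity)
        d2 = ((src[:, None, :] - global_pts[None, :, :]) ** 2).sum(axis=-1)
        nn = d2.argmin(axis=1)
        err = np.sqrt(d2[np.arange(len(src)), nn]).mean()
        if abs(prev_err - err) < tol:  # iteration termination condition
            break
        prev_err = err
        R, t = best_fit_transform(src, global_pts[nn])
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T                     # accumulate into the first pose transform
    return T
```

In practice the coarse registration of the previous step supplies `T_init`, and a KD-tree replaces the brute-force search on real point cloud maps.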
In an alternative embodiment, performing coarse registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain an initial pose transformation matrix includes:
calculating a point cloud overlapping area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map;
judging whether the point cloud overlap area meets a preset threshold, and when it does, obtaining an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map by means of manually estimated coarse registration.
This embodiment calculates the point cloud overlap area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map and checks it against a preset threshold; when the threshold is met, an initial pose transformation matrix of the local map relative to the global map is obtained by manually estimated coarse registration. This supplies a relatively accurate initial pose transformation matrix, i.e. a good initial value for the iterative computation, and improves the accuracy of the first pose transformation matrix.
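As a concrete illustration of the overlap check, the following Python sketch estimates the point cloud overlap as the fraction of local-map points that have a global-map point within a given radius, and gates coarse registration on a threshold. The radius and the 0.6 threshold are illustrative assumptions (the patent does not give values), and a KD-tree would replace the brute-force search in practice.

```python
import numpy as np

def overlap_ratio(local_pts, global_pts, radius=0.2):
    """Fraction of local-map points with a global-map neighbour within `radius`.
    Brute-force nearest neighbour for clarity; use a KD-tree for real maps."""
    d2 = ((local_pts[:, None, :] - global_pts[None, :, :]) ** 2).sum(axis=-1)
    return float((np.sqrt(d2.min(axis=1)) <= radius).mean())

def coarse_registration_allowed(local_pts, global_pts, threshold=0.6):
    """Gate the manual coarse-registration step on the preset overlap threshold."""
    return overlap_ratio(local_pts, global_pts) >= threshold
```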
In an optional embodiment, inputting the odometer pose into a preset unmanned aerial vehicle flight control coordinate system, and obtaining the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map includes:
inputting the odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain a second pose transformation matrix;
and obtaining the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map based on the second pose transformation matrix and the odometer pose.
According to this embodiment, the odometer pose is input into the preset unmanned aerial vehicle flight control coordinate system to obtain the second pose transformation matrix, and the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is obtained from the second pose transformation matrix and the odometer pose. The odometer information is thereby updated in real time, giving the unmanned aerial vehicle high-precision repeatable positioning capability within the range of the three-dimensional point cloud global map.
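The coordinate conversion in this embodiment amounts to chaining homogeneous transforms. In the Python sketch below, a lidar-to-flight-control extrinsic stands in for the second pose transformation matrix and is composed with the odometer pose in the global map; the extrinsic values are an illustrative assumption (in practice they come from calibration), and the function names are not from the patent.

```python
import numpy as np

def pose_to_matrix(R, t):
    """Pack a rotation matrix and translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# lidar -> flight-control (body) extrinsic; the 8 cm vertical offset is an
# assumed calibration value used only for illustration
T_lidar_body = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, -0.08]))

def body_pose_in_global(T_global_lidar):
    """Real-time pose of the UAV under the global map, expressed for the
    flight control coordinate system, from the odometer pose T_global_lidar."""
    return T_global_lidar @ T_lidar_body
```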
In an optional embodiment, the positioning method of the inspection unmanned aerial vehicle further includes:
and inputting the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map as the observed quantity into an extended Kalman filter to predict and update the pose of the unmanned aerial vehicle.
This embodiment feeds the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map into the extended Kalman filter as the observed quantity and predicts and updates the pose of the unmanned aerial vehicle. The prediction and update steps yield more accurate odometer information and improve the positioning accuracy of the unmanned aerial vehicle.
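As a sketch of how the global-map pose can serve as the observed quantity, the minimal filter below runs the predict and update steps on a position-plus-velocity state. A real implementation would be a full extended Kalman filter that also tracks orientation on SO(3) and fuses IMU inputs; the state layout and noise values here are illustrative assumptions.

```python
import numpy as np

class PoseEKF:
    """Minimal Kalman filter using the UAV's global-map position as observation.

    State: [px, py, pz, vx, vy, vz] under a constant-velocity model."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(6)                    # position and velocity
        self.P = np.eye(6)                      # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)         # position integrates velocity
        self.Q = 0.01 * np.eye(6)               # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = 0.05 * np.eye(3)               # observation noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        """z: real-time position of the UAV under the global map (observed)."""
        y = z - self.H @ self.x                 # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```

With a constant observation the estimate settles on that position with near-zero velocity, which is the "more accurate odometer information" the filtering step provides.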
In a second aspect, the present invention provides a positioning device for an inspection unmanned aerial vehicle, the device comprising:
the acquisition module is used for acquiring a real-time unmanned aerial vehicle local map referenced to a preset laser radar coordinate system in the area to be inspected;
the generating module is used for mapping the point cloud data of the area to be inspected based on a mapping algorithm to generate a three-dimensional point cloud global map;
the point cloud registration module is used for performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map;
the updating module is used for updating the odometer information in the real-time unmanned aerial vehicle local map into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map;
the conversion module is used for inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map;
and the positioning module is used for positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
In a third aspect, the invention provides a computer device comprising a memory and a processor in communication with each other; the memory stores computer instructions, and the processor executes them to perform the positioning method of the inspection unmanned aerial vehicle of the first aspect or any of its corresponding implementations.
In a fourth aspect, the present invention provides a computer readable storage medium, on which computer instructions are stored, the computer instructions being configured to cause a computer to perform the positioning method of the inspection unmanned aerial vehicle according to the first aspect or any one of the embodiments corresponding to the first aspect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a positioning method of an inspection unmanned aerial vehicle according to an embodiment of the invention;
Fig. 2 is a flowchart of another positioning method of an inspection unmanned aerial vehicle according to an embodiment of the invention;
Fig. 3 is a flowchart of yet another positioning method of an inspection unmanned aerial vehicle according to an embodiment of the invention;
Fig. 4 is a flowchart of a further positioning method of an inspection unmanned aerial vehicle according to an embodiment of the invention;
Fig. 5 is a simplified schematic of an unmanned aerial vehicle according to an embodiment of the invention;
Fig. 6 is a side view of an unmanned aerial vehicle structure according to an embodiment of the invention;
Fig. 7 is a schematic diagram of the unmanned aerial vehicle flight control coordinate system, the local map coordinate system, and the global map coordinate system according to an embodiment of the invention;
Fig. 8 is a schematic diagram of the conversion relationship among the unmanned aerial vehicle flight control coordinate system, the local map coordinate system, and the global map coordinate system according to an embodiment of the invention;
Fig. 9 is a block diagram of a positioning device of an inspection unmanned aerial vehicle according to an embodiment of the invention;
Fig. 10 is a schematic diagram of the hardware structure of a computer device according to an embodiment of the invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
According to an embodiment of the present invention, an embodiment of a positioning method for an inspection unmanned aerial vehicle is provided. It should be noted that the steps shown in the flowcharts of the drawings may be performed in a computer system, such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in a different order.
In this embodiment, a positioning method of an inspection unmanned aerial vehicle is provided, which may be used in a server. Fig. 1 is a flowchart of the positioning method according to an embodiment of the present invention; as shown in Fig. 1, the flow includes the following steps:
Step S101, acquiring a real-time unmanned aerial vehicle local map referenced to a preset laser radar coordinate system in the area to be inspected.
Specifically, the schematic diagram and the side view of the unmanned aerial vehicle are shown in Fig. 5 and Fig. 6 respectively; in Fig. 6, 1 is the laser radar, 2 is the unmanned aerial vehicle flight controller, and 3 is the unmanned aerial vehicle onboard computer. First, three coordinate systems are constructed, as shown in Fig. 7: the unmanned aerial vehicle flight control coordinate system body (X′-Y′-Z′), the local map coordinate system local_map (X″-Y″-Z″, also referred to as the lidar coordinate system), and the global map coordinate system global_map (X-Y-Z). The body coordinate system is centered on the onboard flight control IMU (Inertial Measurement Unit), the local_map coordinate system is centered on the IMU built into the laser radar, and the global_map coordinate system is the coordinate system of the input target point cloud map.
In this embodiment, the preset laser radar coordinate system refers to the local map coordinate system local_map of the inertial measurement unit (Inertial Measurement Unit, abbreviated IMU) built into the laser radar. The initial position of the unmanned aerial vehicle in the area to be inspected is obtained with the preset laser radar coordinate system as the reference, so as to determine the real-time unmanned aerial vehicle local map.
Step S102, mapping the point cloud data of the area to be inspected based on a mapping algorithm to generate a three-dimensional point cloud global map.
Specifically, the mapping algorithm refers to an algorithm for realizing map construction and positioning through steps of feature point extraction, matching, optimization and the like based on sensor data such as a laser radar, a camera, an inertial measurement unit and the like. And mapping the point cloud data of the area to be inspected by adopting a mapping algorithm, and generating a three-dimensional point cloud global map.
Step S103, performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map.
Specifically, point cloud registration refers to the process of aligning and matching two point clouds acquired in different coordinate systems. It involves estimating the spatial transformation between the two point clouds so that they correspond accurately in space. Registering the point cloud of the real-time unmanned aerial vehicle local map against the point cloud of the three-dimensional point cloud global map yields the pose relation of the local map relative to the global map, represented by the first pose transformation matrix.
Step S104, updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map.
Specifically, the odometer information includes position and direction information of a movement track of the unmanned aerial vehicle, and the odometer information may represent a pose of the unmanned aerial vehicle. And updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map according to the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map, namely outputting the odometer information of the unmanned aerial vehicle in the three-dimensional point cloud global map.
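Concretely, updating the odometer information by the first pose transformation matrix is a left-multiplication of homogeneous transforms: if T1 maps the local map frame into the global map frame and T_local is the odometer pose in the local map, then T1 composed with T_local is the real-time odometer pose in the global map. A minimal numpy sketch, with illustrative variable names and example values:

```python
import numpy as np

def update_odometry_to_global(T1, T_local):
    """Re-express the lidar-inertial odometer pose (local map frame) in the
    three-dimensional point cloud global map via the first pose transform T1."""
    return T1 @ T_local

# example: the odometer reports 1 m forward in the local map, and T1 places the
# local map origin at (10, 5, 0) in the global map (values are illustrative)
T_local = np.eye(4)
T_local[:3, 3] = [1.0, 0.0, 0.0]
T1 = np.eye(4)
T1[:3, 3] = [10.0, 5.0, 0.0]
T_global = update_odometry_to_global(T1, T_local)  # translation -> [11, 5, 0]
```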
Step S105, inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
Specifically, inputting the real-time odometer pose into the preset unmanned aerial vehicle flight control coordinate system converts the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map into the flight control coordinate system; the resulting real-time pose reflects the actual pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
Step S106, positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
Specifically, when the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is obtained, the unmanned aerial vehicle in the area to be inspected can be positioned in real time.
According to this positioning method for the inspection unmanned aerial vehicle, a real-time unmanned aerial vehicle local map referenced to the preset laser radar coordinate system is acquired in the area to be inspected, and the whole area is mapped to generate a high-precision three-dimensional point cloud global map. Point cloud registration between the real-time local map and the global map yields a first pose transformation matrix of the local map relative to the global map, and the accumulated error of the laser-inertial odometer is corrected based on this matrix. The unmanned aerial vehicle thus gains high-precision repeatable positioning within the range of the three-dimensional point cloud global map, solving the problem that the continuously accumulating pose error of the laser-inertial odometer makes long-duration real-time positioning and repeatable positioning difficult.
In this embodiment, a positioning method of a patrol unmanned aerial vehicle is provided, which may be used in a server, and fig. 2 is a flowchart of the positioning method of the patrol unmanned aerial vehicle according to an embodiment of the present invention, as shown in fig. 2, where the flowchart includes the following steps:
Step S201, a real-time unmanned aerial vehicle local map taking a preset laser radar coordinate system as a reference in an area to be inspected is obtained.
Specifically, the step S201 includes:
And step 2011, scanning the initial position of the unmanned aerial vehicle in the area to be inspected by adopting a laser radar to obtain the current odometer information of the unmanned aerial vehicle.
Specifically, the initial position of the unmanned aerial vehicle in the area to be inspected is provided by a laser inertial odometer (Lidar Inertial Odometry, LIO), and the data content of the initial position is the three-axis position of the unmanned aerial vehicle and its attitude quaternion (Px, Py, Pz, qx, qy, qz, qw). Px, Py and Pz are the positions of the unmanned aerial vehicle on the x axis, y axis and z axis respectively, and qx, qy, qz and qw form the attitude quaternion of the unmanned aerial vehicle (a quaternion is a number system extending the complex numbers, composed of one real part and three imaginary parts). The laser inertial odometer fuses a laser radar with an inertial measurement unit (IMU), senses the surrounding environment and the pose change of the unmanned aerial vehicle body in real time, and analyzes the data with corresponding algorithms; for these algorithms, reference may be made to the related art, which is not described in detail herein. For example, after IMU and laser radar data preprocessing, initial alignment, state estimation, filtering optimization and other processes, the initial attitude variables of the unmanned aerial vehicle are output.
Start the laser inertial odometer in the area to be inspected, scan the initial position of the unmanned aerial vehicle in the area to be inspected with the laser radar of the laser inertial odometer, and output the current odometer information of the unmanned aerial vehicle with the laser radar coordinate system (local_map, namely the local map coordinate system) as the reference, wherein the initial point of the odometer information is P0 = (0, 0, 0, 0, 0, 0, 1).
Step S2012, a local map of the real-time unmanned aerial vehicle is generated by taking the initial point of the current odometer information as the origin of a preset laser radar coordinate system.
Specifically, the initial point P0 = (0, 0, 0, 0, 0, 0, 1) of the current odometer information of the unmanned aerial vehicle is taken as the origin of the laser radar coordinate system, and a real-time unmanned aerial vehicle local map is output. The real-time unmanned aerial vehicle local map is composed of point cloud data (PointCloud2) with the fields x, y, z and intensity; it is a local map Pl (local_map) based on the odometer, and its output frequency is usually 10 Hz.
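As an illustrative sketch (not part of the patent), the x, y, z, intensity point layout described above can be held in a NumPy structured array; the field names match the message fields, while the sample values are made up for demonstration:

```python
import numpy as np

# One record per laser return, mirroring the "x, y, z, intensity" fields
# of the PointCloud2 message (sample values are made up).
point_dtype = np.dtype([
    ("x", np.float32), ("y", np.float32),
    ("z", np.float32), ("intensity", np.float32),
])

cloud = np.zeros(3, dtype=point_dtype)
cloud["x"] = [0.0, 1.0, 2.0]
cloud["y"] = [0.0, 0.5, 1.0]
cloud["z"] = [0.1, 0.1, 0.2]
cloud["intensity"] = [10.0, 20.0, 30.0]

# Registration algorithms typically consume only the N x 3 geometry.
xyz = np.stack([cloud["x"], cloud["y"], cloud["z"]], axis=1)
```
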
Step S202, mapping is carried out on point cloud data of the area to be inspected based on a mapping algorithm, and a three-dimensional point cloud global map is generated.
Specifically, the step S202 includes:
in step S2021, the laser radar is used to scan to obtain the point cloud data of the area to be inspected.
Specifically, the laser radar scans the point cloud of the area to be inspected at a scanning frequency of 10Hz to 100Hz and stores the point cloud data.
Step S2022, mapping the point cloud data based on the mapping algorithm, and generating a three-dimensional point cloud global map.
Specifically, mapping can be performed in real time or by off-line post-processing. After mapping is completed, the map can be output in the form of a Robot Operating System (ROS) topic, that is, the three-dimensional point cloud global map Pg (global_map) is output as map data in the point cloud (PointCloud2) format, whose fields are generally x, y, z and intensity.
And step S203, performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map. Please refer to step S103 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S204, updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map. Please refer to step S104 in the embodiment shown in fig. 1 in detail, which is not described herein.
Step S205, inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map. Please refer to step S105 in the embodiment shown in fig. 1 in detail, which is not described herein.
And S206, positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map. Please refer to step S106 in the embodiment shown in fig. 1 in detail, which is not described herein.
According to the positioning method for the inspection unmanned aerial vehicle provided by this embodiment, the laser radar scans the initial position of the unmanned aerial vehicle in the area to be inspected to obtain the current odometer information of the unmanned aerial vehicle, and the initial point of the current odometer information is taken as the origin of the preset laser radar coordinate system to generate the real-time unmanned aerial vehicle local map. This describes the pose state of the unmanned aerial vehicle in the preset laser radar coordinate system, achieves the purpose of generating a real-time unmanned aerial vehicle local map, and provides the conditions for point cloud registration of the local map and the global map. The point cloud data are then mapped based on a mapping algorithm to generate a three-dimensional point cloud global map, thereby realizing the purpose of generating a high-precision three-dimensional point cloud global map by mapping the point cloud data of the area to be inspected as a whole.
In this embodiment, a positioning method of a patrol unmanned aerial vehicle is provided, which may be used in a server, and fig. 3 is a flowchart of the positioning method of the patrol unmanned aerial vehicle according to an embodiment of the present invention, as shown in fig. 3, where the flowchart includes the following steps:
step S301, a real-time unmanned aerial vehicle local map taking a preset laser radar coordinate system as a reference in a region to be patrolled and examined is obtained. Please refer to step S201 in the embodiment shown in fig. 2 in detail, which is not described herein.
Step S302, mapping is carried out on the point cloud data of the area to be patrolled and examined based on a mapping algorithm, and a three-dimensional point cloud global map is generated. Please refer to step S202 in the embodiment shown in fig. 2, which is not described herein.
Step S303, performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map.
Specifically, the step S303 includes:
step S3031, coarse registration is carried out on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map, and an initial pose transformation matrix is obtained.
In some optional embodiments, step S3031 includes:
And a step a1, calculating a point cloud overlapping area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map.
Specifically, count the point cloud data of the real-time unmanned aerial vehicle local map and the point cloud data of the three-dimensional point cloud global map, take the total number of points as the number of target points, determine the correspondences between the point cloud data of the real-time unmanned aerial vehicle local map and the point cloud data of the three-dimensional point cloud global map, and divide the number of correspondences by the total number of points to obtain the point cloud overlapping ratio fitness. The calculation formulas of the point cloud overlapping ratio fitness and of the point cloud correspondence error are as follows:

fitness = (number of point cloud data correspondences) / (number of target points)

E(R, t) = (1/NL) Σi ‖li − (R·gi + t)‖²  (1)

wherein li ∈ L = {l1, l2, ..., ln} are the points of the point cloud Pl (local_map) of the real-time unmanned aerial vehicle local map, gi ∈ G = {g1, g2, ..., gn} are the points of the point cloud Pg (global_map) of the three-dimensional point cloud global map, NL is the total number of points of the local map point cloud L, i is an index variable with initial value 1, R represents a rotation matrix, and t represents a translation matrix.
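As a minimal sketch of the fitness computation (not the patent's implementation; the correspondence rule used here, nearest neighbor within an assumed distance threshold `max_corr_dist`, is our assumption):

```python
import numpy as np
from scipy.spatial import cKDTree

def overlap_fitness(local_pts, global_pts, max_corr_dist=0.5):
    """fitness = (number of corresponded local points) / (total local points).

    A local point is counted as corresponded when its nearest neighbor in
    the global map lies within max_corr_dist (an assumed threshold).
    """
    tree = cKDTree(global_pts)
    dists, _ = tree.query(local_pts, k=1)
    return np.count_nonzero(dists <= max_corr_dist) / len(local_pts)
```

With fitness >= 0.95 the coarse initialization is accepted, per the threshold used in this embodiment.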
And a step a2 of judging whether the point cloud overlapping area meets a preset threshold value, and when the point cloud overlapping area meets the preset threshold value, obtaining an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map by adopting a manual estimation rough registration mode.
Specifically, when the mapping algorithm is started, the origin of the built-in IMU coordinate system of the laser radar serves as the origin of the coordinate system of the three-dimensional point cloud global map. During positioning, the position of the unmanned aerial vehicle in the real environment is kept consistent with the origin of the coordinate system of the three-dimensional point cloud global map, so that a relatively accurate initial pose transformation matrix T0 = [R0, t0; 0, 1] is provided for local map matching, where R0 represents the initial rotation matrix and t0 represents the initial translation matrix.
The manual estimation can be provided by rough manual estimation in RViz (Robot Visualization, the three-dimensional visualization tool of the ROS robot operating system), and the data of the initial pose transformation matrix are the three-axis position (px, py, pz) of the unmanned aerial vehicle and its Euler angles (roll, pitch, yaw). In this embodiment, the initialization pose of the real-time unmanned aerial vehicle local map in the three-dimensional point cloud global map is provided by manual estimation, and the initialization succeeds when the pose estimation accuracy meets the preset threshold interval.
During initialization, the pose estimation accuracy is quantified by the value of the point cloud overlapping ratio fitness, and the preset threshold value is set to 0.95. When fitness >= 0.95, the pose initialization succeeds and the initial pose transformation matrix T0 = [R0, t0; 0, 1] is obtained, with translation matrix t0 = [px0, py0, pz0]ᵀ. The rotation matrix R0 is derived from the Euler angle to rotation matrix transformation formula (2):

R0 = Rz(γ)·Ry(β)·Rx(α) =
⎡ cosγ·cosβ   cosγ·sinβ·sinα − sinγ·cosα   cosγ·sinβ·cosα + sinγ·sinα ⎤
⎢ sinγ·cosβ   sinγ·sinβ·sinα + cosγ·cosα   sinγ·sinβ·cosα − cosγ·sinα ⎥
⎣ −sinβ       cosβ·sinα                    cosβ·cosα                  ⎦  (2)

where α, β and γ correspond to the roll, pitch and yaw angles of the Euler angles, respectively.
Through the Euler angle to rotation matrix transformation formula, the pose relation of the real-time unmanned aerial vehicle local map (local_map) relative to the three-dimensional point cloud global map (global_map) is converted into the initial transformation matrix T0.
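The Euler angle to rotation matrix conversion can be sketched as follows (a minimal sketch assuming the Z-Y-X convention implied by formula (2)):

```python
import numpy as np

def euler_to_rotation(roll, pitch, yaw):
    """R0 = Rz(yaw) @ Ry(pitch) @ Rx(roll), matching formula (2)."""
    ca, sa = np.cos(roll), np.sin(roll)    # alpha (roll)
    cb, sb = np.cos(pitch), np.sin(pitch)  # beta (pitch)
    cg, sg = np.cos(yaw), np.sin(yaw)      # gamma (yaw)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx
```

The returned matrix is orthogonal by construction, as required for a rotation matrix.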
According to the positioning method of the inspection unmanned aerial vehicle, the point cloud overlapping ratio between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map is calculated and checked against a preset threshold. When the overlapping ratio meets the preset threshold, an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map is obtained by manually estimated coarse registration. This provides a relatively accurate initial pose transformation matrix for the three-dimensional point cloud global map and a good initial value for the iterative computation, improving the calculation accuracy of the first pose transformation matrix.
Step S3032, iterative computation is carried out on the initial pose transformation matrix, and when the iteration termination condition is met, an optimal rotation matrix and an optimal translation matrix corresponding to the initial pose transformation matrix are obtained.
Specifically, after the coarse registration is completed, three-dimensional point cloud global map matching is performed with the Iterative Closest Point (ICP) point cloud registration algorithm to obtain a relatively accurate transformation matrix T1(R1, t1); iterative computation is then performed on T1(R1, t1) to solve for the optimal rotation matrix Rk and the optimal translation matrix tk.
The solution process of the optimal rotation matrix Rk and the optimal translation matrix tk is as follows:
Firstly, calculate the centroids of the two point clouds of the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map:

μl = (1/Nl) Σi li,  μg = (1/Ng) Σi gi

where μl is the point cloud centroid of the real-time unmanned aerial vehicle local map, μg is the point cloud centroid of the three-dimensional point cloud global map, Nl and Ng are the total numbers of points of the two point clouds respectively, and li and gi are the point cloud data of the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map respectively.
Subtract the centroid from the original point clouds:

li′ = li − μl,  gi′ = gi − μg

where li′ represents the local map point cloud after removal of the centroid and gi′ represents the global map point cloud after removal of the centroid.
Transform equation (1) into a form that removes the centroids:

E(R, t) = (1/Nl) Σi ‖(li′ − R·gi′) + (μl − R·μg − t)‖²

Let μl − R·μg − t = 0, so that t = μl − R·μg; then:

E(R, t) = (1/Nl) Σi ‖li′ − R·gi′‖²

Because the rotation matrix R is an orthogonal matrix, i.e. RᵀR = I, expanding the norm and removing the terms that do not depend on R gives:

E(R) = Σi (‖li′‖² + ‖gi′‖²) − 2 Σi li′ᵀ·R·gi′

Minimizing E(R, t) is therefore equivalent to maximizing:

Σi li′ᵀ·R·gi′

Represented by the trace of the matrix, with H = Σi gi′·li′ᵀ:

Σi li′ᵀ·R·gi′ = tr(R·H)

Perform SVD decomposition (Singular Value Decomposition) on H:

H = UΔVᵀ (17);

Set a variable X:

X = VUᵀ (18);

XH = VUᵀUΔVᵀ = VΔVᵀ (19);

Since VΔVᵀ is symmetric positive semi-definite, tr(XH) is the maximum of tr(R·H) over orthogonal matrices R, and therefore:

R = X = VUᵀ (20);

t = μl − R·μg (21);
Iterative computation is performed with formulas (20) and (21): after each iteration, the current optimal rotation matrix Rk and optimal translation matrix tk are obtained, the transformation is applied to the current source point cloud, and the optimal rotation matrix R and optimal translation matrix t are solved again, iterating until the termination conditions "fitness >= 0.95" and "maximum number of iterations 20" are met.
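The closed-form step given by equations (17) to (21) can be sketched as follows (a minimal sketch assuming correspondences are already paired by index; the nearest-neighbor search and the fitness-based termination loop of full ICP are omitted, and the reflection guard is a standard safeguard not stated in the patent):

```python
import numpy as np

def best_rigid_transform(local_pts, global_pts):
    """One ICP update: the R, t minimizing sum ||l_i - (R g_i + t)||^2
    over index-paired correspondences, solved in closed form via SVD."""
    mu_l = local_pts.mean(axis=0)          # centroid of local map points
    mu_g = global_pts.mean(axis=0)         # centroid of global map points
    Lc = local_pts - mu_l                  # l_i' = l_i - mu_l
    Gc = global_pts - mu_g                 # g_i' = g_i - mu_g
    H = Gc.T @ Lc                          # H = sum g_i' l_i'^T
    U, _, Vt = np.linalg.svd(H)            # H = U Delta V^T  (eq. 17)
    R = Vt.T @ U.T                         # R = V U^T        (eq. 20)
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt_fixed = Vt.copy()
        Vt_fixed[-1] *= -1
        R = Vt_fixed.T @ U.T
    t = mu_l - R @ mu_g                    # t = mu_l - R mu_g (eq. 21)
    return R, t
```
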
Step S3033, a first pose transformation matrix is obtained based on the optimal rotation matrix and the optimal translation matrix.
Specifically, the optimal rotation matrix Rk and the optimal translation matrix tk are combined to obtain the first pose transformation matrix T = [Rk, tk; 0, 1].
According to the positioning method of the inspection unmanned aerial vehicle, coarse registration between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map yields an initial pose transformation matrix; iterative computation on the initial pose transformation matrix yields, once the iteration termination conditions are met, the corresponding optimal rotation matrix and optimal translation matrix, from which the first pose transformation matrix is obtained. Point cloud registration between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map is thus performed by iterative computation, and the first pose transformation matrix relative to the three-dimensional point cloud global map is output in real time, providing the conditions for subsequently correcting the accumulated error of the laser odometer.
And step S304, updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map.
Specifically, as shown in fig. 8, the coordinate system of the real-time unmanned aerial vehicle local map is first updated to the coordinate system of the three-dimensional point cloud global map, with the formula:

F_global = T·F_local_map

Secondly, the odometer information of the real-time unmanned aerial vehicle under the local map coordinate system is updated to the coordinate system of the three-dimensional point cloud global map, with the formula:

O_global = T·O_local

where O_global is the real-time odometer pose of the unmanned aerial vehicle on the three-dimensional point cloud global map, T is the first pose transformation matrix, F_local_map is the real-time unmanned aerial vehicle local map, F_global is the local map expressed in the global map coordinate system, and O_local is the odometer information in the real-time unmanned aerial vehicle local map.
And finally, obtaining the real-time odometer pose of the unmanned aerial vehicle on the three-dimensional point cloud global map.
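The two update formulas above amount to multiplying by a 4x4 homogeneous transform; a minimal sketch follows, where all numeric values are made up purely for illustration:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# First pose transformation matrix T (local_map -> global_map); the pure
# translation used here is a made-up value for illustration only.
T_local_to_global = make_T(np.eye(3), np.array([1.0, 2.0, 0.0]))

# Odometer pose O_local of the UAV in the local map frame (also made up).
O_local = make_T(np.eye(3), np.array([0.5, 0.0, 0.0]))

# O_global = T * O_local, the odometer update formula above.
O_global = T_local_to_global @ O_local
```
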
And step S305, inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
Specifically, the step S305 includes:
Step S3051, inputting the pose of the odometer into a preset unmanned aerial vehicle flight control coordinate system to obtain a second pose transformation matrix.
Specifically, after the first pose transformation matrix T of the laser radar under the global map coordinate system is obtained, the pose conversion is further optimized into the unmanned aerial vehicle flight control coordinate system (body) to obtain the second pose transformation matrix Tbody_to_local.
And step S3052, obtaining the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map based on the second pose transformation matrix and the odometer pose.
Specifically, the unmanned aerial vehicle flight control is deployed at the central position of the unmanned aerial vehicle body, so its attitude and position reflect the actual pose of the unmanned aerial vehicle under the three-dimensional point cloud global map more truthfully. Since the relative pose between the laser radar and the unmanned aerial vehicle flight control is fixed by the hardware installation, the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is obtained based on the second pose transformation matrix Tbody_to_local, with the formula:
Obody = Tbody_to_local·Oglobal (24);

where Obody is the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
According to the positioning method of the inspection unmanned aerial vehicle, the odometer pose is input into the preset unmanned aerial vehicle flight control coordinate system to obtain the second pose transformation matrix, the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is obtained based on the second pose transformation matrix and the odometer pose, the purpose of updating the odometer information in real time is achieved, and the unmanned aerial vehicle has high-precision repeated positioning capability in the three-dimensional point cloud global map range.
And step S306, positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map. Please refer to step S206 in the embodiment shown in fig. 2, which is not described herein.
Step S307, the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is used as an observed quantity to be input into the extended Kalman filtering, and the pose of the unmanned aerial vehicle is predicted and updated.
Specifically, after the positioning information based on global map matching is updated into the unmanned aerial vehicle flight control coordinate system (body), the position information is added into an Extended Kalman Filter (EKF) as an observed quantity, and further state estimation and update are performed on the pose of the unmanned aerial vehicle. The relevant state quantity is:

x24×1 = [q, v, p, Δθb, Δvb, mNED, mb, vwind]ᵀ

where x24×1 is the state quantity, q is the rotation quaternion from the NED coordinate system (the North-East-Down world coordinate system) to the unmanned aerial vehicle body coordinate system, v is the body velocity in the NED frame, p is the body position in the three-dimensional point cloud global map coordinate system (global_map), Δθb and Δvb are the gyroscope and accelerometer bias states respectively, mNED is the magnetic field vector of the earth in the NED coordinate system, mb is the magnetic field vector in the body coordinate system, and vwind is the wind speed in the north and east directions.

Through the prediction and update of the extended Kalman filter EKF, more accurate odometer data Obody are obtained.
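As a hedged sketch of how the map-matched position enters the filter as an observation, the snippet below implements only a linear Kalman measurement update on a 3-state position block, not the full 24-state EKF of this embodiment; the state layout and noise values are assumptions:

```python
import numpy as np

def position_update(x, P, z, R_meas):
    """Kalman measurement update: fuse a map-matched position z (H = I)
    into position state x with covariance P and measurement noise R_meas."""
    H = np.eye(3)
    S = H @ P @ H.T + R_meas           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ (z - H @ x)        # corrected state
    P_new = (np.eye(3) - K @ H) @ P    # corrected covariance
    return x_new, P_new
```

When the prior is very uncertain relative to the measurement, the update pulls the state almost entirely to the observed position.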
According to the positioning method for the inspection unmanned aerial vehicle provided by this embodiment, the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is input into the extended Kalman filter as an observed quantity, and the pose of the unmanned aerial vehicle is predicted and updated; more accurate odometer information is thereby obtained, and the positioning accuracy of the unmanned aerial vehicle is improved.
As a specific application example of the embodiments of the present invention, the positioning method of the inspection unmanned aerial vehicle of this embodiment is further described with reference to fig. 4, as follows:
The flow of the positioning method of the inspection unmanned aerial vehicle is shown in fig. 4, and the method comprises the following steps:
1. constructing a coordinate system:
Three main coordinate systems are defined, as shown in fig. 7: the unmanned aerial vehicle flight control coordinate system body, the local map coordinate system local_map (also referred to as the laser radar coordinate system), and the global map coordinate system global_map. The flight control coordinate system body is centered on the onboard flight control IMU (Inertial Measurement Unit); the local map coordinate system local_map is centered on the built-in IMU of the laser radar; and the global map coordinate system global_map is the coordinate system of the input target point cloud map.
2. Acquiring an initial position of the unmanned aerial vehicle:
The initial position of the unmanned aerial vehicle is provided by the laser inertial odometer (Lidar Inertial Odometry, LIO); the data content is the three-axis position of the unmanned aerial vehicle and its attitude quaternion (Px, Py, Pz, qx, qy, qz, qw). The laser inertial odometer fuses the laser radar with the IMU, senses the changes of the surrounding environment and the body pose in real time, and analyzes them with the corresponding algorithms. The LIO node is started in the area to be inspected, and the unmanned aerial vehicle outputs odometer information with the laser radar coordinate system (local_map, namely the local map coordinate system) as the reference, wherein the initial point of the odometer information is P0 = (0, 0, 0, 0, 0, 0, 1).
The initial point P0 = (0, 0, 0, 0, 0, 0, 1) of the current odometer information of the unmanned aerial vehicle is taken as the origin of the laser radar coordinate system, and a real-time unmanned aerial vehicle local map is output. The real-time unmanned aerial vehicle local map consists of point cloud data (PointCloud2) with the fields x, y, z and intensity; it is a local map Pl (local_map) based on the odometer, and its output frequency is 10 Hz. The real-time unmanned aerial vehicle local map data are used when point cloud registration with the three-dimensional point cloud global map is performed.
3. Loading a global map:
The laser radar performs point cloud scanning on the area to be inspected at a scanning frequency of 10 Hz to 100 Hz and stores the point cloud data; the point cloud data of the area to be inspected are mapped based on a laser SLAM (Simultaneous Localization and Mapping) mapping algorithm to generate an accurate three-dimensional point cloud global map Pg (global_map). After mapping is finished, the map is output in the form of a Robot Operating System (ROS) topic, that is, the three-dimensional point cloud global map Pg (global_map) is output as map data in the point cloud (PointCloud2) format, whose fields are generally x, y, z and intensity.
4. Pose initialization:
The initialized pose is the relatively accurate actual pose of the origin of the local map coordinate system (local_map) of the real-time unmanned aerial vehicle in the three-dimensional point cloud global map (global_map). In this embodiment, when the mapping algorithm is started, the origin of the coordinate system of the built-in IMU of the laser radar is the origin of the coordinate system of the three-dimensional point cloud global map (global_map); during positioning, the position of the unmanned aerial vehicle in the real environment is kept as consistent as possible with the origin of the coordinate system of the global map, so that a relatively accurate initial pose transformation matrix T0 is provided for local map matching.
The manual estimation can be provided by rough manual estimation in RViz (Robot Visualization, the three-dimensional visualization tool of the ROS robot operating system); the data are the three-axis position (px, py, pz) of the unmanned aerial vehicle and its Euler angles (roll, pitch, yaw). In this embodiment, the initialization pose of the real-time unmanned aerial vehicle local map in the three-dimensional point cloud global map is provided by manual estimation, and the initialization succeeds when the pose estimation accuracy meets the preset threshold interval. The pose estimation accuracy is quantified by the value of the point cloud overlapping ratio fitness, the preset threshold value is set to 0.95, and when fitness >= 0.95, the pose initialization succeeds and the initial pose transformation matrix T0 = [R0, t0; 0, 1] is obtained, with translation matrix t0 = [px0, py0, pz0]ᵀ; the rotation matrix R0 is derived from the Euler angle to rotation matrix transformation formula (2).
Through the Euler angle to rotation matrix transformation formula, the pose relation of the real-time unmanned aerial vehicle local map (local_map) relative to the three-dimensional point cloud global map (global_map) is converted into the initial transformation matrix T0. T0 is substituted into formula (1) to calculate E(R, t); when E(R, t) is within the allowable range, the initialization is successful, and the process enters the global map matching stage.
5. Global map matching:
The key point of global map matching is point cloud registration, that is, registering the real-time point cloud data Pl (local_map) obtained by scanning at 10 Hz to 100 Hz under the laser radar coordinate system against the accurate three-dimensional point cloud global map data Pg (global_map), and outputting the rotation matrix R and translation matrix t that transform Pl to Pg. After the coarse registration in step 4, three-dimensional point cloud global map matching is performed with the ICP point cloud registration algorithm (Iterative Closest Point) to obtain a relatively accurate transformation matrix T1(R1, t1); iterative computation is performed on T1(R1, t1) until the iteration termination conditions "fitness >= 0.95" and "maximum number of iterations 20" are met, and the optimal rotation matrix Rk and optimal translation matrix tk are solved.
6. Updating the odometer:
Firstly, the coordinate system of the real-time unmanned aerial vehicle local map is updated to the coordinate system of the three-dimensional point cloud global map, with the formula:

F_global = T·F_local_map

Secondly, the odometer information of the real-time unmanned aerial vehicle under the local map coordinate system is updated to the coordinate system of the three-dimensional point cloud global map, with the formula:

O_global = T·O_local

where O_global is the real-time odometer pose of the unmanned aerial vehicle on the three-dimensional point cloud global map, T is the first pose transformation matrix, F_local_map is the real-time unmanned aerial vehicle local map, F_global is the local map expressed in the global map coordinate system, and O_local is the odometer information in the real-time unmanned aerial vehicle local map.
And finally, obtaining the real-time odometer pose of the unmanned aerial vehicle on the three-dimensional point cloud global map.
7. Inputting unmanned aerial vehicle flight control:
After the first pose transformation matrix T of the laser radar under the global map coordinate system is obtained, the pose conversion is further optimized into the unmanned aerial vehicle flight control coordinate system (body) to obtain the second pose transformation matrix Tbody_to_local. The unmanned aerial vehicle flight control is deployed at the central position of the unmanned aerial vehicle body, so its attitude and position reflect the actual pose of the unmanned aerial vehicle under the three-dimensional point cloud global map more truthfully. Since the relative pose between the laser radar and the unmanned aerial vehicle flight control is fixed by the hardware installation, the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map is obtained based on the second pose transformation matrix Tbody_to_local, with the formula:
Obody=Tbody_to_local*Oglobal (24);
where Obody is the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
According to the positioning method of the inspection unmanned aerial vehicle, the area to be inspected is mapped as a whole to generate a high-precision global point cloud map; point cloud registration between the local map and the global map is performed based on the map using the Iterative Closest Point (ICP) algorithm, the pose of the laser inertial odometer relative to the global map is updated, and high-precision positioning and repeated positioning of the unmanned aerial vehicle within the global map are realized.
This embodiment also provides a positioning device of the inspection unmanned aerial vehicle, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
The embodiment provides a positioning device of a patrol unmanned aerial vehicle, as shown in fig. 9, including:
The acquisition module 901 is configured to acquire a real-time unmanned aerial vehicle local map in the area to be inspected with a preset laser radar coordinate system as a reference.
The generating module 902 is configured to map the point cloud data of the area to be inspected based on a mapping algorithm, and generate a three-dimensional point cloud global map.
The point cloud registration module 903 is configured to perform point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map, so as to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map;
the updating module 904 is configured to update odometer information in the local map of the real-time unmanned aerial vehicle to the three-dimensional point cloud global map based on the first pose transformation matrix, so as to obtain a real-time odometer pose of the unmanned aerial vehicle on the three-dimensional point cloud global map.
The conversion module 905 is configured to input the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system, so as to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
And the positioning module 906 is used for positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map.
In some alternative embodiments, the acquiring module 901 includes:
the first scanning unit is used for scanning the initial position of the unmanned aerial vehicle in the area to be inspected by using the laser radar to obtain the current odometer information of the unmanned aerial vehicle.
The first generation unit is used for generating a real-time unmanned aerial vehicle local map by taking the current odometer information initial point as a preset laser radar coordinate system origin.
In some alternative embodiments, the generating module 902 includes:
And the second scanning unit is used for scanning by adopting a laser radar to obtain point cloud data of the area to be inspected.
And the second generation unit is used for mapping the point cloud data based on a mapping algorithm and generating a three-dimensional point cloud global map.
In some alternative embodiments, the point cloud registration module 903 includes:
And the registration unit is used for performing rough registration on the local map of the real-time unmanned aerial vehicle and the three-dimensional point cloud global map to obtain an initial pose transformation matrix.
And the iteration unit is used for carrying out iterative computation on the initial pose transformation matrix, and obtaining an optimal rotation matrix and an optimal translation matrix corresponding to the initial pose transformation matrix when the iteration termination condition is met.
And the calculating unit is used for calculating to obtain a first pose transformation matrix based on the optimal rotation matrix and the optimal translation matrix.
In some alternative embodiments, the registration unit comprises:
and the calculating subunit is used for calculating the point cloud overlapping area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map.
And the registration subunit is used for judging whether the point cloud overlapping area meets a preset threshold value, and when the point cloud overlapping area meets the preset threshold value, acquiring an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map by adopting a manual estimation rough registration mode.
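The overlap check performed by these two subunits can be sketched as follows; the correspondence radius and the acceptance threshold are assumed values, since the text leaves the preset threshold unspecified:

```python
import numpy as np

def overlap_fitness(local_pts, global_pts, max_dist=0.2):
    """fitness = number of correspondences within max_dist / number of local points.

    Mirrors the ratio `correspondences / target points` used in the text;
    max_dist is an assumed correspondence radius.
    """
    d2 = ((local_pts[:, None, :] - global_pts[None, :, :]) ** 2).sum(-1)
    matched = (d2.min(axis=1) <= max_dist ** 2).sum()
    return matched / len(local_pts)

# Assumed acceptance threshold before proceeding to manual coarse registration.
THRESHOLD = 0.5
local_pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 5.0, 5.0]])
global_pts = np.array([[0.05, 0.0, 0.0], [1.1, 0.0, 0.0], [2.0, 2.0, 2.0]])
fit = overlap_fitness(local_pts, global_pts)
print(fit, fit >= THRESHOLD)   # → 0.6666666666666666 True
```

Two of the three local points find a global neighbour within the radius, so the overlap passes the assumed threshold and coarse registration would proceed.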
In some alternative embodiments, the conversion module 905 includes:
And the input unit is used for inputting the pose of the odometer to a preset unmanned aerial vehicle flight control coordinate system to obtain a second pose transformation matrix.
And the computing unit is used for obtaining the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map based on the second pose transformation matrix and the odometer pose.
In some optional embodiments, the positioning device of the inspection unmanned aerial vehicle further includes:
And the prediction updating module is used for inputting the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map as observed quantity into the extended Kalman filtering, and predicting and updating the pose of the unmanned aerial vehicle.
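As a hedged sketch of this fusion step, the snippet below uses a plain linear constant-velocity Kalman filter with the map-frame position as the observation, rather than a full extended Kalman filter over the complete pose; the filter period and noise covariances are assumed values:

```python
import numpy as np

dt = 0.1                      # assumed filter period, seconds
F = np.eye(6)                 # constant-velocity motion model: [pos, vel]
F[:3, 3:] = dt * np.eye(3)
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # observe position only
Q = 1e-3 * np.eye(6)          # assumed process-noise covariance
R = 1e-2 * np.eye(3)          # assumed measurement-noise covariance

def predict(x, P):
    """Propagate state and covariance through the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Fuse the map-frame position z (from the registered pose) as observation."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

x, P = np.zeros(6), np.eye(6)
x, P = predict(x, P)
x, P = update(x, P, np.array([5.0, 2.0, 1.5]))  # position from global-map pose
print(np.round(x[:3], 2))     # → [4.95 1.98 1.49]
```

In a full EKF the attitude would be handled with a nonlinear measurement model and Jacobians; the structure of the predict/update cycle, however, is the same.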
Further functional descriptions of the above respective modules and units are the same as those of the above corresponding embodiments, and are not repeated here.
The positioning device of the inspection unmanned aerial vehicle in this embodiment is presented in the form of functional units, where a unit refers to an ASIC (Application Specific Integrated Circuit), a processor and a memory executing one or more software or firmware programs, and/or other devices that can provide the above functions.
The embodiment of the invention also provides a computer device, which is provided with the positioning device of the inspection unmanned aerial vehicle shown in fig. 9.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a computer device according to an alternative embodiment of the present invention. As shown in fig. 10, the computer device includes one or more processors 10, a memory 20, and interfaces for connecting the components, including a high-speed interface and a low-speed interface. The various components are communicatively coupled to each other using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the computer device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In some alternative embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple computer devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 10 is illustrated in fig. 10.
The processor 10 may be a central processor, a network processor, or a combination thereof. The processor 10 may further include a hardware chip, among others. The hardware chip may be an application specific integrated circuit, a programmable logic device, or a combination thereof. The programmable logic device may be a complex programmable logic device, a field programmable gate array, a general-purpose array logic, or any combination thereof.
Wherein the memory 20 stores instructions executable by the at least one processor 10 to cause the at least one processor 10 to perform the methods shown in implementing the above embodiments.
The memory 20 may include a storage program area that may store an operating system, application programs required for at least one function, and a storage data area that may store data created according to the use of the computer device, etc. In addition, the memory 20 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some alternative embodiments, memory 20 may optionally include memory located remotely from processor 10, which may be connected to the computer device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The memory 20 may comprise volatile memory, such as random access memory, or nonvolatile memory, such as flash memory, hard disk or solid state disk, or the memory 20 may comprise a combination of the above types of memory.
The computer device further comprises input means 30 and output means 40. The processor 10, memory 20, input device 30, and output device 40 may be connected by a bus or other means, for example in fig. 10.
The input device 30 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the computer device, and may be, for example, a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointing stick, one or more mouse buttons, a trackball, a joystick, and the like. The output device 40 may include a display device, auxiliary lighting means (e.g., LEDs), tactile feedback means (e.g., vibration motors), and the like. Such display devices include, but are not limited to, liquid crystal displays, light emitting diode displays, and plasma displays. In some alternative implementations, the display device may be a touch screen.
The embodiments of the present invention also provide a computer-readable storage medium. The method according to the embodiments of the present invention described above may be implemented in hardware or firmware, or as computer code which may be recorded on a storage medium, or as computer code originally stored on a remote storage medium or a non-transitory machine-readable storage medium and downloaded through a network to be stored in a local storage medium, so that the method described herein may be processed by software stored on a storage medium using a general-purpose computer, a special-purpose processor, or programmable or special-purpose hardware. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random-access memory, a flash memory, a hard disk, a solid state disk, or the like; further, the storage medium may also include a combination of the above types of memory. It will be appreciated that the computer, processor, microprocessor controller or programmable hardware includes a storage element that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the methods illustrated by the above embodiments.
Although embodiments of the present invention have been described in connection with the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope of the invention as defined by the appended claims.

Claims (8)

1. A method for locating a patrol unmanned aerial vehicle, the method comprising:
acquiring a real-time unmanned aerial vehicle local map taking a preset laser radar coordinate system as a reference in an area to be patrolled and examined;
Mapping the point cloud data of the area to be inspected based on a mapping algorithm to generate a three-dimensional point cloud global map;
performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map;
Updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map;
inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under a three-dimensional point cloud global map;
positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map;
Performing point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map, wherein the obtaining comprises the following steps:
performing rough registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain an initial pose transformation matrix;
performing iterative computation on the initial pose transformation matrix, and obtaining an optimal rotation matrix and an optimal translation matrix corresponding to the initial pose transformation matrix when the iteration termination condition is met;
calculating to obtain a first pose transformation matrix based on the optimal rotation matrix and the optimal translation matrix;
performing coarse registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain an initial pose transformation matrix, wherein the method comprises the following steps of:
And calculating a point cloud overlapping area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map, wherein the calculation formula of the point cloud overlapping area and the point cloud correspondence is as follows:
fitness = number of point cloud data correspondences / number of target points;
wherein l_i ∈ L = {l_1, l_2, ..., l_n} denotes the points of the point cloud P_l (local_map) in the real-time unmanned aerial vehicle local map, g_i ∈ G = {g_1, g_2, ..., g_n} denotes the points of the point cloud P_g (global_map) in the three-dimensional point cloud global map, N_L is the total number of points of the local map point cloud L, i is an index variable whose initial value is 1, R represents a rotation matrix, and t represents a translation matrix;
Judging whether the point cloud overlapping area meets a preset threshold value, and when the point cloud overlapping area meets the preset threshold value, obtaining an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map by means of manually estimated rough registration, wherein the initial pose transformation matrix is T_0 = [R_0 t_0; 0 1], where R_0 represents the initial rotation matrix and t_0 represents the initial translation matrix.
2. The method according to claim 1, wherein the acquiring the real-time unmanned aerial vehicle local map based on the preset laser radar coordinate system in the area to be patrolled comprises:
scanning an initial position of the unmanned aerial vehicle in the area to be inspected by using a laser radar to obtain current odometer information of the unmanned aerial vehicle;
and taking the initial point of the current odometer information as the origin of a preset laser radar coordinate system to generate a real-time unmanned aerial vehicle local map.
3. The method of claim 1, wherein mapping the point cloud data of the area to be inspected based on the mapping algorithm, generating a three-dimensional point cloud global map comprises:
scanning by using a laser radar to obtain point cloud data of an area to be inspected;
and mapping the point cloud data based on a mapping algorithm to generate a three-dimensional point cloud global map.
4. The method of claim 1, wherein inputting the odometer pose to a preset unmanned aerial vehicle flight control coordinate system to obtain a real-time pose of the unmanned aerial vehicle under a three-dimensional point cloud global map comprises:
Inputting the position and the posture of the odometer into a preset unmanned aerial vehicle flight control coordinate system to obtain a second position and posture transformation matrix;
and obtaining the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map based on the second pose transformation matrix and the odometer pose.
5. The method according to claim 1, wherein the method further comprises:
And inputting the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map as observed quantity into an extended Kalman filter, and predicting and updating the pose of the unmanned aerial vehicle.
6. A locating device for a patrol unmanned aerial vehicle, the device comprising:
the acquisition module is used for acquiring a real-time unmanned aerial vehicle local map taking a preset laser radar coordinate system as a reference in the region to be inspected;
The generating module is used for carrying out mapping on the point cloud data of the area to be inspected based on a mapping algorithm to generate a three-dimensional point cloud global map;
The point cloud registration module is used for carrying out point cloud registration on the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map to obtain a first pose transformation matrix of the unmanned aerial vehicle local map relative to the three-dimensional point cloud global map;
the updating module is used for updating the odometer information in the local map of the real-time unmanned aerial vehicle into the three-dimensional point cloud global map based on the first pose transformation matrix to obtain the real-time odometer pose of the unmanned aerial vehicle in the three-dimensional point cloud global map;
The conversion module is used for inputting the real-time odometer pose into a preset unmanned aerial vehicle flight control coordinate system to obtain the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map;
The positioning module is used for positioning the unmanned aerial vehicle based on the real-time pose of the unmanned aerial vehicle under the three-dimensional point cloud global map;
The point cloud registration module comprises:
The registration unit is used for performing rough registration on the local map of the real-time unmanned aerial vehicle and the three-dimensional point cloud global map to obtain an initial pose transformation matrix;
The iteration unit is used for carrying out iterative computation on the initial pose transformation matrix, and when the iteration termination condition is met, an optimal rotation matrix and an optimal translation matrix corresponding to the initial pose transformation matrix are obtained;
The computing unit is used for computing to obtain a first pose transformation matrix based on the optimal rotation matrix and the optimal translation matrix;
The registration unit includes:
The computing subunit is used for computing a point cloud overlapping area between the real-time unmanned aerial vehicle local map and the three-dimensional point cloud global map, wherein the calculation formula of the point cloud overlapping area and the point cloud correspondence is as follows:
fitness = number of point cloud data correspondences / number of target points;
wherein l_i ∈ L = {l_1, l_2, ..., l_n} denotes the points of the point cloud P_l (local_map) in the real-time unmanned aerial vehicle local map, g_i ∈ G = {g_1, g_2, ..., g_n} denotes the points of the point cloud P_g (global_map) in the three-dimensional point cloud global map, N_L is the total number of points of the local map point cloud L, i is an index variable whose initial value is 1, R represents a rotation matrix, and t represents a translation matrix;
The registration subunit is used for judging whether the point cloud overlapping area meets a preset threshold value, and when the point cloud overlapping area meets the preset threshold value, obtaining an initial pose transformation matrix of the real-time unmanned aerial vehicle local map relative to the three-dimensional point cloud global map by means of manually estimated rough registration, wherein the initial pose transformation matrix is T_0 = [R_0 t_0; 0 1], where R_0 represents the initial rotation matrix and t_0 represents the initial translation matrix.
7. A computer device, comprising:
A memory and a processor, the memory and the processor are in communication connection, the memory stores computer instructions, and the processor executes the computer instructions, so as to execute the positioning method of the inspection unmanned aerial vehicle according to any one of claims 1 to 5.
8. A computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of locating a drone as claimed in any one of claims 1 to 5.
CN202411041027.5A 2024-07-31 2024-07-31 Positioning method, device, computer equipment and storage medium for inspection drone Active CN118999559B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411041027.5A CN118999559B (en) 2024-07-31 2024-07-31 Positioning method, device, computer equipment and storage medium for inspection drone

Publications (2)

Publication Number Publication Date
CN118999559A CN118999559A (en) 2024-11-22
CN118999559B true CN118999559B (en) 2025-09-30

Family

ID=93471925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411041027.5A Active CN118999559B (en) 2024-07-31 2024-07-31 Positioning method, device, computer equipment and storage medium for inspection drone

Country Status (1)

Country Link
CN (1) CN118999559B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119354205B (en) * 2024-12-25 2025-05-02 哈尔滨工业大学(威海) Unmanned fire-extinguishing robot navigation method and device based on 4D millimeter wave radar

Citations (1)

Publication number Priority date Publication date Assignee Title
CN118169725A (en) * 2024-03-19 2024-06-11 浙江极氪智能科技有限公司 Vehicle positioning method, device, equipment and medium

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN112014857B (en) * 2020-08-31 2023-04-07 上海宇航系统工程研究所 Three-dimensional laser radar positioning and navigation method for intelligent inspection and inspection robot
CN117553768A (en) * 2022-08-03 2024-02-13 北京氢源智能科技有限公司 Method and system for positioning and navigating drones based on laser inertial odometry

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN118169725A (en) * 2024-03-19 2024-06-11 浙江极氪智能科技有限公司 Vehicle positioning method, device, equipment and medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant