
WO2017008224A1 - Distance detection method and device for a moving object, and aircraft - Google Patents

Distance detection method and device for a moving object, and aircraft

Info

Publication number
WO2017008224A1
WO2017008224A1 (application PCT/CN2015/083868, CN2015083868W)
Authority
WO
WIPO (PCT)
Prior art keywords
distance value
distance
moving object
tested
aircraft
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2015/083868
Other languages
English (en)
French (fr)
Inventor
朱振宇
陈子寒
张芝源
刘渭锋
陈超彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN201580004349.5A priority Critical patent/CN106104203B/zh
Priority to PCT/CN2015/083868 priority patent/WO2017008224A1/zh
Publication of WO2017008224A1 publication Critical patent/WO2017008224A1/zh
Priority to US15/870,174 priority patent/US10591292B2/en
Anticipated expiration legal-status Critical
Priority to US16/817,205 priority patent/US20200208970A1/en
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02: Details
    • G01C 3/06: Use of electric means to obtain final indication
    • G01C 3/08: Use of electric radiation detectors
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 11/08: Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C 39/00: Aircraft not otherwise provided for
    • B64C 39/02: Aircraft not otherwise provided for characterised by special use
    • B64C 39/024: Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10: Navigation by using measurements of speed or acceleration
    • G01C 21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16: Navigation by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/18: Stabilised platforms, e.g. by gyroscope
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C 3/10: Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C 3/14: Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, with binocular observation at a single point, e.g. stereoscopic type
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094: Control of position, course, altitude or attitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/248: Analysis of motion using feature-based methods involving reference images or patches
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/292: Multi-camera tracking
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/55: Depth or shape recovery from multiple images
    • G06T 7/593: Depth or shape recovery from stereo images
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/60: Analysis of geometric attributes
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/13: Satellite images
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/17: Terrestrial scenes taken from planes or by drones
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/20: Arrangements for acquiring, generating, sharing or displaying traffic information
    • G08G 5/21: Arrangements for acquiring, generating, sharing or displaying traffic information located onboard the aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/20: Arrangements for acquiring, generating, sharing or displaying traffic information
    • G08G 5/22: Arrangements for acquiring, generating, sharing or displaying traffic information located on the ground
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/20: Arrangements for acquiring, generating, sharing or displaying traffic information
    • G08G 5/25: Transmission of traffic-related information between aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/20: Arrangements for acquiring, generating, sharing or displaying traffic information
    • G08G 5/26: Transmission of traffic-related information between aircraft and ground stations
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/50: Navigation or guidance aids
    • G08G 5/55: Navigation or guidance aids for a single aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/50: Navigation or guidance aids
    • G08G 5/57: Navigation or guidance aids for unmanned aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/70: Arrangements for monitoring traffic-related situations or conditions
    • G08G 5/72: Arrangements for monitoring traffic
    • G08G 5/723: Arrangements for monitoring traffic from the aircraft
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/70: Arrangements for monitoring traffic-related situations or conditions
    • G08G 5/72: Arrangements for monitoring traffic
    • G08G 5/727: Arrangements for monitoring traffic from a ground station
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft
    • G08G 5/80: Anti-collision systems
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2201/00: UAVs characterised by their flight controls
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 9/00: Measuring inclination, e.g. by clinometers, by levels
    • G01C 9/005: Measuring inclination specially adapted for use in aircraft
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10012: Stereo images
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261: Obstacle
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00: Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/07: Target detection

Definitions

  • the present invention relates to the field of electronic technologies, and in particular, to a method, device, and aircraft for detecting a distance of a moving object.
  • Embodiments of the present invention provide a method, a device, and an aircraft for detecting a distance of a moving object, which can perform distance detection more accurately.
  • a method for detecting a distance of a moving object includes:
  • the second distance value is determined as an actual distance between the moving object and the object to be tested.
  • the method further includes:
  • the movement of the moving object is controlled according to the actual distance.
  • the detecting the first distance value between the moving object and the object to be tested includes:
  • the first distance value between the moving object and the object to be tested is detected and acquired according to the sensed distance sensing data.
  • the detecting the first distance value between the moving object and the object to be tested includes:
  • the first distance value between the moving object and the object to be tested is detected and acquired according to the visually sensed image sensing data.
  • the detecting, according to the visually sensed image sensing data, the first distance value between the moving object and the object to be tested includes:
  • the determining the feature points on the at least two images separately includes:
  • the determined feature point is a set of sparse feature points
  • the first distance value between the moving object and the object to be tested is calculated based on the determined feature point, including:
  • the determined feature point is a set of dense feature points
  • the first distance value between the moving object and the object to be tested is calculated based on the determined feature point, including:
  • the first distance value is an average distance value between the moving object and the object to be tested.
  • the detecting the first distance value between the moving object and the object to be tested includes:
  • the distance value is used as the first distance value
  • the first distance value between the moving object and the object to be tested is obtained by visual sensing.
  • controlling the movement of the moving object according to the actual distance comprises: performing an obstacle avoidance operation according to the actual distance; wherein the obstacle avoidance operation comprises: limiting the moving speed, and circumventing the object to be tested.
  • controlling the movement of the moving object according to the actual distance comprises: performing a target tracking operation according to the actual distance; wherein the target tracking operation comprises: controlling the moving object to move such that the distance between the moving object and the object to be tested remains within a constrained tracking distance threshold.
  • the embodiment of the present invention further provides a distance detecting device for a moving object, including:
  • a detecting module configured to detect a first distance value between the moving object and the object to be tested, where the object to be tested is in a target direction of the moving object
  • a processing module configured to acquire an inclination angle of the moving object in a target direction, calculate a second distance value from the moving object to the object to be tested according to the acquired angle and the first distance value, and determine the second distance value as the actual distance between the moving object and the object to be tested.
  • the device further includes:
  • control module configured to control movement of the moving object according to the actual distance.
  • the detecting module is configured to detect and acquire a first distance value between the moving object and the object to be tested according to the sensed distance sensing data.
  • the detecting module is configured to detect and acquire a first distance value between the moving object and the object to be tested according to the visually sensed image sensing data.
  • the detecting module includes:
  • An acquiring unit configured to acquire at least two images in a target direction
  • a comparing unit configured to respectively determine feature points on the at least two images, and perform feature point comparison, and compare and determine feature points associated with each other between the at least two images;
  • a determining unit configured to calculate a first distance value between the moving object and the object to be tested based on the determined feature point.
  • the comparing unit is configured to acquire an inclination angle of the moving object in a target direction, determine an effective area on each of the at least two images according to the acquired angle, and identify feature points within the effective area of the corresponding image.
  • the comparing unit compares and determines a set of sparse feature points
  • the determining unit is specifically configured to calculate a first distance value between the moving object and the object to be tested based on the determined sparse feature points.
  • the comparing unit compares and determines a set of dense feature points
  • the determining unit is specifically configured to calculate a first distance value between the moving object and the object to be tested based on the determined dense feature point.
  • the first distance value is an average distance value between the moving object and the object to be tested.
  • the detecting module includes:
  • a first acquiring unit configured to acquire distance sensing data sensed by the distance sensing manner, and calculate the distance sensing data to obtain a distance value
  • a distance value determining unit configured to use the distance value as the first distance value if the calculated distance value is less than a preset distance threshold
  • a second acquiring unit configured to acquire, by using a visual sensing manner, a first distance value between the moving object and the object to be tested, if the calculated distance value is greater than a preset distance threshold.
  • control module is specifically configured to perform an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes: limiting the moving speed, and evading the object to be tested.
  • control module is specifically configured to perform a target tracking operation according to the actual distance; wherein the target tracking operation includes: controlling the moving object to move such that the distance between the moving object and the object to be tested remains within the constrained tracking distance threshold.
  • an embodiment of the present invention provides an aircraft, including: a power device, a flight controller, and a sensor, where:
  • the sensor is configured to sense distance data between the aircraft and the object to be tested
  • the flight controller is configured to determine, according to the distance data of the sensor, a first distance value between the aircraft and the object to be tested, the object to be tested being in a target direction of the aircraft; and to acquire a tilt angle of the aircraft in the target direction and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value;
  • the flight controller is further configured to issue a control command to the power device to control the power device to move the aircraft according to the second distance value.
  • an embodiment of the present invention provides another aircraft, including: a power device, a flight controller, and a sensor, where:
  • the sensor is configured to detect a first distance value between the aircraft and an object to be tested, where the object to be tested is in a target direction of the moving object;
  • the flight controller is configured to acquire an inclination angle of the aircraft in a target direction, and calculate, according to the acquired tilt angle and the first distance value, a second distance value from the aircraft to the object to be tested
  • the flight controller is further configured to issue a control command to the power device to control the power device to move the aircraft according to the second distance value.
  • the embodiment of the invention can correct the directly detected distance value to obtain a more accurate distance value, so as to more accurately complete the movement control of the moving object.
  • FIG. 1 is a schematic flow chart of a method for detecting a distance of a moving object according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of determining a distance between a moving object and an object to be tested according to an embodiment of the present invention
  • FIG. 3 is a schematic flow chart of another method for detecting a distance of a moving object according to an embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of a method for acquiring a distance value according to an embodiment of the present invention
  • FIG. 5 is a schematic structural diagram of a sensor assembly for sensing distance according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a binocular camera ranging principle according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a calculation principle of binocular camera ranging according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a distance detecting device for a moving object according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of one of the detection modules of the embodiment of the present invention.
  • FIG. 10 is another schematic structural diagram of a detection module according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural view of an aircraft according to an embodiment of the present invention.
  • Fig. 12 is a schematic structural view of another aircraft according to an embodiment of the present invention.
  • the embodiment of the present invention corrects the directly detected distance by the tilt angle of the moving object, thereby obtaining a more accurate distance value, which facilitates subsequent movement control based on that value, such as obstacle avoidance and tracking monitoring.
  • FIG. 1 is a schematic flowchart of a method for detecting a distance of a moving object according to an embodiment of the present invention.
  • the method may be performed by a processor, for example by a movement controller in the moving object, such as the flight controller of an unmanned aerial vehicle.
  • the method includes:
  • S101 Detect a first distance value between the moving object and the object to be tested, where the object to be tested is in a target direction of the moving object.
  • the target direction may refer to the front of the movement of the moving object, and specifically may refer to the direction A (the flight direction) as shown in FIG. 2 .
  • the moving object may be provided with a single or binocular vision sensor, a distance sensor composed of sensors such as ultrasonic, infrared, radar, and/or laser, and other sensors capable of sensing the distance.
  • the first distance value D1 is calculated and acquired from the data sensed by the visual sensor and/or the distance sensor mentioned above.
  • the visual sensor or the distance sensor actually senses and calculates distance values over an area, yielding the distances from the moving object to a plurality of feature points on a certain plane of the object to be tested. The first distance value D1 is therefore a representative value: it may be the average of the distance values to the plurality of points, or the distance value of the feature point whose features are most distinctive and whose calculated distance is the most accurate among the plurality of feature points.
  • the first distance value D1 between the moving object and the object to be tested may be detected and acquired according to the sensed distance sensing data.
  • the first distance value D1 between the moving object and the object to be tested is detected and acquired according to the visually sensed image sensing data.
  • S102 Acquire an inclination angle of the moving object in a target direction, and calculate a second distance value of the moving object to the object to be tested according to the acquired tilt angle and the first distance value.
  • the tilt angle a may be sensed by a sensor such as a gyroscope or an acceleration sensor mounted on the moving object; the tilt angle a may refer to the angle between the moving direction of the moving object, that is, the target direction, and the horizontal plane.
  • the second distance value D2 may be obtained by applying a trigonometric function to the first distance value D1 and the tilt angle a, completing the correction of the first distance value D1.
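As a sketch of this correction step, assuming the trigonometric function is a simple cosine projection of the slant measurement onto the horizontal (the excerpt does not state the exact formula, so this is an illustrative assumption):

```python
import math

def corrected_distance(d1: float, tilt_deg: float) -> float:
    """Correct a distance d1 measured along the tilted target direction
    into a horizontal distance D2, assuming a cosine projection.
    (Assumption: the text only says a trigonometric function of D1 and
    the tilt angle a is used.)"""
    return d1 * math.cos(math.radians(tilt_deg))

# Example: 10 m measured along a direction tilted 20 degrees from horizontal
d2 = corrected_distance(10.0, 20.0)
```

With a zero tilt angle the correction is the identity, which matches the intuition that an untilted measurement needs no adjustment.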
  • S103 Determine the second distance value as an actual distance between the moving object and the object to be tested.
  • the actual distance value may be output directly to the connected controller, so that the controller controls the movement of the moving object according to the actual distance value, or the actual distance may be output for display to the user.
  • the moving object is controlled to complete movement operations such as obstacle avoidance and tracking monitoring based on the actual distance D2. For example, when the actual distance D2 is less than a distance threshold, the movement needs to be stopped to avoid hitting the object to be tested, or a new route is planned to bypass the object to be tested. Or, when the actual distance D2 is greater than the distance threshold, the movement needs to continue so that the object to be tested always stays within the distance threshold, achieving the purpose of monitoring.
  • the above-mentioned visual sensor or distance sensor generally detects the distance between the corresponding sensor and the object to be tested. Therefore, after calculating the sensed data to obtain the distance value, a fixed offset value D may finally be added.
  • alternatively, the distance value calculated from the sensing data can be used directly as the first distance value D1 between the moving object and the object to be tested, with the distance threshold used for subsequent movement control configured to account for this fixed value.
  • the embodiment of the invention can correct the directly detected distance value to obtain a more accurate distance value, so as to more accurately complete the movement control of the moving object.
  • FIG. 3 is a schematic flowchart of another method for detecting a distance of a moving object according to an embodiment of the present invention.
  • the method in the embodiment of the present invention may be executed by a processor, and may be specifically implemented by a movement controller in a moving object, for example, a flight controller in an unmanned aerial vehicle.
  • the method includes:
  • S301 Acquire distance sensing data sensed by the distance sensing method, and calculate the distance sensing data to obtain a distance value.
  • the distance sensing method includes performing distance sensing using sensing data of a distance sensor including ultrasonic, infrared, radar, and/or laser sensors.
  • S302 Determine whether the calculated distance value is less than a preset distance threshold.
  • the distance threshold is set according to the effective detection distance of the distance sensor, such as an ultrasonic sensor. Under current conditions, the effective detection distance of ultrasound is about 3 m, so when ultrasound is used as the distance sensor, the preset distance threshold may be set to 3 m: when the distance sensed by the ultrasonic distance sensing method is less than 3 m, the ultrasonically sensed distance value is used as the first distance value, that is, S303 described below is performed; otherwise, the vision sensor is used and the following S304 is performed.
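A minimal sketch of this sensor-selection step (function and constant names are hypothetical; the 3 m threshold follows the ultrasonic example above):

```python
# Illustrative threshold: effective ultrasonic range from the text.
ULTRASONIC_MAX_RANGE_M = 3.0

def first_distance(ultrasonic_m: float, vision_m: float) -> float:
    """Pick the first distance value D1: trust the ultrasonic reading
    inside its effective range, otherwise fall back to the vision-based
    distance. (Hypothetical helper reflecting the selection described
    in this embodiment.)"""
    if ultrasonic_m < ULTRASONIC_MAX_RANGE_M:
        return ultrasonic_m
    return vision_m
```

For example, an ultrasonic reading of 2.0 m is used as-is, while a reading of 5.0 m (beyond the effective range) defers to the vision estimate.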
  • S304 Acquire an inclination angle of the moving object in a target direction, and calculate a second distance value of the moving object to the object to be tested according to the acquired tilt angle and the first distance value.
  • the tilt angle may be sensed by a sensor such as a gyroscope or an acceleration sensor mounted on the moving object, and may refer to the angle between the moving direction of the moving object, that is, the target direction, and the horizontal plane. For example, when the aircraft flies forward, a tilt as shown in FIG. 2 may occur; the relevant sensors in the aircraft can then sense the tilt and rotation to obtain the tilt angle.
  • S305 Determine the second distance value as an actual distance between the moving object and the object to be tested.
  • the movement of the moving object can be directly controlled according to the actual distance.
  • the controlling the movement of the moving object according to the actual distance may include: performing an obstacle avoidance operation according to the actual distance; wherein the obstacle avoidance operation comprises: limiting the moving speed, and evading the object to be tested.
  • when the moving object moves at a high speed and an object to be tested is detected ahead, the current moving speed of the moving object and the braking distance at that speed are known.
  • if the distance to the object to be tested is less than the braking distance plus a safety-distance threshold, the moving object enters a state of emergency braking to prevent the aircraft from hitting the obstacle.
  • if the moving object detects the object to be tested within the safe distance (less than the distance threshold), the aircraft will automatically retreat beyond the safe distance; the closer the distance, the faster the retreat speed.
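The braking and retreat policy described above can be sketched as follows (a hypothetical policy; the names and the linear retreat-speed law are illustrative, not the patent's implementation):

```python
def avoidance_action(distance_m: float, braking_dist_m: float,
                     safety_m: float):
    """Decide an obstacle-avoidance action from the corrected distance:
    - inside the safety margin: retreat, faster the closer the obstacle;
    - closer than braking distance + safety margin: emergency brake;
    - otherwise: continue moving.
    Returns (action, retreat_speed_fraction)."""
    if distance_m < safety_m:
        # Retreat speed grows as the obstacle gets closer (illustrative law).
        return ("retreat", (safety_m - distance_m) / safety_m)
    if distance_m < braking_dist_m + safety_m:
        return ("brake", 0.0)
    return ("continue", 0.0)
```

With a 4 m braking distance and a 2 m safety margin, an obstacle at 10 m allows continued flight, at 5 m triggers braking, and at 1 m triggers a retreat.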
  • the controlling the movement of the moving object according to the actual distance may further include: performing a target tracking operation according to the actual distance; wherein the target tracking operation comprises: controlling the moving object to move such that the distance between the moving object and the object to be tested remains within a constrained tracking distance threshold.
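The target tracking operation can likewise be sketched as a simple deadband controller (hypothetical; the names and the deadband are illustrative, the text only constrains the distance to a tracking threshold):

```python
def tracking_command(distance_m: float, track_threshold_m: float,
                     deadband_m: float = 0.5) -> str:
    """Keep the tracked target within the constrained tracking distance:
    advance when the target drifts too far, back off when too close,
    hold otherwise. (Illustrative bang-bang sketch with a deadband.)"""
    if distance_m > track_threshold_m + deadband_m:
        return "advance"
    if distance_m < track_threshold_m - deadband_m:
        return "retreat"
    return "hold"
```

A real controller would likely use a proportional or PID law instead of this bang-bang rule, but the constraint it enforces is the same.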
  • FIG. 4 is a schematic flowchart of a method for acquiring a distance value according to an embodiment of the present invention.
  • the embodiment of the present invention calculates a first distance value mentioned in the foregoing embodiment based on a binocular vision sensor. Specifically, the method includes:
  • S401 Acquire at least two images in the target direction.
  • FIG. 5 shows a schematic structural view of a sensor assembly capable of sensing distance, which includes two visual sensors, camera 501 and camera 502, as well as a distance sensor consisting of an ultrasonic transmitting probe 503 and an ultrasonic receiving probe 504. The sensor assembly may be disposed on each face of the moving object to detect, as needed, the distance to the object to be tested in each possible direction of motion of the moving object.
  • From the images acquired by camera 501 and camera 502, for each pixel in the image from either camera, the corresponding pixel can be found in the image from the other camera. The first distance value can then be calculated from the two corresponding pixels and their coordinates in the respective images, the focal length of the cameras, and the distance between the two cameras.
  • S402: Determine feature points on the at least two images respectively, compare them, and determine the feature points associated with each other between the at least two images.
  • On the images acquired by binocular vision, pixels with salient features can be extracted. Feature points can be extracted with algorithms such as FAST (Features from Accelerated Segment Test), SIFT (Scale-Invariant Feature Transform), or SURF (Speeded-Up Robust Features). Taking FAST as an example, the feature points computed by the FAST algorithm are described with the BRIEF (Binary Robust Independent Elementary Features) operator.
  • The feature points associated between the two images are determined by binocular feature-point matching: based on the BRIEF descriptors of the feature points computed by the FAST algorithm, the similarity of the descriptors of pixels in the two images is compared to obtain the matched feature points. A matched pair consists of the pixels of one measured point on the object to be tested as it appears in each of the two binocular images.
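A minimal sketch of such binary-descriptor matching follows, treating each BRIEF descriptor as an integer bit string and using Hamming distance as the (dis)similarity measure. The descriptor width, the matching threshold, and the function names are illustrative assumptions.

```python
def hamming(a, b):
    """Hamming distance between two binary descriptors stored as integers."""
    return bin(a ^ b).count("1")

def match_features(left_desc, right_desc, max_dist=2):
    """For each left-image descriptor, find the most similar right-image
    descriptor; keep the pair only if the descriptors are similar enough."""
    matches = []
    for i, dl in enumerate(left_desc):
        j, dist = min(((j, hamming(dl, dr)) for j, dr in enumerate(right_desc)),
                      key=lambda pair: pair[1])
        if dist <= max_dist:
            matches.append((i, j))
    return matches
```

Each returned pair (i, j) corresponds to one measured point on the object, observed in both binocular images.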
  • S403: Calculate the first distance value between the moving object and the object to be tested based on the determined feature points.
  • After the matched feature points are determined, the distance of each feature point is calculated by triangulation, from which the first distance value is obtained. FIG. 6 and FIG. 7 show the principle of binocular vision ranging. The distance from the moving object to a measured point P is calculated as Z = fT/(x_l - x_r), where Z is the distance to be calculated, f is the focal length of the camera, T is the distance between the two cameras, and x_l and x_r are the coordinates of the projections of P on the two images, i.e., the coordinates of the matched feature points on the respective images. After the distances of multiple measured points are calculated, the first distance value is obtained, for example by averaging them.
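The triangulation relation above can be written directly in code. This is a sketch assuming rectified images with x-coordinates expressed in the same units as the focal length; the averaging helper reflects one of the options the description mentions for forming the first distance value.

```python
def stereo_depth(x_left, x_right, focal_length, baseline):
    """Distance to a point P from its projections on the two images:
    Z = f * T / (x_l - x_r)."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("a point in front of the cameras must have positive disparity")
    return focal_length * baseline / disparity

def first_distance_from_points(depths):
    """First distance value taken as the average over the measured points."""
    return sum(depths) / len(depths)
```

For example, with f = 700 (pixels), a 0.2 m baseline, and a 4-pixel disparity, the point lies 35 m away.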
  • Further, to reduce the amount of computation for matching feature points, the regions of the two images can be screened. Specifically, determining the feature points on the at least two images may include: acquiring the tilt angle of the moving object in the target direction; determining effective areas on the at least two images according to the acquired angle; and determining feature points within the effective area of each corresponding image.
  • The tilt angle of the moving object can be acquired by an IMU (Inertial Measurement Unit). When the camera tilts downward with the moving object, the captured image covers a region in front of and below the moving object, and the region ahead of the moving object appears in the upper part of the image. After the tilt angle is obtained from the IMU, the proportion of the image occupied by the region ahead of the moving object can be estimated, thereby determining the effective area for the calculation. If the camera tilts upward with the moving object, the objects ahead of the moving object lie in the lower part of the image, and the effective area can likewise be derived from the IMU tilt angle.
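One way to sketch this effective-area selection is to shift the band of useful image rows by the tilt expressed as a fraction of the camera's vertical field of view. The linear mapping and the parameter names are assumptions for illustration only; the patent does not specify the exact formula.

```python
def effective_rows(image_height, tilt_deg, vertical_fov_deg):
    """Return the (top, bottom) row range likely to contain the scene ahead.

    Pitched down (tilt_deg > 0): the region ahead sits in the upper part of
    the image, so the lower rows are dropped. Pitched up (tilt_deg < 0):
    the lower rows are kept instead.
    """
    offset = int(image_height * abs(tilt_deg) / vertical_fov_deg)
    offset = min(offset, image_height - 1)
    if tilt_deg >= 0:
        return 0, image_height - offset
    return offset, image_height
```

Feature extraction is then run only on the returned row range of each image, reducing the matching workload.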
  • In addition, a set of relatively sparse feature points can be obtained by algorithms such as FAST described above, and the first distance value between the moving object and the object to be tested can be calculated from the determined sparse feature points.
  • A set of dense feature points can also be obtained by block matching combined with a speckle filter, and the first distance value between the moving object and the object to be tested can be calculated from the determined dense feature points.
  • This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
  • the distance detecting device and the aircraft of the moving object of the embodiment of the present invention will be described in detail below.
  • FIG. 8 is a schematic structural diagram of a distance detecting device for a moving object according to an embodiment of the present invention.
  • The device of this embodiment may be disposed in a processor or in the movement controller of a moving object, for example in the flight controller of an aircraft. Specifically, the device includes:
  • the detecting module 1 is configured to detect a first distance value between the moving object and the object to be tested, where the object to be tested is in a target direction of the moving object.
  • A processing module 2, configured to acquire the tilt angle of the moving object in the target direction, calculate a second distance value from the moving object to the object to be tested according to the acquired angle and the first distance value, and determine the second distance value as the actual distance between the moving object and the object to be tested.
  • Optionally, the device of this embodiment may further include a control module 3, configured to control the movement of the moving object according to the actual distance.
  • The target direction may refer to the direction directly ahead of the moving object's motion.
  • The moving object may be equipped with monocular or binocular vision sensors, distance sensors composed of ultrasonic, infrared, radar, and/or laser sensors, and other sensors capable of sensing distance.
  • The detecting module 1 can calculate the first distance value from the data sensed by the aforementioned visual sensors and/or distance sensors.
  • The detecting module 1 can obtain the distances from the moving object to multiple feature points on a plane of the object to be tested. The first distance value is therefore a representative value: it may be the average of the distances to multiple points, or the distance of the feature point whose features are the most salient among them and whose calculated distance is the most accurate.
  • the detecting module 1 may specifically detect and acquire a first distance value between the moving object and the object to be tested according to the sensed distance sensing data. Alternatively, the detecting module 1 detects and acquires a first distance value between the moving object and the object to be tested according to the visually sensed image sensing data.
  • The processing module 2 may obtain the tilt angle from the sensing data of sensors mounted on the moving object, such as a gyroscope or an acceleration sensor. The tilt angle may refer to the angle between the moving direction of the moving object and the horizontal plane, i.e., the angle between the target direction and the horizontal plane. For example, when the aircraft flies forward, it may tilt; the processing module 2 can sense this tilt and rotation through the relevant sensors in the aircraft and thereby obtain the tilt angle.
  • After the first distance value and the tilt angle are obtained, the processing module 2 can apply a trigonometric function to them to correct the first distance value and obtain the second distance value, i.e., the actual distance between the moving object and the object to be tested.
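The correction itself is the trigonometric relation D2 = cos(a) * D1 given in the method embodiment, where a is the tilt angle and D1 the directly measured distance:

```python
import math

def corrected_distance(first_distance, tilt_deg):
    """Correct the raw range reading for the vehicle's tilt:
    D2 = cos(a) * D1, projecting the measured distance onto the
    horizontal plane."""
    return math.cos(math.radians(tilt_deg)) * first_distance
```

For example, a 10 m reading taken while tilted 60 degrees corresponds to an actual horizontal distance of 5 m; with no tilt the reading is unchanged.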
  • After the corrected, more accurate actual distance is obtained, the processing module 2 can output it directly to other controllers or to the user, or the control module 3 can directly control the moving object based on the actual distance to complete movement operations such as obstacle avoidance and tracking/monitoring. For example, when the actual distance is less than a distance threshold, the moving object needs to stop to avoid hitting the object to be tested, or a new route needs to be planned to bypass it; when the actual distance is greater than the distance threshold, the moving object needs to keep moving so that the object to be tested always stays within the distance threshold, achieving the purpose of monitoring.
  • It should be noted that the visual or distance sensors mentioned above generally detect the distance between the corresponding sensor and the object to be tested, so after the sensed data is processed to obtain a distance value, a fixed offset may be added. Alternatively, the distance value calculated from the sensing data may be used directly as the first distance value between the moving object and the object to be tested, with the distance threshold configured based on the fixed offset only during subsequent movement control.
  • This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
  • Further, an embodiment of the present invention also provides another distance detecting device for a moving object. The device may be disposed in a processor or in the movement controller of a moving object, for example in the flight controller of an aircraft. Specifically, the device includes the detecting module 1, processing module 2, and control module 3 described above.
  • the detecting module 1 is specifically configured to detect and acquire a first distance value between the moving object and the object to be tested according to the sensed distance sensing data.
  • Distance sensors include ultrasonic, radar, infrared and other distance sensors.
  • the detecting module 1 is specifically configured to detect and acquire a first distance value between the moving object and the object to be tested according to the visually sensed image sensing data.
  • Visual sensing includes a visual sensing system based on two cameras.
  • Optionally, referring to FIG. 9, when the distance is sensed visually, the detecting module 1 may specifically include:
  • An obtaining unit 11 configured to acquire at least two images in a target direction
  • the comparing unit 12 is configured to respectively determine feature points on the at least two images, and perform feature point comparison, and compare and determine feature points associated with each other between the at least two images;
  • the determining unit 13 is configured to calculate a first distance value between the moving object and the object to be tested based on the determined feature point.
  • The comparing unit 12 is specifically configured to acquire the tilt angle of the moving object in the target direction, determine effective areas on the at least two images according to the acquired angle, and determine feature points within the effective area of each corresponding image.
  • the first distance value in the embodiment of the present invention may be an average distance value between the moving object and the object to be tested.
  • the detecting module 1 is specifically configured to detect the first distance value by using a distance sensing manner and a visual sensing manner.
  • Optionally, the detecting module 1 may decide which distance detection mode to use according to the distance detected by the distance sensor and/or the visual sensor. Referring to FIG. 10, the detecting module 1 may specifically include:
  • the first obtaining unit 14 is configured to acquire distance sensing data sensed by the distance sensing manner, and calculate the distance sensing data to obtain a distance value;
  • the distance value determining unit 15 is configured to use the distance value as the first distance value if the calculated distance value is less than a preset distance threshold;
  • the second obtaining unit 16 is configured to acquire, by using a visual sensing manner, a first distance value between the moving object and the object to be tested, if the calculated distance value is greater than a preset distance threshold.
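The selection logic implemented by these units can be sketched as follows. The ~3 m default threshold matches the ultrasonic effective range mentioned in the method embodiment; the function and parameter names are illustrative assumptions.

```python
def select_first_distance(ultrasonic_distance, visual_distance, threshold=3.0):
    """Use the ultrasonic reading while it is inside the sensor's effective
    range; otherwise fall back to the distance obtained by visual sensing."""
    if ultrasonic_distance < threshold:
        return ultrasonic_distance
    return visual_distance
```

So a 2 m ultrasonic reading is trusted directly, while a 5 m reading (beyond the effective range) causes the stereo-vision estimate to be used instead.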
  • The control module 3 is specifically configured to perform an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes limiting the moving speed and evading the object to be tested.
  • The control module 3 is specifically configured to perform a target tracking operation according to the actual distance, where the target tracking operation includes controlling the moving object to move so that the distance between the moving object and the object to be tested stays within a constrained tracking distance threshold.
  • This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
  • FIG. 11 is a schematic structural diagram of an aircraft according to an embodiment of the present invention.
  • The aircraft of this embodiment includes conventional components such as an airframe and a power supply, and further includes a power device 100, a flight controller 200, and a sensor 300.
  • the sensor 300 is configured to sense distance data between the aircraft and the object to be tested;
  • The flight controller 200 is configured to determine, according to the distance data from the sensor 300, a first distance value between the aircraft and the object to be tested, the object to be tested being in the target direction of the aircraft; acquire the tilt angle of the aircraft in the target direction; and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value.
  • the flight controller 200 is further configured to issue a control command to the power device 100 to control the power device 100 to move the aircraft according to the second distance value.
  • This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
  • FIG. 12 is a schematic structural diagram of another aircraft according to an embodiment of the present invention.
  • The aircraft of this embodiment includes conventional components such as an airframe and a power supply, and further includes a power device 400, a flight controller 500, and a sensor 600, wherein:
  • the sensor 600 is configured to detect a first distance value between the aircraft and an object to be tested, where the object to be tested is in a target direction of the moving object;
  • The flight controller 500 is configured to acquire the tilt angle of the aircraft in the target direction, and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value;
  • the flight controller 500 is further configured to issue a control command to the power device 400 to control the power device 400 to move the aircraft according to the second distance value.
  • This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
  • In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. The device embodiments described above are merely illustrative; for example, the division into modules or units is only a division by logical function, and other divisions are possible in actual implementations: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • Based on this understanding, the technical solution of the present invention in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including a number of instructions for causing a computer processor to perform all or part of the steps of the methods described in the various embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.


Abstract

A distance detection method and device for a moving object, and an aircraft. The method includes: detecting a first distance value between a moving object and an object to be tested (S101), the object to be tested being in the target direction of the moving object; acquiring the tilt angle of the moving object in the target direction, and calculating a second distance value from the moving object to the object to be tested according to the acquired tilt angle and the first distance value (S102); determining the second distance value as the actual distance between the moving object and the object to be tested (S103); and controlling the movement of the moving object according to the second distance value. The directly detected distance value can thereby be corrected to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.

Description

Distance detection method and device for a moving object, and aircraft

Technical Field

The present invention relates to the field of electronic technology, and in particular to a distance detection method and device for a moving object, and to an aircraft.

Background Art

With the development of technology, remotely controllable intelligent mobile devices, such as remote-controlled robots, remote-controlled cars, and unmanned aerial vehicles, have entered military and civilian fields to assist users in carrying out various monitoring and inspection tasks.

While a mobile device is moving, it often needs to detect or track a certain object in real time, which generally requires measuring the distance from the mobile device to that object. Current distance detection suffers from inaccuracy, which affects the mobile device's judgment and leads to inaccurate detection or tracking.
Summary of the Invention

Embodiments of the present invention provide a distance detection method and device for a moving object, and an aircraft, capable of detecting distance relatively accurately.

In one aspect, a distance detection method for a moving object includes:

detecting a first distance value between the moving object and an object to be tested, the object to be tested being in the target direction of the moving object;

acquiring a tilt angle of the moving object in the target direction, and calculating a second distance value from the moving object to the object to be tested according to the acquired tilt angle and the first distance value;

determining the second distance value as the actual distance between the moving object and the object to be tested.

Optionally, the method further includes:

controlling the movement of the moving object according to the actual distance.

Optionally, detecting the first distance value between the moving object and the object to be tested includes:

detecting and acquiring the first distance value between the moving object and the object to be tested according to distance sensing data obtained by distance sensing.

Optionally, detecting the first distance value between the moving object and the object to be tested includes:

detecting and acquiring the first distance value between the moving object and the object to be tested according to image sensing data obtained by visual sensing.

Optionally, detecting and acquiring the first distance value between the moving object and the object to be tested according to image sensing data obtained by visual sensing includes:

acquiring at least two images in the target direction;

determining feature points on the at least two images respectively, comparing the feature points, and determining feature points associated with each other between the at least two images;

calculating the first distance value between the moving object and the object to be tested based on the determined feature points.

Optionally, determining feature points on the at least two images respectively includes:

acquiring the tilt angle of the moving object in the target direction;

determining effective areas on the at least two images according to the acquired angle, and determining feature points within the effective area of each corresponding image.

Optionally, the determined feature points are a set of sparse feature points, and calculating the first distance value between the moving object and the object to be tested based on the determined feature points includes:

calculating the first distance value between the moving object and the object to be tested based on the determined sparse feature points.

Optionally, the determined feature points are a set of dense feature points, and calculating the first distance value between the moving object and the object to be tested based on the determined feature points includes:

calculating the first distance value between the moving object and the object to be tested based on the determined dense feature points.

Optionally, the first distance value is the average distance value between the moving object and the object to be tested.

Optionally, detecting the first distance value between the moving object and the object to be tested includes:

acquiring distance sensing data sensed by distance sensing, and calculating a distance value from the distance sensing data;

if the calculated distance value is less than a preset distance threshold, using that distance value as the first distance value;

if the calculated distance value is greater than the preset distance threshold, acquiring the first distance value between the moving object and the object to be tested by visual sensing.

Optionally, controlling the movement of the moving object according to the actual distance includes: performing an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes limiting the moving speed and evading the object to be tested.

Optionally, controlling the movement of the moving object according to the actual distance includes: performing a target tracking operation according to the actual distance, where the target tracking operation includes controlling the moving object to move so that the distance between the moving object and the object to be tested stays within a constrained tracking distance threshold.

In another aspect, an embodiment of the present invention further provides a distance detection device for a moving object, including:

a detecting module, configured to detect a first distance value between the moving object and an object to be tested, the object to be tested being in the target direction of the moving object;

a processing module, configured to acquire a tilt angle of the moving object in the target direction, calculate a second distance value from the moving object to the object to be tested according to the acquired angle and the first distance value, and determine the second distance value as the actual distance between the moving object and the object to be tested.

Optionally, the device further includes:

a control module, configured to control the movement of the moving object according to the actual distance.

Optionally, the detecting module is specifically configured to detect and acquire the first distance value between the moving object and the object to be tested according to distance sensing data obtained by distance sensing.

Optionally, the detecting module is specifically configured to detect and acquire the first distance value between the moving object and the object to be tested according to image sensing data obtained by visual sensing.

Optionally, the detecting module includes:

an acquiring unit, configured to acquire at least two images in the target direction;

a comparing unit, configured to determine feature points on the at least two images respectively, compare the feature points, and determine feature points associated with each other between the at least two images;

a determining unit, configured to calculate the first distance value between the moving object and the object to be tested based on the determined feature points.

Optionally, the comparing unit is specifically configured to acquire the tilt angle of the moving object in the target direction, determine effective areas on the at least two images according to the acquired angle, and determine feature points within the effective area of each corresponding image.

Optionally, the comparing unit determines a set of sparse feature points by comparison;

the determining unit is specifically configured to calculate the first distance value between the moving object and the object to be tested based on the determined sparse feature points.

Optionally, the comparing unit determines a set of dense feature points by comparison;

the determining unit is specifically configured to calculate the first distance value between the moving object and the object to be tested based on the determined dense feature points.

Optionally, the first distance value is the average distance value between the moving object and the object to be tested.

Optionally, the detecting module includes:

a first acquiring unit, configured to acquire distance sensing data sensed by distance sensing, and calculate a distance value from the distance sensing data;

a distance value determining unit, configured to use the calculated distance value as the first distance value if it is less than a preset distance threshold;

a second acquiring unit, configured to acquire the first distance value between the moving object and the object to be tested by visual sensing if the calculated distance value is greater than the preset distance threshold.

Optionally, the control module is specifically configured to perform an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes limiting the moving speed and evading the object to be tested.

Optionally, the control module is specifically configured to perform a target tracking operation according to the actual distance, where the target tracking operation includes controlling the moving object to move so that the distance between the moving object and the object to be tested stays within a constrained tracking distance threshold.

In yet another aspect, an embodiment of the present invention further provides an aircraft, including a power device, a flight controller, and a sensor, wherein:

the sensor is configured to sense distance data between the aircraft and an object to be tested;

the flight controller is configured to determine, according to the distance data from the sensor, a first distance value between the aircraft and the object to be tested, the object to be tested being in the target direction of the aircraft; acquire a tilt angle of the aircraft in the target direction; and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value;

the flight controller is further configured to issue a control command to the power device to control the power device to move the aircraft according to the second distance value.

In a further aspect, an embodiment of the present invention also provides another aircraft, including a power device, a flight controller, and a sensor, wherein:

the sensor is configured to detect a first distance value between the aircraft and an object to be tested, the object to be tested being in the target direction of the moving object;

the flight controller is configured to acquire a tilt angle of the aircraft in the target direction, and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value;

the flight controller is further configured to issue a control command to the power device to control the power device to move the aircraft according to the second distance value.

Embodiments of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
Brief Description of the Drawings

FIG. 1 is a schematic flowchart of a distance detection method for a moving object according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of determining the distance between a moving object and an object to be tested according to an embodiment of the present invention;

FIG. 3 is a schematic flowchart of another distance detection method for a moving object according to an embodiment of the present invention;

FIG. 4 is a schematic flowchart of a method for acquiring a distance value according to an embodiment of the present invention;

FIG. 5 is a schematic structural diagram of a distance-sensing sensor assembly according to an embodiment of the present invention;

FIG. 6 is a schematic diagram of the binocular-camera ranging principle according to an embodiment of the present invention;

FIG. 7 is a schematic diagram of the calculation principle of binocular-camera ranging according to an embodiment of the present invention;

FIG. 8 is a schematic structural diagram of a distance detection device for a moving object according to an embodiment of the present invention;

FIG. 9 is one schematic structural diagram of a detecting module according to an embodiment of the present invention;

FIG. 10 is another schematic structural diagram of a detecting module according to an embodiment of the present invention;

FIG. 11 is a schematic structural diagram of an aircraft according to an embodiment of the present invention;

FIG. 12 is a schematic structural diagram of another aircraft according to an embodiment of the present invention.
Detailed Description

The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

When detecting the distance between a moving object and an object to be tested, the embodiments of the present invention correct the directly detected distance using the tilt angle of the moving object, thereby obtaining a more accurate distance value and facilitating subsequent movement control, such as obstacle avoidance and tracking/monitoring, based on that distance.

FIG. 1 is a schematic flowchart of a distance detection method for a moving object according to an embodiment of the present invention. The method may be executed by a processor, specifically by the movement controller of the moving object, for example the flight controller of an unmanned aerial vehicle. Specifically, the method includes:

S101: Detect a first distance value between the moving object and an object to be tested, the object to be tested being in the target direction of the moving object.

The target direction may refer to the direction directly ahead of the moving object's motion, specifically direction A (the flight direction) shown in FIG. 2. The moving object may be equipped with monocular or binocular vision sensors, distance sensors composed of ultrasonic, infrared, radar, and/or laser sensors, and other sensors capable of sensing distance. The first distance value D1 is calculated from the data sensed by the aforementioned visual sensors and/or distance sensors.

The visual sensor or distance sensor can actually sense and calculate the distance to a region, yielding the distances from the moving object to multiple feature points on a plane of the object to be tested. The first distance value D1 is therefore a representative value: it may be the average of the distances to multiple points, or the distance of the feature point whose features are the most salient among them and whose calculated distance is the most accurate.

Specifically, the first distance value D1 between the moving object and the object to be tested may be detected and acquired according to distance sensing data obtained by distance sensing, or according to image sensing data obtained by visual sensing.

S102: Acquire the tilt angle of the moving object in the target direction, and calculate a second distance value from the moving object to the object to be tested according to the acquired tilt angle and the first distance value.

The tilt angle a can be sensed by sensors mounted on the moving object, such as a gyroscope or an acceleration sensor. The tilt angle a may refer to the angle between the moving direction of the moving object and the horizontal plane, i.e., the angle between the target direction and the horizontal plane. For example, when the aircraft flies forward, it may tilt as shown in FIG. 2; the relevant sensors in the aircraft can sense this tilt and rotation to obtain the tilt angle a.

After the first distance value D1 and the tilt angle a are obtained, a trigonometric function can be applied to D1 and a to correct the first distance value D1 and obtain the second distance value D2.

S103: Determine the second distance value as the actual distance between the moving object and the object to be tested.

After the actual distance between the moving object and the object to be tested is determined, it can be output directly to a connected controller so that the controller controls the movement of the moving object according to it, or it can be output for display to the user.

The movement of the moving object can also be controlled directly according to the actual distance. After the corrected, more accurate actual distance D2 is obtained, the moving object is controlled based on D2 to complete movement operations such as obstacle avoidance and tracking/monitoring. For example, when the actual distance D2 is less than a distance threshold, the moving object needs to stop to avoid hitting the object to be tested, or a new route needs to be planned to bypass it; when the actual distance D2 is greater than the distance threshold, the moving object needs to keep moving so that the object to be tested always stays within the distance threshold, achieving the purpose of monitoring.

It should be noted that the visual or distance sensors mentioned above generally detect the distance between the corresponding sensor and the object to be tested, so after the sensed data is processed to obtain a distance value, a fixed offset such as D may be added. Of course, the distance value calculated from the sensing data may also be used directly as the first distance value D1 between the moving object and the object to be tested, with the distance threshold configured based on the fixed offset only during subsequent movement control.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
FIG. 3 is a schematic flowchart of another distance detection method for a moving object according to an embodiment of the present invention. The method may be executed by a processor, specifically by the movement controller of the moving object, for example the flight controller of an unmanned aerial vehicle. Specifically, the method includes:

S301: Acquire distance sensing data sensed by distance sensing, and calculate a distance value from the distance sensing data.

Distance sensing includes using the sensing data of a distance sensor, such as an ultrasonic, infrared, radar, and/or laser sensor.

S302: Determine whether the calculated distance value is less than a preset distance threshold.

The distance threshold is set according to the effective detection range of the distance sensor, such as an ultrasonic sensor. Under current conditions, the effective detection range of ultrasonic sensing is about 3 m, so when an ultrasonic sensor is used as the distance sensor, the preset distance threshold is chosen as 3 m. When the distance sensed by ultrasonic distance sensing is less than 3 m, the ultrasonically sensed distance value is used as the first distance value, i.e., S303 below is executed; otherwise, the visual sensor is used, and S304 below is executed.

S303: If the calculated distance value is less than the preset distance threshold, use that distance value as the first distance value.

S303: If the calculated distance value is greater than the preset distance threshold, acquire the first distance value between the moving object and the object to be tested by visual sensing.

S304: Acquire the tilt angle of the moving object in the target direction, and calculate a second distance value from the moving object to the object to be tested according to the acquired tilt angle and the first distance value.

The tilt angle can be sensed by sensors mounted on the moving object, such as a gyroscope or an acceleration sensor, and may refer to the angle between the moving direction of the moving object and the horizontal plane, i.e., the angle between the target direction and the horizontal plane. For example, when the aircraft flies forward, it may tilt as shown in FIG. 2; the relevant sensors in the aircraft can sense this tilt and rotation to obtain the tilt angle.

After the first distance value and the tilt angle are obtained, a trigonometric function can be applied to them to correct the first distance value and obtain the second distance value: D2 = cos(a) * D1, where D2 is the second distance value, D1 is the first distance value, and a is the tilt angle.

S305: Determine the second distance value as the actual distance between the moving object and the object to be tested.

Specifically, the movement of the moving object can be controlled directly according to the actual distance. Controlling the movement of the moving object according to the actual distance may include performing an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes limiting the moving speed and evading the object to be tested.

Based on the acquired actual distance, the closer the moving object gets to the object to be tested, the slower it moves, down to a set minimum speed. When the moving object moves at high speed and an object to be tested is detected ahead, the current moving speed and the distance required to brake the moving object to a stop at that speed are known. When the distance to the object to be tested is less than the braking distance plus a safety-distance threshold, the moving object enters an emergency-braking state to prevent the aircraft from hitting the obstacle. When the moving object detects an object to be tested ahead within the safe distance (less than the distance threshold), the aircraft automatically retreats beyond the safe distance, and the closer the object, the faster the retreat.

Controlling the movement of the moving object according to the actual distance may further include performing a target tracking operation according to the actual distance, where the target tracking operation includes controlling the moving object to move so that the distance between the moving object and the object to be tested stays within a constrained tracking distance threshold.
FIG. 4 is a schematic flowchart of a method for acquiring a distance value according to an embodiment of the present invention. In this embodiment, the first distance value mentioned in the foregoing embodiments is calculated based on a binocular vision sensor. Specifically, the method includes:

S401: Acquire at least two images in the target direction.

FIG. 5 shows a schematic structural view of a sensor assembly capable of sensing distance, which includes two visual sensors, camera 501 and camera 502, as well as a distance sensor consisting of an ultrasonic transmitting probe 503 and an ultrasonic receiving probe 504. The sensor assembly may be disposed on each face of the moving object to detect, as needed, the distance to the object to be tested in each possible direction of motion of the moving object.

From the images acquired by camera 501 and camera 502, for each pixel in the image from either camera, the corresponding pixel can be found in the image from the other camera. The first distance value can then be calculated from the two corresponding pixels and their coordinates in the respective images, the focal length of the cameras, and the distance between the two cameras.

S402: Determine feature points on the at least two images respectively, compare them, and determine the feature points associated with each other between the at least two images.

On the images acquired by binocular vision, pixels with salient features can be extracted. Feature points can be extracted with algorithms such as FAST (Features from Accelerated Segment Test), SIFT (Scale-Invariant Feature Transform), or SURF (Speeded-Up Robust Features). Taking FAST as an example, the feature points computed by the FAST algorithm are described with the BRIEF (Binary Robust Independent Elementary Features) operator.

The feature points associated between the two images are determined by binocular feature-point matching: based on the BRIEF descriptors of the feature points computed by the FAST algorithm, the similarity of the BRIEF descriptors of the pixels in the two images is compared to obtain the matched feature points. A matched pair consists of the pixels of one measured point on the object to be tested as it appears in each of the two binocular images.

S403: Calculate the first distance value between the moving object and the object to be tested based on the determined feature points.

After the matched feature points are determined, the distance of each feature point is calculated by triangulation, from which the first distance value is obtained. FIG. 6 and FIG. 7 show the principle of binocular vision ranging. The distance from the moving object to a measured point P is calculated as Z = fT/(x_l - x_r), where Z is the distance to be calculated, f is the focal length of the camera, T is the distance between the two cameras, and x_l and x_r are the coordinates of the projections of P on the two images, i.e., the coordinates of the matched feature points on the respective images. After the distances of multiple measured points are calculated, the first distance value is obtained by averaging or a similar method.

Further, to reduce the amount of computation for matching feature points, the regions of the two images can be screened. Specifically, determining the feature points on the at least two images may include: acquiring the tilt angle of the moving object in the target direction; determining effective areas on the at least two images according to the acquired angle; and determining feature points within the effective area of each corresponding image.

The tilt angle of the moving object can be acquired by an IMU (Inertial Measurement Unit). When the camera tilts downward with the moving object, the captured image covers a region in front of and below the moving object, and the region ahead of the moving object appears in the upper part of the image. After the tilt angle is obtained from the IMU, the proportion of the image occupied by the region ahead of the moving object can be estimated, thereby determining the effective area for the calculation. If the camera tilts upward with the moving object, the objects ahead of the moving object lie in the lower part of the image, and the effective area can likewise be derived from the IMU tilt angle.

In addition, a set of relatively sparse feature points can be obtained by algorithms such as FAST described above, and the first distance value between the moving object and the object to be tested can be calculated from the determined sparse feature points.

A set of dense feature points can also be obtained by block matching combined with a speckle filter, and the first distance value between the moving object and the object to be tested can be calculated from the determined dense feature points.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
The distance detection device and the aircraft of the embodiments of the present invention are described in detail below.

FIG. 8 is a schematic structural diagram of a distance detection device for a moving object according to an embodiment of the present invention. The device may be disposed in a processor or in the movement controller of a moving object, for example in the flight controller of an aircraft. Specifically, the device includes:

a detecting module 1, configured to detect a first distance value between the moving object and an object to be tested, the object to be tested being in the target direction of the moving object;

a processing module 2, configured to acquire the tilt angle of the moving object in the target direction, calculate a second distance value from the moving object to the object to be tested according to the acquired angle and the first distance value, and determine the second distance value as the actual distance between the moving object and the object to be tested.

Optionally, the device of this embodiment may further include a control module 3, configured to control the movement of the moving object according to the actual distance.

The target direction may refer to the direction directly ahead of the moving object's motion. The moving object may be equipped with monocular or binocular vision sensors, distance sensors composed of ultrasonic, infrared, radar, and/or laser sensors, and other sensors capable of sensing distance. The detecting module 1 can calculate the first distance value from the data sensed by the aforementioned visual sensors and/or distance sensors.

The detecting module 1 can obtain the distances from the moving object to multiple feature points on a plane of the object to be tested. The first distance value is therefore a representative value: it may be the average of the distances to multiple points, or the distance of the feature point whose features are the most salient among them and whose calculated distance is the most accurate.

Specifically, the detecting module 1 may detect and acquire the first distance value between the moving object and the object to be tested according to distance sensing data obtained by distance sensing, or according to image sensing data obtained by visual sensing.

The processing module 2 may obtain the tilt angle from the sensing data of sensors mounted on the moving object, such as a gyroscope or an acceleration sensor. The tilt angle may refer to the angle between the moving direction of the moving object and the horizontal plane, i.e., the angle between the target direction and the horizontal plane. For example, when the aircraft flies forward, it may tilt; the processing module 2 can sense this tilt and rotation through the relevant sensors in the aircraft and thereby obtain the tilt angle.

After the first distance value and the tilt angle are obtained, the processing module 2 can apply a trigonometric function to them to correct the first distance value and obtain the second distance value, i.e., the actual distance between the moving object and the object to be tested.

After the corrected, more accurate actual distance is obtained, the processing module 2 can output it directly to other controllers or to the user, or the control module 3 can directly control the moving object based on the actual distance to complete movement operations such as obstacle avoidance and tracking/monitoring. For example, when the actual distance is less than a distance threshold, the moving object needs to stop to avoid hitting the object to be tested, or a new route needs to be planned to bypass it; when the actual distance is greater than the distance threshold, the moving object needs to keep moving so that the object to be tested always stays within the distance threshold, achieving the purpose of monitoring.

It should be noted that the visual or distance sensors mentioned above generally detect the distance between the corresponding sensor and the object to be tested, so after the sensed data is processed to obtain a distance value, a fixed offset may be added. Of course, the distance value calculated from the sensing data may also be used directly as the first distance value between the moving object and the object to be tested, with the distance threshold configured based on the fixed offset only during subsequent movement control.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
Further, an embodiment of the present invention also provides another distance detection device for a moving object. The device may be disposed in a processor or in the movement controller of a moving object, for example in the flight controller of an aircraft. Specifically, the device includes the detecting module 1, processing module 2, and control module 3 described above.

Further optionally, in this embodiment, the detecting module 1 is specifically configured to detect and acquire the first distance value between the moving object and the object to be tested according to distance sensing data obtained by distance sensing. The distance sensors include ultrasonic, radar, infrared, and other distance sensors.

Further optionally, the detecting module 1 is specifically configured to detect and acquire the first distance value between the moving object and the object to be tested according to image sensing data obtained by visual sensing. Visual sensing includes a visual sensing system based on two cameras.

Further optionally, referring to FIG. 9, when the distance is sensed visually, the detecting module 1 may specifically include:

an acquiring unit 11, configured to acquire at least two images in the target direction;

a comparing unit 12, configured to determine feature points on the at least two images respectively, compare the feature points, and determine feature points associated with each other between the at least two images;

a determining unit 13, configured to calculate the first distance value between the moving object and the object to be tested based on the determined feature points.

Specifically, the comparing unit 12 is configured to acquire the tilt angle of the moving object in the target direction, determine effective areas on the at least two images according to the acquired angle, and determine feature points within the effective area of each corresponding image.

Further optionally, in this embodiment the first distance value may be the average distance value between the moving object and the object to be tested.

Further optionally, the detecting module 1 is specifically configured to detect the first distance value by both distance sensing and visual sensing.

Further optionally, the detecting module 1 may decide which distance detection mode to use according to the distance detected by the distance sensor and/or the visual sensor. Referring to FIG. 10, the detecting module 1 may specifically include:

a first acquiring unit 14, configured to acquire distance sensing data sensed by distance sensing, and calculate a distance value from the distance sensing data;

a distance value determining unit 15, configured to use the calculated distance value as the first distance value if it is less than a preset distance threshold;

a second acquiring unit 16, configured to acquire the first distance value between the moving object and the object to be tested by visual sensing if the calculated distance value is greater than the preset distance threshold.

Further optionally, the control module 3 is specifically configured to perform an obstacle avoidance operation according to the actual distance, where the obstacle avoidance operation includes limiting the moving speed and evading the object to be tested.

Further optionally, the control module 3 is specifically configured to perform a target tracking operation according to the actual distance, where the target tracking operation includes controlling the moving object to move so that the distance between the moving object and the object to be tested stays within a constrained tracking distance threshold.

It should be noted that, for the specific implementation of the functional modules and units in the embodiments of the present invention, reference may be made to the descriptions of the related methods or functional modules elsewhere herein.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
FIG. 11 is a schematic structural diagram of an aircraft according to an embodiment of the present invention. The aircraft of this embodiment includes conventional components such as an airframe and a power supply, and further includes a power device 100, a flight controller 200, and a sensor 300.

The sensor 300 is configured to sense distance data between the aircraft and an object to be tested.

The flight controller 200 is configured to determine, according to the distance data from the sensor 300, a first distance value between the aircraft and the object to be tested, the object to be tested being in the target direction of the aircraft; acquire the tilt angle of the aircraft in the target direction; and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value.

The flight controller 200 is further configured to issue a control command to the power device 100 to control the power device 100 to move the aircraft according to the second distance value.

Further, for the specific implementation of the flight controller 200 and the sensor 300 in this embodiment, reference may be made to the descriptions of the related method steps and functional modules in the embodiments corresponding to FIG. 1 to FIG. 10.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.

FIG. 12 is a schematic structural diagram of another aircraft according to an embodiment of the present invention. The aircraft of this embodiment includes conventional components such as an airframe and a power supply, and further includes a power device 400, a flight controller 500, and a sensor 600, wherein:

the sensor 600 is configured to detect a first distance value between the aircraft and an object to be tested, the object to be tested being in the target direction of the moving object;

the flight controller 500 is configured to acquire the tilt angle of the aircraft in the target direction, and calculate a second distance value from the aircraft to the object to be tested according to the acquired tilt angle and the first distance value;

the flight controller 500 is further configured to issue a control command to the power device 400 to control the power device 400 to move the aircraft according to the second distance value.

Further, for the specific implementation of the flight controller 500 and the sensor 600 in this embodiment, reference may be made to the descriptions of the related method steps and functional modules in the embodiments corresponding to FIG. 1 to FIG. 10.

This embodiment of the present invention can correct the directly detected distance value to obtain a more accurate distance value, so that movement control of the moving object can be completed more accurately.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative; the division into modules or units is only a division by logical function, and other divisions are possible in actual implementations: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.

The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.

If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes a number of instructions for causing a computer processor to perform all or part of the steps of the methods described in the various embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

The above are only embodiments of the present invention and are not intended to limit the patent scope of the present invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, and any direct or indirect application in other related technical fields, shall likewise fall within the patent protection scope of the present invention.

Claims (26)

  1. A distance detection method for a moving object, comprising:
    detecting a first distance value between the moving object and an object to be measured, the object to be measured being in a target direction of the moving object;
    obtaining a tilt angle of the moving object in the target direction, and calculating a second distance value from the moving object to the object to be measured according to the obtained tilt angle and the first distance value; and
    determining the second distance value as an actual distance between the moving object and the object to be measured.
  2. The method of claim 1, further comprising:
    controlling movement of the moving object according to the second distance value.
  3. The method of claim 1, wherein detecting the first distance value between the moving object and the object to be measured comprises:
    detecting and obtaining the first distance value between the moving object and the object to be measured according to distance sensing data obtained by distance sensing.
  4. The method of claim 1, wherein detecting the first distance value between the moving object and the object to be measured comprises:
    detecting and obtaining the first distance value between the moving object and the object to be measured according to image sensing data obtained by visual sensing.
  5. The method of claim 4, wherein detecting and obtaining the first distance value between the moving object and the object to be measured according to image sensing data obtained by visual sensing comprises:
    acquiring at least two images in the target direction;
    determining feature points in each of the at least two images, comparing the feature points, and determining mutually associated feature points between the at least two images; and
    calculating the first distance value from the moving object to the object to be measured based on the determined feature points.
  6. The method of claim 5, wherein determining feature points in each of the at least two images comprises:
    obtaining the tilt angle of the moving object in the target direction; and
    determining effective regions in the at least two images according to the obtained angle, and determining the feature points in the effective region of each corresponding image.
  7. The method of claim 5 or 6, wherein the determined feature points are a set of sparse feature points, and calculating the first distance value from the moving object to the object to be measured based on the determined feature points comprises:
    calculating the first distance value from the moving object to the object to be measured based on the determined sparse feature points.
  8. The method of claim 5 or 6, wherein the determined feature points are a set of dense feature points, and calculating the first distance value from the moving object to the object to be measured based on the determined feature points comprises:
    calculating the first distance value from the moving object to the object to be measured based on the determined dense feature points.
  9. The method of any one of claims 1 to 6, wherein the first distance value is an average distance value between the moving object and the object to be measured.
  10. The method of claim 1, wherein detecting the first distance value between the moving object and the object to be measured comprises:
    obtaining distance sensing data sensed by distance sensing, and calculating a distance value from the distance sensing data;
    if the calculated distance value is less than a preset distance threshold, taking the calculated distance value as the first distance value; and
    if the calculated distance value is greater than the preset distance threshold, obtaining the first distance value between the moving object and the object to be measured by visual sensing.
  11. The method of any one of claims 1 to 10, wherein controlling movement of the moving object according to the actual distance comprises: performing an obstacle avoidance operation according to the actual distance, wherein the obstacle avoidance operation comprises operations of limiting a moving speed and avoiding the object to be measured.
  12. The method of any one of claims 1 to 10, wherein controlling movement of the moving object according to the actual distance comprises: performing a target tracking operation according to the actual distance, wherein the target tracking operation comprises controlling the moving object to move such that the distance between the moving object and the object to be measured is within a constrained tracking distance threshold.
  13. A distance detection apparatus for a moving object, comprising:
    a detection module configured to detect a first distance value between the moving object and an object to be measured, the object to be measured being in a target direction of the moving object; and
    a processing module configured to obtain a tilt angle of the moving object in the target direction, calculate a second distance value from the moving object to the object to be measured according to the obtained angle and the first distance value, and determine the second distance value as an actual distance between the moving object and the object to be measured.
  14. The apparatus of claim 13, further comprising:
    a control module configured to control movement of the moving object according to the actual distance.
  15. The apparatus of claim 13, wherein:
    the detection module is specifically configured to detect and obtain the first distance value between the moving object and the object to be measured according to distance sensing data obtained by distance sensing.
  16. The apparatus of claim 13, wherein:
    the detection module is specifically configured to detect and obtain the first distance value between the moving object and the object to be measured according to image sensing data obtained by visual sensing.
  17. The apparatus of claim 16, wherein the detection module comprises:
    an acquisition unit configured to acquire at least two images in the target direction;
    a comparison unit configured to determine feature points in each of the at least two images, compare the feature points, and determine mutually associated feature points between the at least two images; and
    a determination unit configured to calculate the first distance value from the moving object to the object to be measured based on the determined feature points.
  18. The apparatus of claim 17, wherein:
    the comparison unit is specifically configured to obtain the tilt angle of the moving object in the target direction, determine effective regions in the at least two images according to the obtained angle, and determine the feature points in the effective region of each corresponding image.
  19. The apparatus of claim 17 or 18, wherein the comparison unit determines, by comparison, a set of sparse feature points; and
    the determination unit is specifically configured to calculate the first distance value from the moving object to the object to be measured based on the determined sparse feature points.
  20. The apparatus of claim 17 or 18, wherein the comparison unit determines, by comparison, a set of dense feature points; and
    the determination unit is specifically configured to calculate the first distance value from the moving object to the object to be measured based on the determined dense feature points.
  21. The apparatus of any one of claims 13 to 20, wherein the first distance value is an average distance value between the moving object and the object to be measured.
  22. The apparatus of claim 13, wherein the detection module comprises:
    a first acquisition unit configured to obtain distance sensing data sensed by distance sensing and calculate a distance value from the distance sensing data;
    a distance value determination unit configured to take the calculated distance value as the first distance value if the calculated distance value is less than a preset distance threshold; and
    a second acquisition unit configured to obtain the first distance value between the moving object and the object to be measured by visual sensing if the calculated distance value is greater than the preset distance threshold.
  23. The apparatus of any one of claims 13 to 22, wherein the control module is specifically configured to perform an obstacle avoidance operation according to the actual distance, wherein the obstacle avoidance operation comprises operations of limiting a moving speed and avoiding the object to be measured.
  24. The apparatus of any one of claims 13 to 22, wherein the control module is specifically configured to perform a target tracking operation according to the actual distance, wherein the target tracking operation comprises controlling the moving object to move such that the distance between the moving object and the object to be measured is within a constrained tracking distance threshold.
  25. An aircraft, comprising a power device, a flight controller, and a sensor, wherein:
    the sensor is configured to sense distance data between the aircraft and an object to be measured;
    the flight controller is configured to determine, from the distance data of the sensor, a first distance value between the aircraft and the object to be measured, the object to be measured being in a target direction of the aircraft; to obtain a tilt angle of the aircraft in the target direction; and to calculate a second distance value from the aircraft to the object to be measured according to the obtained tilt angle and the first distance value; and
    the flight controller is further configured to issue a control instruction to the power device, so as to control the power device to move the aircraft according to the second distance value.
  26. An aircraft, comprising a power device, a flight controller, and a sensor, wherein:
    the sensor is configured to detect a first distance value between the aircraft and an object to be measured, the object to be measured being in a target direction of the aircraft;
    the flight controller is configured to obtain a tilt angle of the aircraft in the target direction, and to calculate a second distance value from the aircraft to the object to be measured according to the obtained tilt angle and the first distance value; and
    the flight controller is further configured to issue a control instruction to the power device, so as to control the power device to move the aircraft according to the second distance value.
PCT/CN2015/083868 2015-07-13 2015-07-13 Method and device for movable object distance detection, and aerial vehicle Ceased WO2017008224A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201580004349.5A CN106104203B (zh) 2015-07-13 2015-07-13 Method and device for movable object distance detection, and aerial vehicle
PCT/CN2015/083868 WO2017008224A1 (zh) 2015-07-13 2015-07-13 Method and device for movable object distance detection, and aerial vehicle
US15/870,174 US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle
US16/817,205 US20200208970A1 (en) 2015-07-13 2020-03-12 Method and device for movable object distance detection, and aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/083868 WO2017008224A1 (zh) 2015-07-13 2015-07-13 Method and device for movable object distance detection, and aerial vehicle

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/870,174 Continuation US10591292B2 (en) 2015-07-13 2018-01-12 Method and device for movable object distance detection, and aerial vehicle

Publications (1)

Publication Number Publication Date
WO2017008224A1 true WO2017008224A1 (zh) 2017-01-19

Family

ID=57216277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/083868 Ceased WO2017008224A1 (zh) 2015-07-13 2015-07-13 Method and device for movable object distance detection, and aerial vehicle

Country Status (3)

Country Link
US (2) US10591292B2 (zh)
CN (1) CN106104203B (zh)
WO (1) WO2017008224A1 (zh)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106104203B (zh) 2015-07-13 2018-02-02 SZ DJI Technology Co., Ltd. Method and device for movable object distance detection, and aerial vehicle
WO2018090205A1 * 2016-11-15 2018-05-24 SZ DJI Technology Co., Ltd. Method and system for image-based object detection and corresponding movement adjustment maneuvers
CN106842177B * 2016-12-30 2021-03-09 Ewatt Technology Co., Ltd. Method and device for obtaining standard obstacle avoidance distance
CN106774413A * 2016-12-30 2017-05-31 Ewatt Technology Co., Ltd. Method and system applied to flight obstacle avoidance
CN106774407A * 2016-12-30 2017-05-31 Ewatt Technology Co., Ltd. Obstacle avoidance method and device
CN106871906B * 2017-03-03 2020-08-28 Southwest University Navigation method and device for the blind, and terminal apparatus
CN110268355A * 2017-05-03 2019-09-20 Launch Tech Co., Ltd. Control method and device for intelligent survey equipment, and storage medium
JP6795457B2 * 2017-06-02 2020-12-02 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and vehicle control program
WO2018227350A1 * 2017-06-12 2018-12-20 SZ DJI Technology Co., Ltd. Return control method for unmanned aerial vehicle, unmanned aerial vehicle, and machine-readable storage medium
CN109213187A * 2017-06-30 2019-01-15 Beijing PowerVision Technology Co., Ltd. Displacement determination method and device for unmanned aerial vehicle, and unmanned aerial vehicle
CN108184096B * 2018-01-08 2020-09-11 Beijing Aiensi Network Technology Co., Ltd. Panoramic monitoring device, system, and method for airport runway and taxiway areas
WO2019144286A1 * 2018-01-23 2019-08-01 SZ DJI Technology Co., Ltd. Obstacle detection method, mobile platform, and computer-readable storage medium
KR102061514B1 * 2018-03-02 2020-01-02 Mando Corporation Object detection apparatus and method
CN108820215B * 2018-05-21 2021-10-01 Nanchang Hangkong University Automatic airdrop unmanned aerial vehicle that autonomously searches for targets
CN109031197B * 2018-06-19 2019-07-09 Harbin Institute of Technology Direct radiation source localization method based on an over-the-horizon propagation model
EP3837492A1 * 2018-08-21 2021-06-23 SZ DJI Technology Co., Ltd. Distance measuring method and device
CN109202903B * 2018-09-13 2021-05-28 Henan Mechanical and Electrical Vocational College Method for calibrating the conveyor chain CountsPerMeter parameter and the base coordinate system of a sorting robot workstation
CN108873001B * 2018-09-17 2022-09-09 Jiangsu Jinzhi Technology Co., Ltd. Method for accurately evaluating robot positioning accuracy
WO2020106984A1 * 2018-11-21 2020-05-28 Eagle View Technologies, Inc. Navigating unmanned aircraft using pitch
CN109435886B * 2018-12-05 2021-02-23 Guangxi Agricultural Vocational College Auxiliary detection method for safe automobile driving
CN109866230A * 2019-01-17 2019-06-11 Shenzhen OneConnect Smart Technology Co., Ltd. Customer service robot control method and device, computer device, and storage medium
DE112020001434T5 * 2019-03-27 2021-12-23 Sony Group Corporation Data processing device, data processing method, and program
WO2021195886A1 * 2020-03-30 2021-10-07 SZ DJI Technology Co., Ltd. Distance determination method, movable platform, and computer-readable storage medium
TWI728770B * 2020-04-01 2021-05-21 Industrial Technology Research Institute Aerial vehicle and orientation detection method using same
US11946771B2 2020-04-01 2024-04-02 Industrial Technology Research Institute Aerial vehicle and orientation detection method using same
WO2021237535A1 * 2020-05-27 2021-12-02 SZ DJI Technology Co., Ltd. Collision handling method, device, and medium
CN111897356A * 2020-08-10 2020-11-06 Autel Robotics Co., Ltd. Obstacle avoidance method and device, and unmanned aerial vehicle
US11810346B2 * 2021-06-07 2023-11-07 Goodrich Corporation Land use for target prioritization

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101968354A * 2010-09-29 2011-02-09 Tsinghua University Ranging method for unmanned helicopter based on laser detection and image recognition
EP2511781A1 * 2011-04-14 2012-10-17 Hexagon Technology Center GmbH System and method for controlling an unmanned aerial vehicle
CN104111058A * 2013-04-16 2014-10-22 AutoChips (Hefei) Co., Ltd. Vehicle distance measuring method and device, and vehicle relative speed measuring method and device
CN104656665A * 2015-03-06 2015-05-27 Electric Power Research Institute of Yunnan Power Grid Co., Ltd. Novel general obstacle avoidance module and procedure for unmanned aerial vehicle
CN104730533A * 2015-03-13 2015-06-24 Chen Aishan Mobile terminal, and ranging method and system based on mobile terminal

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2985329B1 * 2012-01-04 2015-01-30 Parrot Method for intuitive piloting of a drone by means of a remote-control device.
US9769365B1 * 2013-02-15 2017-09-19 Red.Com, Inc. Dense field imaging
CN105247593B * 2014-04-17 2017-04-19 SZ DJI Technology Co., Ltd. Flight control for flight-restricted regions
US10453348B2 * 2015-06-15 2019-10-22 ImageKeeper LLC Unmanned aerial vehicle management
CN106104203B (zh) * 2015-07-13 2018-02-02 SZ DJI Technology Co., Ltd. Method and device for movable object distance detection, and aerial vehicle
EP3328731B1 * 2015-07-28 2020-05-06 Margolin, Joshua Multi-rotor uav flight control method
US10017237B2 * 2015-12-29 2018-07-10 Qualcomm Incorporated Unmanned aerial vehicle structures and methods
KR20170138797A * 2016-06-08 2017-12-18 LG Electronics Inc. Drone
JP2018146546A * 2017-03-09 2018-09-20 Aerosense Inc. Information processing system, information processing device, and information processing method

Also Published As

Publication number Publication date
US20180156610A1 (en) 2018-06-07
US20200208970A1 (en) 2020-07-02
US10591292B2 (en) 2020-03-17
CN106104203A (zh) 2016-11-09
CN106104203B (zh) 2018-02-02

Similar Documents

Publication Publication Date Title
WO2017008224A1 (zh) Method and device for movable object distance detection, and aerial vehicle
WO2016074169A1 (zh) Method and detection device for detecting target object, and robot
WO2018124688A1 (ko) Drone control device for collision avoidance
WO2015194868A1 (ko) Device for controlling travel of mobile robot equipped with wide-angle camera, and method therefor
WO2022039404A1 (ko) Wide-viewing-angle stereo camera device and depth image processing method using same
WO2015093828A1 (ko) Stereo camera and vehicle equipped with same
WO2019135437A1 (ko) Guide robot and operation method thereof
WO2020046038A1 (ko) Robot and control method therefor
WO2020141900A1 (ko) Mobile robot and driving method thereof
WO2015093823A1 (ko) Vehicle driving assistance device and vehicle equipped with same
WO2021157904A1 (en) Electronic apparatus and controlling method thereof
WO2020138950A1 (ko) Electronic device and control method therefor
WO2022035054A1 (ko) Robot and control method therefor
WO2015147371A1 (ko) Device and method for correcting vehicle position, vehicle position correction system using same, and vehicle capable of unmanned operation
WO2019194544A1 (en) Method and system for handling 360 degree image content
EP3545387A1 (en) Method and device for providing an image
WO2017034321A1 (ko) Technique for supporting photography in device having camera, and device therefor
WO2018182066A1 (ko) Method and device for applying dynamic effect to image
WO2020080734A1 (ko) Face recognition method and face recognition device
WO2016086380A1 (zh) Object detection method and device, remote-control mobile apparatus, and aircraft
WO2020189831A1 (ко) Method for monitoring and controlling autonomous vehicle
WO2019004531A1 (ко) User signal processing method and device for performing such method
WO2020204291A1 (ко) Electronic device and heat control method therefor
WO2022114410A1 (ко) Electronic device for efficiently storing multi-channel images using ultrasonic sensor, and operation method thereof
WO2022055117A1 (ко) Electronic device and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15897952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15897952

Country of ref document: EP

Kind code of ref document: A1