WO2021078663A1 - Aerial vehicle detection - Google Patents
Aerial vehicle detection
- Publication number
- WO2021078663A1 (PCT/EP2020/079308; EP2020079308W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- aerial vehicle
- aircraft
- captured
- image sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D45/00—Aircraft indicators or protectors not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
- G01C11/08—Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/04—Systems determining the presence of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to detection of an aerial vehicle. Particularly, although not exclusively, the present invention relates to the detection of an aerial vehicle by an aircraft.
- UAVs (unmanned aerial vehicles)
- a first aspect of the present invention provides a method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight.
- the method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image; and generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
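By way of illustration only, a minimal sketch of this two-sensor confirmation logic is given below. It is not the claimed method itself: `detect_candidates`, `Candidate` and `generate_signal` are hypothetical stand-ins for whatever image processing and signalling the aircraft actually uses.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Candidate:
    bbox: Tuple[int, int, int, int]   # (x, y, w, h) in image coordinates
    score: float                      # detector confidence


def detect_candidates(image) -> List[Candidate]:
    """Hypothetical detector wrapping whatever image processing is actually used."""
    raise NotImplementedError


def detect_external_aerial_vehicle(first_image, second_image) -> Optional[Candidate]:
    """Confirm a candidate only if it is seen in both sensors' captured images."""
    first_hits = detect_candidates(first_image)       # search the target space of the first image
    if not first_hits:
        return None                                   # nothing suspicious: no need to check further
    second_hits = detect_candidates(second_image)     # second field of view encompasses the target space
    if not second_hits:
        return None                                   # not verified by the second sensor
    # Candidate present in both captured images: treat as a confirmed sighting.
    return max(first_hits, key=lambda c: c.score)


def generate_signal(candidate: Candidate) -> dict:
    """Hypothetical signal payload indicating an external aerial vehicle is present."""
    return {"event": "external_aerial_vehicle_present",
            "bbox": candidate.bbox,
            "score": candidate.score}
```

In practice, as described later in the examples, the second image would only be searched within the region corresponding to the shared target space rather than across the whole frame.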
- the images are processed to identify the presence of an external aerial vehicle candidate by comparing the images to one or more stored representations of existing aerial vehicles.
- the one or more stored representations of aerial vehicles are determined by a classifier which is trained to recognise different types of aerial vehicles using supervised training procedures based on images from a library of aerial vehicle images. A number of such libraries already exist, which may reduce the time and complexity of training the image processor.
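As an illustration only, a supervised training step of the kind described could look like the following sketch. The library path, the per-class folder layout, the image format and the scikit-learn support vector classifier are all assumptions and not a description of the classifier actually deployed.

```python
import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def load_library(root: str, size=(64, 64)):
    """Load a labelled library in which each sub-folder name is an aerial-vehicle class."""
    images, labels = [], []
    for class_dir in Path(root).iterdir():
        if not class_dir.is_dir():
            continue
        for path in class_dir.glob("*.png"):
            img = Image.open(path).convert("L").resize(size)          # grayscale thumbnail
            images.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
            labels.append(class_dir.name)
    return np.stack(images), np.array(labels)


X, y = load_library("uav_library/")                                   # hypothetical library path
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)       # stand-in classifier
print("held-out accuracy:", clf.score(X_test, y_test))
```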
- the image data is captured using a 360° image sensor. This may provide an increased field of view compared to regular image sensors and may result in a larger area around the aircraft being captured.
- the method may comprise receiving an indication that the external aerial vehicle candidate is present based on an external image captured by a ground-based image sensor.
- a ground-based image sensor will provide additional verification that the UAV is present in the vicinity of the aircraft in flight.
- processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image is triggered in response to a determination that an external aerial vehicle candidate is present in a target space of the first captured image.
- This may provide the advantage of minimising the image sensor processing power and associated resources used unless an external aerial vehicle is suspected or detected by another image sensor.
- a location of the external aerial vehicle is triangulated using the first and second captured images.
- this provides location data about the external aerial vehicle which can be used to help minimise the risk the external aerial vehicle poses.
- a second aspect of the present invention provides a machine-readable storage medium storing instructions executable by a processor to implement the method according to the first aspect of the present invention.
- a third aspect of the present invention provides a system for detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight.
- the system comprises a first image sensor device having a first field of view to capture a first image comprising an external aerial vehicle candidate in a target space of the first image that is in the vicinity of an aircraft; a second image sensor device having a second field of view, which encompasses the target space, to capture a second image comprising the external aerial vehicle candidate; and a processor to generate a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
- an improved set of data for the authorities is generated with this system. This can lead to a more efficient management of the airspace in the vicinity of the detected UAV and reduce the risks that UAVs pose.
- At least one image sensor is aircraft mounted.
- external aerial vehicles in the vicinity of the aircraft are detected with the use of on-board image sensors.
- the field of view of the image sensors may be outwardly facing from the aircraft.
- At least one image sensor is ground mounted. Such an arrangement provides added protection against external aerial vehicles that are spotted near to an airfield or other ground-based locations.
- a fourth aspect of the present invention is an aircraft comprising the system according to the third aspect of the present invention.
- a fifth aspect of the present invention provides a processor, stored program code, and at least a first image sensor and a second image sensor, arranged to perform the method of the first aspect of the present invention.
- Figure 1A is an illustrative plan view of an aircraft, according to an example
- Figure 1B is an illustrative side elevation of an aircraft, according to an example
- Figure 2 is an illustrative side elevation of an aircraft in flight, according to an example
- Figure 3 is another illustrative side elevation of an aircraft in flight, according to an example
- Figure 4 illustrates two overhead images of a scene superimposed on one another, according to an example
- Figure 5 is an illustrative view of a scenario, according to an example; and
- Figure 6 is a process flow chart of a method, according to an example.
- The present invention takes into account that UAVs are readily available for anyone to purchase and that there is little guidance and there are few rules relating to their ownership or use. Where rules do exist, they may not be internationally recognised or applied. There have been reported incidents involving commercial aircraft and suspected UAVs, which have resulted in the shutdown of major airports. These are extremely disruptive incidents and come at a large cost to flight operators and flight passengers alike. There have even been incidents where the presence of a UAV has been reported, causing subsequent disruption, without the presence even being verified. Such is the seriousness of the threat UAVs pose that a mere alleged sighting can ground aircraft for long periods of time.
- Figure 1A illustrates a plan view of an aircraft 102 in flight
- Figure 1B illustrates a side elevation of the aircraft 102 in flight.
- the figures show image sensors mounted at various positions on the aircraft.
- An image sensor 104, 107 is mounted at a position on the leading edge of each horizontal stabiliser and another image sensor 106 is mounted at a position at the top of the vertical stabiliser.
- Another image sensor 108, 116 is mounted on the leading edge of each wing tip.
- An image sensor 110, 118 is mounted on the underside of each wing and an image sensor 112, 119 is mounted atop each engine.
- An additional image sensor 128 is mounted on the top of the fuselage above the cabin area.
- an image sensor is any kind of device that is able to capture an image.
- the device may operate in colour or monochrome, and may operate in visible or near IR (or IR) regions of the electromagnetic spectrum.
- Such a device typically captures and stores images digitally, and is controllable to communicate captured image data to a local or remote processor for image processing.
- Known imaging sensors, for example those used in digital cameras and adapted for use in adverse (i.e. in-flight) conditions, are suitable for use in the examples herein.
- the plurality of image sensors may be controlled by one or more processors (not pictured).
- the processor, or processors may be mounted within the fuselage of the aircraft 102.
- each image sensor may include a co-located processor that performs at least some control and/or image processing.
- the image sensors may be controlled centrally.
- the image sensors may be powered by local power connections taken from the aircraft power network. Control signals and image data may be communicated to and from image sensors via wired or wireless connections.
- the processor(s) are arranged to process images and/or video captured by the plurality of image sensors to identify external aerial vehicle candidates, such as UAVs.
- At least some of the image sensors may have a wide or a panoramic field of view, for example greater than 160° horizontally and/or greater than 75° vertically. What each image sensor can see for any given field of view is of course dictated by where the image sensor is mounted on the aircraft and in which direction it is directed. At least one of the image sensors may have a 360° field of view horizontally and 90° or greater vertically. Image sensors may be fixed, for example as applied in catadioptric cameras, and derive their wide fields of view from fixed elements, such as lenses or mirrors. Other image sensors may be movable, such as rotatable, to achieve their fields of view. In any case the image sensors may be interconnected and be in communication with one another either directly or via a central system.
- Connectivity may use a wireless protocol, such as an Internet of Things (IoT) protocol, for example Bluetooth, WiFi, ZigBee, MQTT, CoAP, DDS, NFC, Cellular, AMQP, RFID, Z-Wave, EnOcean and the like.
- the fields of view of the various image sensors mounted on the aircraft 102 overlap, giving multiple viewpoints of a vicinity or space around the aircraft 102.
- the fields of view are arranged to be outwardly-looking away from the aircraft 102, so that all regions around the aircraft are visible.
- Figures 1A and 1B illustrate a resultant field of view 126 that encompasses the entire area around the aircraft 102.
- the aircraft 102 in Figure 1B is depicted being approached by a UAV 130.
- the UAV 130 poses a threat to the safety of the aircraft and is likely to cause disruption if it is not dealt with in an efficient manner.
- Dealing with the UAV 130 may include, for example, recording its detection, location, velocity, alerting the pilot and notifying other aircraft and aircraft controllers at airports in the vicinity. This can give an improved set of data for the authorities, which can lead to a more efficient management of the airspace in the vicinity of the detected UAV.
- FIG. 2 illustrates a side elevation of the aircraft 102 in flight.
- the UAV 130 is approaching the aircraft 102.
- the UAV 130 is within a field of view 134 of the image sensor 132.
- Each of the image sensors captures images which are stored and processed in near-real-time to determine whether a UAV candidate is present.
- a UAV that is spotted by one image sensor is referred to herein as a candidate, whereas the same candidate, if it is spotted by more than one image sensor, is determined to be a UAV sighting.
- the image sensor 132 determines that there is the UAV candidate 130 in the vicinity, referred to herein as a target space, of the aircraft 102.
- the processor controlling the image sensor 132 notifies other image sensors, having overlapping fields of view with the image sensor 132, to scan their respective target areas in their captured images to determine if the UAV candidate 130 is also present therein.
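A minimal sketch of such a notification, assuming MQTT (one of the protocols listed above) and the paho-mqtt client; the broker address, topic name and payload fields are illustrative assumptions only.

```python
import json

import paho.mqtt.client as mqtt


def notify_overlapping_sensors(broker_host: str, sensor_id: str,
                               bearing_deg: float, elevation_deg: float) -> None:
    """Publish a 'candidate spotted' message so sensors with overlapping fields of view re-scan."""
    client = mqtt.Client()                       # paho-mqtt 1.x style constructor
    client.connect(broker_host, 1883)
    payload = json.dumps({
        "event": "uav_candidate",
        "source_sensor": sensor_id,              # e.g. "sensor-132"
        "bearing_deg": bearing_deg,              # direction of the candidate relative to the aircraft
        "elevation_deg": elevation_deg,
    })
    client.publish("aircraft/uav_candidate", payload)   # hypothetical topic
    client.disconnect()
```

Each receiving processor would then restrict its search to the target area implied by the reported bearing, rather than re-scanning its whole field of view.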
- FIG. 3 shows the image sensor 114 having a field of view 136 which overlaps with the field of view 134.
- the field of view 136 provides an alternate view of the UAV 130, which is used to verify or confirm that a UAV is in the vicinity of the aircraft 102. Therefore, the image sensor 114, if its captured image also comprises the UAV candidate, is used to confirm the presence of the UAV 130. Based on the successful detection and confirmation of the UAV 130, an output indicating the presence of the UAV (as opposed to a 'UAV candidate') 130 is generated, along with any other pertinent details that have been determined, such as size, distance and velocity.
- any of the plurality of image sensors with overlapping fields of view may be used to confirm that the UAV 130 is in the external area of the aircraft 102.
- any two of the plurality of image sensors may be used to triangulate the location of the UAV 130.
- Figure 3 illustrates other objects beneath the aircraft 102.
- the objects include a first tree 140, a second tree 142 and a car 144.
- the car 144 may be stationary or moving.
- the objects are in the field of view of at least one image sensor.
- FIG. 4 illustrates two overhead images of the ground 138 captured at two different times.
- the images are superimposed over one another.
- the two images were captured, one after the other, for instance 0.2s apart, by the same aircraft-mounted image sensor.
- Each image includes a first tree 141, a second tree 142, a road 143, a vehicle 144 on the road and a UAV 130.
- Each object in the image is designated with a first reference numeral (e.g. 141 for the first tree) that denotes the position of the object in the first image, and a second reference numeral (e.g. 141′ for the first tree) that denotes the position of the object in the second image.
- the objects are at least initially assumed to be on the ground 138.
- Non-moving objects may be identified by reference to the respective locations in the consecutive images and with knowledge of the ground velocity and altitude of the aircraft.
- non-moving objects may also be identified by reference to libraries of similar images (e.g. for roads and trees), by using a trained classifier, or by reference to satellite images and maps of the respective landscape.
- the processor is arranged to compare the two images and determine that certain matching objects have not moved (e.g. the trees and the road), whereas certain other objects (e.g. the car 144 and the UAV 130) have moved.
- the speed or velocity of the moving objects can be determined by reference to their different positions in the images relative to the static objects, and with knowledge of the ground velocity and altitude of the aircraft. For example, d1 is estimated to be about 1.8 m, whereas d2 is estimated to be about 6x that distance, or 10.8 m.
- a car travelling 1.8 m in 0.2 seconds has a ground speed of 32.4 km/h. Meanwhile, the UAV has a calculated apparent ground speed of 194.4 km/h.
- the processor is arranged to determine that an object moving at such a high apparent ground speed (for example, a threshold apparent ground speed may be higher than 120 km/h or higher than 150 km/h) is in flight and, in fact, nearer to the aircraft.
- the UAV 130 has moved the greatest distance across the field of view of the image sensor and is determined not to be moving as it would if it were an object moving on the ground 138 such as, for example, a car.
- the processor controlling the image sensor differentiates between objects on the ground that are moving as they should be, given knowledge of the altitude and ground velocity of the aircraft, and objects that are not moving as they should be. In the latter case, if it is clear that the objects are moving further/faster than a typical ground object (static or moving relatively slowly), the processor deduces that they could be UAVs.
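A minimal sketch of this apparent-ground-speed test, assuming the metres-per-pixel scale on the ground plane has already been derived from the sensor geometry and the aircraft's altitude; the 150 km/h threshold and the worked numbers are taken from the example above.

```python
def apparent_ground_speed_kmh(displacement_px: float, metres_per_pixel: float, dt_s: float) -> float:
    """Apparent ground speed of an object from its displacement between two frames dt_s apart."""
    return (displacement_px * metres_per_pixel / dt_s) * 3.6


def is_airborne_candidate(displacement_px: float, metres_per_pixel: float, dt_s: float,
                          threshold_kmh: float = 150.0) -> bool:
    """Flag objects moving across the ground plane faster than any plausible ground vehicle."""
    return apparent_ground_speed_kmh(displacement_px, metres_per_pixel, dt_s) > threshold_kmh


# Worked example from the text, assuming a hypothetical scale of 0.05 m per pixel:
# the car moves 36 px (1.8 m) in 0.2 s -> 32.4 km/h, below the threshold;
# the UAV moves 216 px (10.8 m) in 0.2 s -> 194.4 km/h, above it.
assert round(apparent_ground_speed_kmh(36.0, 0.05, 0.2), 1) == 32.4
assert not is_airborne_candidate(36.0, 0.05, 0.2)
assert is_airborne_candidate(216.0, 0.05, 0.2)
```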
- once the UAV candidate has been identified by two image sensors, it is determined to be a UAV
- triangulation can be performed (given a known spacing on the aircraft between the respective image sensors) to determine the altitude of the UAV, its distance from the aircraft and its velocity. The distance from the aircraft and velocity determine a threat level posed by the UAV.
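A simplified planar triangulation sketch, assuming each sensor reports a bearing to the candidate measured from the baseline joining the two sensors; the baseline and bearing values in the example are hypothetical, and a real implementation would work in three dimensions and account for sensor orientation and lens distortion.

```python
import math


def triangulate_range(baseline_m: float, bearing1_deg: float, bearing2_deg: float) -> float:
    """
    Planar triangulation: two sensors separated by baseline_m along the aircraft each measure
    the bearing to the same candidate (angles measured from the baseline). Returns the range
    from the first sensor, via the sine rule.
    """
    a1 = math.radians(bearing1_deg)
    a2 = math.radians(bearing2_deg)
    apex = math.pi - a1 - a2                     # angle subtended at the candidate
    if apex <= 0:
        raise ValueError("bearings do not converge")
    return baseline_m * math.sin(a2) / math.sin(apex)


# Example: sensors 30 m apart (hypothetical wing-tip spacing), bearings of 65° and 80°
# give a range of roughly 51.5 m from the first sensor.
print(round(triangulate_range(30.0, 65.0, 80.0), 1))
```

Velocity could then be estimated by differencing successive triangulated positions over the frame interval.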
- the image processing uses known object detection algorithms to identify moving objects and compare the objects to a library of known objects, including known UAVs.
- the image processing comprises a trained classifier to identify UAV candidates. The classifier may be trained to identify UAV candidates based on movement characteristics and/or based on pattern or image matching.
- FIG. 5 is an illustrative view of a scenario according to an example.
- the processor, or processors controlling the image sensors on aircraft 102 may connect to a wide area network of other systems.
- For example, an image sensor 162 mounted on a control tower 160 has a field of view 164 that includes the UAV.
- the image processor controlling the image sensor 162 determines that the UAV is present and generates a signal indicating it.
- There is also a second aircraft 170 with a plurality of image sensors mounted in positions similar to those on the aircraft 102.
- the processor controlling the image sensor 172 determines the presence of the UAV 130 and generates a signal indicating its presence.
- the image sensor is described to be mounted on the control tower 160 but may be mounted on different ground-based locations such as aircraft hangars, other buildings, masts, or ground based vehicles.
- the system controlling the plurality of image sensors sends out a signal to other systems that have image sensors with overlapping views of the external area of the aircraft 102. Where fields of view of other image sensors are controllable, those image sensors may adjust their respective field of view to view the UAV candidate.
- the other image sensors perform a similar process to that which has been described to verify that the UAV 130 is present in the area external to the aircraft 102. More broadly, a first sighting of a UAV candidate by any of the sensors (i.e. ground-based or aircraft-mounted) illustrated in Figure 5 may generate a signal that causes any other of the image sensors that has an overlapping field of view to confirm the UAV candidate as being a UAV. Such an arrangement provides added protection against UAVs that are spotted near to an airfield.
- Figure 6 is a process flow chart of a method 600 of determining whether there is an aerial vehicle in the area external to an aircraft, for example using the plurality of image sensors mounted on the aircraft 102 as described in relation to Figures 1 to 5.
- the method receives first image data from a first image sensor having a first field of view of the area external to the aircraft 102.
- the received first image data is processed to determine the presence of a UAV candidate.
- second image data is received from a second image sensor having a field of view that encompasses the part of the first field of view, i.e. the target area, containing the UAV candidate.
- the second image data is processed to determine if the UAV candidate is present.
- a signal is generated to indicate the presence of the UAV if it is determined that there is one in the first and second image data.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Networks & Wireless Communication (AREA)
- Aviation & Aerospace Engineering (AREA)
- Astronomy & Astrophysics (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
Abstract
A method of detecting the presence of an external aerial vehicle in the vicinity of an aircraft in flight is disclosed. The method comprises receiving image data representing a first image captured by a first aircraft-mounted image sensor having a first field of view and processing the image data to determine whether an external aerial vehicle candidate is present in a target space of the first captured image; and receiving image data representing a second image captured by a second aircraft-mounted image sensor having a second field of view, which encompasses the target space, and processing the image data to determine whether the external aerial vehicle candidate is present in the second captured image. Finally, the method comprises generating a signal indicating that an external aerial vehicle is determined to be present in the vicinity of the aircraft if the external aerial vehicle candidate is determined to be present in both the first captured image and in the second captured image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1915388.1A GB2588893A (en) | 2019-10-23 | 2019-10-23 | Aerial vehicle detection |
| GB1915388.1 | 2019-10-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021078663A1 (fr) | 2021-04-29 |
Family
ID=68728380
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2020/079308 Ceased WO2021078663A1 (fr) | Aerial vehicle detection | 2019-10-23 | 2020-10-19 |
Country Status (2)
| Country | Link |
|---|---|
| GB (1) | GB2588893A (fr) |
| WO (1) | WO2021078663A1 (fr) |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108447075B (zh) * | 2018-02-08 | 2020-06-23 | 烟台欣飞智能系统有限公司 | 一种无人机监测系统及其监测方法 |
- 2019-10-23: GB GB1915388.1A patent/GB2588893A/en not_active Withdrawn
- 2020-10-19: WO PCT/EP2020/079308 patent/WO2021078663A1/fr not_active Ceased
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5581250A (en) * | 1995-02-24 | 1996-12-03 | Khvilivitzky; Alexander | Visual collision avoidance system for unmanned aerial vehicles |
| US6804607B1 (en) * | 2001-04-17 | 2004-10-12 | Derek Wood | Collision avoidance system and method utilizing variable surveillance envelope |
| US20150302858A1 (en) * | 2014-04-22 | 2015-10-22 | Brian Hearing | Drone detection and classification methods and apparatus |
| EP3121763A1 (fr) * | 2015-07-24 | 2017-01-25 | Honeywell International Inc. | Système de pare-chocs helo utilisant une caméra pour la détection d'obstacle |
| US20190025858A1 (en) * | 2016-10-09 | 2019-01-24 | Airspace Systems, Inc. | Flight control using computer vision |
| EP3447436A1 (fr) * | 2017-08-25 | 2019-02-27 | Aurora Flight Sciences Corporation | Système d'interception de véhicule aérien |
| CN107831777A (zh) * | 2017-09-26 | 2018-03-23 | 中国科学院长春光学精密机械与物理研究所 | 一种飞行器自主避障系统、方法及飞行器 |
| CN108168706A (zh) * | 2017-12-12 | 2018-06-15 | 河南理工大学 | 一种监测低空无人飞行器的多谱红外成像检测跟踪系统 |
| WO2019163454A1 (fr) * | 2018-02-20 | 2019-08-29 | ソフトバンク株式会社 | Dispositif de traitement d'images, objet volant, et programme |
Non-Patent Citations (2)
| Title |
|---|
| GIANCARMINE FASANO ET AL: "Multi-sensor data fusion: A tool to enable UAS integration into civil airspace", DIGITAL AVIONICS SYSTEMS CONFERENCE (DASC), 2011 IEEE/AIAA 30TH, IEEE, 16 October 2011 (2011-10-16), pages 5C3 - 1, XP032069380, ISBN: 978-1-61284-797-9, DOI: 10.1109/DASC.2011.6096082 * |
| SCHUMANN ARNE ET AL: "Deep cross-domain flying object classification for robust UAV detection", 2017 14TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED VIDEO AND SIGNAL BASED SURVEILLANCE (AVSS), IEEE, 29 August 2017 (2017-08-29), pages 1 - 6, XP033233410, DOI: 10.1109/AVSS.2017.8078558 * |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR3149687A1 (fr) * | 2023-06-09 | 2024-12-13 | Thales | Procédé de détection d’obstacles mis en œuvre par un système de détection embarqué sur un aéronef et système de détection d’obstacles associé |
Also Published As
| Publication number | Publication date |
|---|---|
| GB201915388D0 (en) | 2019-12-04 |
| GB2588893A (en) | 2021-05-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12217181B2 (en) | Visual observer for unmanned aerial vehicles | |
| RU2692306C2 (ru) | Система сопровождения для беспилотных авиационных транспортных средств | |
| US20210358311A1 (en) | Automated system of air traffic control (atc) for at least one unmanned aerial vehicle (uav) | |
| KR20250009392A (ko) | 무인 비행체 항로 구축 방법 및 시스템 | |
| US10332409B2 (en) | Midair collision threat detection and assessment using visual information | |
| KR102752939B1 (ko) | 무인 비행체 항로 구축 방법 및 시스템 | |
| JP2023538589A (ja) | ハイジャック、電波妨害、およびなりすまし攻撃に対する耐性を伴う無人航空機 | |
| US11875691B2 (en) | Drone encroachment avoidance monitor | |
| AU2015309677B2 (en) | An aerial survey image capture system | |
| KR102290533B1 (ko) | 불법 비행 감지 및 대응을 위한 rtk-gps 연동 시스템 및 그 방법 | |
| RU2755603C2 (ru) | Система и способ обнаружения и противодействия беспилотным летательным аппаратам | |
| US20240248477A1 (en) | Multi-drone beyond visual line of sight (bvlos) operation | |
| RU2746090C2 (ru) | Система и способ защиты от беспилотных летательных аппаратов в воздушном пространстве населенного пункта | |
| US20210088652A1 (en) | Vehicular monitoring systems and methods for sensing external objects | |
| KR20190021875A (ko) | 소형 무인 비행체를 탐지하고, 무력화하기 위한 시스템 및 그 운용 방법 | |
| Zarandy et al. | A novel algorithm for distant aircraft detection | |
| Minwalla et al. | Experimental evaluation of PICAS: An electro-optical array for non-cooperative collision sensing on unmanned aircraft systems | |
| US20240096099A1 (en) | Intrusion determination device, intrusion detection system, intrusion determination method, and program storage medium | |
| Dolph et al. | Detection and tracking of aircraft from small unmanned aerial systems | |
| WO2021078663A1 (fr) | Détection de véhicule aérien | |
| JP7574935B2 (ja) | 不審機対処装置、不審機対処システム、不審機対処方法およびコンピュータプログラム | |
| KR101676485B1 (ko) | 이동통신기지국을 이용한 소형비행체 감시 서비스 제공 시스템 및 방법 | |
| McCalmont et al. | Detect and avoid technology demonstration | |
| US12436542B2 (en) | Anti-collision system for an aircraft and aircraft including the anti-collision system | |
| Ahn et al. | A preparatory research for UAM collision avoidance using ADS-B |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20799631; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20799631; Country of ref document: EP; Kind code of ref document: A1 |