WO2020225518A1 - Drone system and method of communication between drones
- Publication number
- WO2020225518A1 (PCT/GB2020/000044)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- schedule
- drone
- gestures
- vehicle
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
- G06F21/445—Program or device authentication by mutual authentication, e.g. between devices or programs
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Abstract
An apparatus and a method of communicating between drones are provided. Two drones are provided: a remote drone able to perform predetermined stored gestures, and a local drone having a sensor (e.g. a camera) to detect such gestures and determine whether the movement comprises the predetermined gestures. If the performed gestures match the stored gestures, the remote drone is authenticated. A drone may have both sets of functionality to enable a two-way handshake protocol. Once the handshake protocol has been completed, transmission of further data is also possible. Movement gestures may include pseudo-random movements such as North, South, West, East, Up, Down, and may have varying durations or delays between them. Since not all gestures will be visible to the receiving drone, a statistical test is applied to determine whether there is a sufficient match to the predetermined code. The advantage is that a drone may authenticate and potentially share data with another drone where transmissions are undesirable.
Description
DRONE SYSTEM AND METHOD OF COMMUNICATION BETWEEN DRONES
The present invention relates to the field of drone-to-drone communication for autonomous unmanned vehicles. Unmanned autonomous vehicles are also known as autonomous drones, or drones, and the terms are used interchangeably herein. Drones can be unmanned autonomous aerial vehicles (e.g. unmanned autonomous quad/multicopters, helicopters, airplanes, etc.), water vehicles (e.g. unmanned autonomous boats and submarines), and/or land vehicles (e.g. unmanned autonomous robotic rovers, etc.).
Drone-to-drone communication is possible using standard wireless communication protocols. Sometimes such communication methods are not possible due to interference or may have undesirable side effects. It is an object of the present invention to enable drones to communicate with each other in such circumstances.
Aspects of the present invention are set out in claims 1 and 7.
This provides for communication between the autonomous drones, by way of a handshake protocol in at least one direction, so that the observing/receiving drone receives information from the transmitting drone, as a minimum providing sufficient information to identify the transmitting drone as one that is moving according to the same movement schedule, which provides a strong measure of certainty that the remote drone is from the same organization as the receiving drone. This provides a way for a drone to identify and authenticate a remote drone without reliance on any transmissions, using only a camera and ambient light (which may include infrared light, especially at night). Preferably both drones are able to authenticate each other as described above, thus providing for a two-way handshake protocol and permitting further exchange of data in one or both directions by means of such movement gestures (especially if each drone is configured to change its transmission code or coding scheme to indicate that it has authenticated the other drone).
Whilst it is known for toy drones to detect and respond to human hand gestures, such as indicating with your hand the direction that a drone should fly, and it has been proposed in the literature for delivery drones to respond to human hand gestures, such as 'waving the drone away' or 'waving the drone down to deliver the parcel', it was not obvious that drones should communicate with each other using gestures, partly because there are so many other ways that drones can communicate which offer far greater data rates, and partly because drones don't have hands and don't issue gestures. Furthermore, public disclosures of drones which respond to human hand gestures are very clearly outside the field of drone-to-drone communication for autonomous drones for two reasons: firstly, these disclosures are specific to user-to-drone communication, and secondly, such drones which are directed by the human hand are, by definition, not autonomous.
Preferably the predefined movement gesture schedule comprises angular movement gestures about at least one axis. Preferably the predefined movement gesture schedule comprises a series comprising linear velocity movement gestures in at least two linear dimensions. These approaches help increase the chances that another drone can detect the gestures.
Preferably the drone is adapted to determine the time, wherein the predefined movement gesture schedule with respect to time is a substantially non-repeating predetermined gesture movement schedule with respect to time. A substantially non-repeating pattern is harder for an unauthentic drone to copy.
Preferably the drone is adapted to determine its location, wherein the predefined movement gesture schedule is generated via a pseudorandom number generated from a seed which in turn is generated at least in part by position. A pattern which is location dependent is harder for an unauthentic drone to copy, unless the unauthentic drone is at substantially the same position, which could usually be easily detected.
Generally, the gestures are small movements or rotations that are additional to, and generally of shorter duration than, the navigation maneuvers of the drone. Thus even if the drone is moving, the observing drone will be able to identify that the remote drone is moving according to the same movement schedule.
Since the drone will be observing from one direction, any movements in the direction of the line of sight may be undetectable, so the method of communication is less reliable if the gestures are limited to linear velocity gestures in one specific dimension. Preferably the gestures either comprise linear velocity gestures in two dimensions, or comprise angular velocity gestures about at least one axis, since this ensures that with adequate light and image quality at least some of the gestures can be detected, such that a correlation with the predetermined gesture sequence can generally be completed successfully.
Each degree of gestural motion (each axis in which linear gestures are provided, and each axis in which rotational gestures are provided) can be considered a degree of freedom, and preferably at least 3, more preferably at least 5, degrees of freedom are used so that a greater rate of data can be encoded in the transmission. Furthermore, preferably the speed and extent of the gestures in at least one (or preferably all) of the degrees of freedom are varied so as to allow a yet higher rate of data to be encoded.
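By way of illustration, the following arithmetic sketch (using assumed example counts of directions, magnitudes and durations, none of which are specified above) shows how the gesture alphabet, and hence the data carried per gesture, grows multiplicatively:

```python
# Illustrative only: alphabet size and bits per gesture for a hypothetical
# multicopter. The counts below are assumptions, not figures from the source.
import math

directions = 6   # e.g. N, S, E, W, Up, Down; rotational axes would add more
magnitudes = 2   # e.g. small / large perturbation
durations = 2    # e.g. short / long impulse

alphabet = directions * magnitudes * durations
print(f"{alphabet} distinct gestures = {math.log2(alphabet):.2f} bits per gesture")
# -> 24 distinct gestures = 4.58 bits per gesture
```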
Furthermore, to minimize the risk of the gesture sequence being copied by other autonomous drones and repeated later, preferably the sequence is a substantially non-repeating sequence, such as for example may be generated by any pseudo-random number generator (optionally sequentially based on an original seed number, or else repeatedly in a single step based on a seed number such as may be provided by or derived from the current time). As such, provided that each drone has a means to tell the time (e.g. a quartz oscillator clock, or a miniature atomic clock, and/or a GNSS receiver), the sequence can be generated in both drones, and the receiving drone is able to verify and authenticate the transmitting drone. The term substantially non-repeating here means that long parts of the sequence can repeat, but this will be rare, whilst very short parts of the sequence will of course repeat frequently.
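A minimal sketch of this shared-sequence generation follows; the shared organisational secret, the SHA-256 seed derivation and the 30-second time bucket are all assumptions chosen for illustration, not details prescribed above. Because the seed is derived deterministically, both drones produce the identical, substantially non-repeating schedule:

```python
# Sketch: both drones derive the same pseudo-random gesture sequence from a
# shared secret plus a coarse time bucket (assumed design choices).
import hashlib
import random

GESTURES = ["N", "S", "E", "W", "U", "D", "C", "A", "P"]  # alphabet used in the embodiment

def gesture_sequence(shared_secret: bytes, epoch_seconds: int, length: int = 20):
    bucket = epoch_seconds // 30            # quantise time so small clock offsets agree
    seed = hashlib.sha256(shared_secret + str(bucket).encode()).digest()
    rng = random.Random(seed)               # same seed -> same sequence on both drones
    return [rng.choice(GESTURES) for _ in range(length)]

# Two drones agreeing on the secret and the time generate identical schedules:
print(gesture_sequence(b"org-key", 1_700_000_000))
```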
Furthermore, to avoid the gesture sequence being copied in substantially real time by other autonomous drones, preferably both drones are adapted to determine the location of the transmitting drone, and the sequence is one that is generated pseudo-randomly at least in part based on the position of the transmitting drone or some derivative of that position. Accordingly it is not possible for the gesture sequence to be copied and relayed by an unauthentic drone (i.e. a drone belonging to a different organization), even in substantially real-time except if the unauthentic drone is at substantially the same location (and typically also there at the same time) as the authentic drone. This situation is not of concern provided that the granularity/accuracy of the location grid being used is sufficiently granular/accurate that in practice both the authentic and inauthentic drones would be visible to the receiving drone.
Preferably therefore, the receiving drone is adapted to detect that two remote drones are transmitting according to the same movement gesture schedule, and is adapted to detect a delay in the gesture schedule of one transmitting drone compared to the other (preceding) transmitting drone, and to authenticate only the preceding transmitting drone. Such a delay is likely to be at least a 50th of a second, and this can be determined by recording the motion with respect to time of both drones as a waveform and cross-correlating them to see which waveform precedes the other.
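A sketch of this test follows, assuming the receiving drone has recorded each candidate's motion as a uniformly sampled one-dimensional waveform; the names and sample rate are illustrative:

```python
# Cross-correlate two recorded motion waveforms to find which one leads.
import numpy as np

def lag_of_b_behind_a(a: np.ndarray, b: np.ndarray) -> int:
    """Sample lag at which b best matches a (positive => b trails a)."""
    a = (a - a.mean()) / (a.std() + 1e-12)   # normalise before correlating
    b = (b - b.mean()) / (b.std() + 1e-12)
    xcorr = np.correlate(b, a, mode="full")
    return int(np.argmax(xcorr) - (len(a) - 1))

# Example: drone B replays drone A's motion 5 samples (0.1 s at 50 fps) later.
t = np.linspace(0, 10, 500)
a = np.sin(2 * np.pi * 1.3 * t) * (np.random.rand(500) > 0.5)
b = np.roll(a, 5)
print(lag_of_b_behind_a(a, b))  # ~5: authenticate A, reject the delayed copy B
```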
The drones may be of any type but optionally are unmanned aerial vehicles, such as with one or more vertical axis rotors, optionally are unmanned land vehicles, and optionally are unmanned water vehicles such as boats.
A preferred embodiment of the present invention will now be described with reference to the figures in which:
Figure 1 is a diagram illustrating drone communication according to the prior art; and
Figure 2 is a diagram illustrating an embodiment of the present invention.
Turning to figure 1, which represents a prior art approach, two drones 1a', 1b' are shown, communicating 2 via a common wireless communication standard (WiFi™). This approach would enable communication in many situations, and different standards could be adopted for longer distance communication.
Turning to figure 2, which represents one embodiment of the present invention, two drones 1a, 1b are shown. The transmitting drone 1a exhibits a series of movements (arrows) which are gestures, and the receiving drone 1b has a camera aimed at the transmitting drone. Here the camera has two lenses and sensors (not shown), arranged to gather image data in visible light during the day, and in far infra-red radiation at night. The receiving drone 1b is adapted to detect drones and to point its camera 3 at them, which may involve rotating the camera and/or may involve rotating the whole drone. The detection can be by various methods known in the art, such as radar detection using an on-board radar, heat signature detection using the infra-red camera 3, acoustic detection based on the noise a drone produces using a microphone array, or image-based object recognition using the camera; or the drone might be informed of the location of another drone by radio link communication from a remote transmission control centre or even from yet another drone.
Both drones have a computer processor on board, with a data storage medium containing a movement gesture code sequence 4. The example shown in figure 2 has a variety of coded movement gestures, such as to briefly accelerate slightly as follows: North (N), South (S), West (W), East (E), Up (U), Down (D), Clockwise (C), Anticlockwise (A), or to Pause (P). Whilst these directions are absolute ones with respect to compass directions, it is also possible to use movements relative to a navigation direction whilst moving, or indeed to an arbitrary direction, such as Left (L), Right (R) and so on. Additionally, the number of degrees of freedom preferably is increased by coding for movements of different magnitude and movements of different duration. Whilst a quad/multicopter is an example of a drone that has the ability to express movement gestures with at least four degrees of freedom (three linear axes, and a vertical rotation axis) as well as variations in magnitude and duration, other types of drone may only be able to express gestures with fewer degrees of freedom; for example, an unmanned boat and some types of unmanned land rover are in practice limited to turning left and right whilst travelling (albeit also able to vary the magnitude and duration of such movements), and an unmanned airplane is between these two extremes in the number of degrees of freedom with which it could express movement gestures. A range of gestures should be selected that is appropriate for the type of drone.
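By way of illustration only (the groupings below are assumptions drawn from the discussion above, not a table from the embodiment), such a selection might be expressed as a simple mapping from drone type to available gesture alphabet:

```python
# Hypothetical mapping of drone type to the gestures it can physically express.
GESTURE_SETS = {
    # multicopter: three linear axes plus yaw, with varied magnitude/duration
    "multicopter": ["N", "S", "E", "W", "U", "D", "C", "A", "P"],
    # boat or simple rover: in practice limited to turning whilst travelling
    "boat": ["L", "R", "P"],
    "rover": ["L", "R", "P"],
    # fixed-wing airplane: between the two extremes
    "airplane": ["L", "R", "U", "D", "P"],
}

def alphabet_for(drone_type: str) -> list[str]:
    return GESTURE_SETS[drone_type]
```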
Both drones may hold a record of the predetermined gesture sequence as a code 4, but in this example they generate the code 4 themselves. Each drone has on board a clock (e.g. GNSS receiver, optionally backed up with a quartz or miniature atomic clock, not shown) and a location detection device (e.g. GNSS receiver, optionally backed up with an additional location detector such as an optical terrain tracker and/or inertial navigation device, not shown). The receiving drone 1b is adapted to determine the location of the transmitting drone, by determining the direction/distance vector to the transmitting drone using its camera 3 and adding this to its own location.
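A minimal sketch of this position step follows, assuming the camera pipeline supplies a bearing, an elevation and a range, and that positions are expressed in a local east-north-up frame (an assumed convention, not stated above):

```python
# Receiver position + camera-derived direction/distance vector = transmitter position.
import math

def estimate_transmitter_pos(own_enu, bearing_rad, elevation_rad, range_m):
    """own_enu: (east, north, up) of the receiver; angles/range from the camera."""
    east = own_enu[0] + range_m * math.cos(elevation_rad) * math.sin(bearing_rad)
    north = own_enu[1] + range_m * math.cos(elevation_rad) * math.cos(bearing_rad)
    up = own_enu[2] + range_m * math.sin(elevation_rad)
    return (east, north, up)

# Receiver at 50 m altitude sees the transmitter due east, 10 degrees below, 120 m away:
print(estimate_transmitter_pos((0.0, 0.0, 50.0), math.radians(90), math.radians(-10), 120.0))
```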
Both drones combine the clock output and the location output into a predetermined format (for example the two character sequences may be simply concatenated, or any other operation applied to generate a code that is dependent on both outputs). The resulting output is repeatedly passed as a seed number into a pseudorandom number generator, in order to generate a random output; for example this can be done twice a second. The random output is used to randomly choose a sequence of gestures 4, such as North, South, Up, Down, West, East, Wait etc. (or as appropriate for the drone in question, alternatively just Left, Right, Fast, Slow, Wait etc.). Note that a gesture such as 'Left' does not change the overall navigation direction, but rather is a time-limited impulse in that direction which is then cancelled out. Whilst described as a velocity gesture, in practice the linear direction velocity gestures will typically comprise an increase in velocity followed by a matching decrease, so as to cause only a small perturbation in the navigation path of the drone.
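The impulse-and-cancel shape might be generated as in the following sketch, in which the peak speed, duration and time step are illustrative values; the profile is superimposed on the navigation velocity and sums to zero, so the gesture leaves no net velocity change:

```python
# Turn a gesture symbol into a brief velocity offset followed by its reverse.
import numpy as np

UNIT = {"N": (0, 1, 0), "S": (0, -1, 0), "E": (1, 0, 0),
        "W": (-1, 0, 0), "U": (0, 0, 1), "D": (0, 0, -1)}

def gesture_velocity_profile(symbol, peak_mps=0.4, duration_s=0.4, dt=0.02):
    """Velocity offsets (n_steps x 3) to superimpose on the navigation velocity."""
    n = int(duration_s / dt)
    half = n // 2
    shape = np.concatenate([np.full(half, 1.0), np.full(n - half, -1.0)])  # push, then cancel
    return peak_mps * shape[:, None] * np.asarray(UNIT[symbol], dtype=float)

profile = gesture_velocity_profile("U")
print(profile.sum(axis=0))  # [0. 0. 0.]: no net velocity change after the gesture
```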
The transmitting drone 1a may be travelling or may be stationary, and irrespective of any travelling path the transmitting drone expresses the movement gestures as it does so. The gestures should be small, minor adjustments to the navigation path, so as to use minimal energy, to the extent that they may only be noticeable to a particularly observant observer.
The receiving drone observes the transmitting drone by means of the camera 3, and identifies perturbations in the path or position and/or orientation of the transmitting drone that are of a timescale relevant to the gesture sequence with respect to time. Semi-permanent changes in direction, for example those that last for much longer than the gestures in the sequence, can be ignored.
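One simple way to realise this separation (an assumed approach, the description does not mandate a particular filter) is to subtract a moving average whose window is much longer than any gesture, so that the slow navigation component drops out and only the short-timescale residual remains:

```python
# Isolate gesture-timescale perturbations from a camera-derived position track.
import numpy as np

def gesture_band(position: np.ndarray, fps: float = 25.0, window_s: float = 3.0):
    """position: per-frame 1-D coordinate of the tracked drone; returns residual."""
    w = max(1, int(window_s * fps))
    kernel = np.ones(w) / w
    trend = np.convolve(position, kernel, mode="same")  # slow navigation component
    return position - trend                             # fast gesture component
```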
From the viewing direction, not all of the gestures will necessarily be visible, and many of the gestures may be distorted by the viewing direction: for example, when looking from ahead and 20 degrees downwards, the forward and backward motions may be undetectable, up and down motions are somewhat suppressed and may be confused with forward and backward motions, and left and right motions are reversed from the perspective of the viewer.
The movements are detected using a conventional object detection and motion detection algorithm, as applied to the camera data, and are recorded with respect to time into a sequence. If the orientation of the transmitting drone is known (for example because it is based on its direction of travel, or perhaps by means of a modification to the code wherein 'wait' always and only follows 'left'), then the sequence is corrected for direction (so left becomes north, and so on, as appropriate). Alternatively, if the transmitting drone's direction is not known, then either all directions should be tested, or the sequences should be multiplied as multidimensional vectors so as to give a consistent result if they all/mostly match barring an error in orientation.
The detected sequence is compared with the generated sequence to see if they match. This can be done as Boolean matching, counting how many gestures match, or by multiplying the gestures as vectors and adding all the results together to determine the magnitude of the match. Since some gestures may not be visible to the receiving drone, a complete match should not be expected. However, at its simplest, a threshold proportion of correct matches and a minimum sample size should be set, above which the transmitting drone is identified as authenticated; otherwise the transmitting drone is not authenticated.
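At its simplest, that test might look like the following sketch, in which unobserved gestures are recorded as None and skipped, and the threshold proportion and minimum sample size are illustrative values:

```python
# Boolean matching with a minimum sample size and a threshold proportion.
def authenticate(observed, generated, min_samples=12, threshold=0.8):
    pairs = [(o, g) for o, g in zip(observed, generated) if o is not None]
    if len(pairs) < min_samples:
        return False                       # not enough visible gestures yet
    matches = sum(o == g for o, g in pairs)
    return matches / len(pairs) >= threshold

obs = ["N", None, "U", None, "D", "W", "P", "C", None, "A", "N", "S", "E", "U", "C"]
gen = ["N", "S", "U", "D", "D", "W", "P", "C", "A", "A", "N", "S", "E", "U", "C"]
print(authenticate(obs, gen))  # True: all 12 visible gestures match
```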
Clearly, more sophisticated statistical tests are useful; however, these are beyond the scope of this discussion, because the statistical threshold will depend on the number of degrees of freedom of motion (which relates to the number of possible gestures), as well as on the quality of the camera and image processing software, and any environmental conditions which should be tolerated, such as rain, and because basic statistical tests can easily be decided through simple trial and error. Accordingly the invention enables authentication of a transmitting drone, which at its simplest is the detection of whether a drone is an authentic drone of the same origin as the receiving drone. However it is also possible to transmit additional data.
As an example, a drone may be equipped to function both as a transmitting drone and as a receiving drone (i.e. it has the features of a receiving drone and comprises the features of a transmitting drone). Indeed both drones may be equipped to both transmit and receive. In this situation each drone may transmit continuously according to a first code. Drone 1 may authenticate drone 2, and drone 2 may authenticate drone 1. This can happen at the same time or in either order.
A drone may add additional data to its transmission by, for example, adding a small perturbation to the timing of its gestures (adding a fractional delay to some gestures but not to others), and may orient itself to face a drone it is transmitting to and not apply any perturbations to any gestures which that drone will be unable to see (movements directly towards/away from that receiving drone). The perturbation in timing is small enough not to interfere with the process of authentication. Once a drone has authenticated another drone, it enters a mode in which it additionally detects and records such perturbations in the timing of such visible gestures. The transmitting drone may encrypt a message before encoding it in such perturbations, for example using a shared key or a more complex scheme, and the receiving drone decrypts the perturbations to decode the message. More complex arrangements are also possible, for example changing to a new code to indicate that it has authenticated a drone, whilst the second code still permits authentication; and then once both drones have authenticated each other they can switch to encoding data in the gestures rather than the handshake protocol.
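A sketch of one possible timing-perturbation encoding follows; the base gesture period, the delay magnitude and the decoding rule are all assumptions, since the description deliberately leaves the exact scheme open:

```python
# Carry data bits as small fractional delays added to visible gestures.
BASE_PERIOD_S = 0.5    # nominal gesture spacing (example value)
BIT_DELAY_S = 0.05     # extra delay carrying a '1' bit (example value)

def gesture_times(bits, start_s=0.0):
    """Transmitter: emit the time of each gesture, delaying those carrying a 1."""
    times, t = [], start_s
    for bit in bits:
        t += BASE_PERIOD_S
        times.append(t + (BIT_DELAY_S if bit else 0.0))
    return times

def decode_bits(times, start_s=0.0):
    """Receiver: recover bits from measured deviations from the nominal timing."""
    return [1 if (t - (start_s + (i + 1) * BASE_PERIOD_S)) > BIT_DELAY_S / 2 else 0
            for i, t in enumerate(times)]

assert decode_bits(gesture_times([1, 0, 1, 1, 0])) == [1, 0, 1, 1, 0]
```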
More generally speaking, a drone moves or applies perturbations to its navigation path, in accordance with a predetermined movement gesture schedule, thereby transmitting a code. Another drone has a visible-light and/or infra-red camera with which it detects such perturbations or movements, and checks whether they match the predetermined gesture schedule, thereby authenticating the first drone. A drone may have both sets of functionality to enable a two-way handshake protocol.
Movement gestures may include pseudo-random movements such as North, South, West, East, Up, Down, and may have varying durations or delays between them. Since not all gestures will be visible to the receiving drone, a statistical test is applied to determine whether there is a sufficient match to the predetermined code.
The code may be generated repeatedly using a pseudo-random number generator which may be seeded using the current time and/or the current location of the transmitting drone. Once the handshake protocol has been completed, transmission of further data is also possible. The advantage is that a drone may authenticate and potentially share data with another drone, even in situations where radio or free-space optics transmissions would be subject to interference or might cause adverse consequences.
An apparatus and a method of communicating between drones are provided. Two drones are provided: a remote drone able to perform predetermined stored gestures, and a local drone having a sensor to detect such gestures (generally a passive sensor, such as a camera) and determine whether the movement comprises the predetermined gestures. If the performed gestures match the stored gestures, the remote drone is authenticated.
According to a 2nd aspect of the invention there is provided an autonomous vehicle comprising: Motion actuators for effecting vehicular movement; Data storage medium comprising a predefined movement gesture schedule with respect to time; A computer processor adapted
to determine a navigation plan, and adapted to control the motion actuators to: Autonomously drive the vehicle according to the navigation plan; and Additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule. According to a 3rd aspect of the invention there is provided an autonomous vehicle comprising: Motion actuators for effecting vehicular movement; Data storage medium comprising a predefined movement gesture schedule with respect to time; A sensor adapted to generate image data; A computer processor adapted to determine a navigation plan, and adapted to control the motion actuators to: Autonomously drive the vehicle according to the navigation plan; and Process the image data to identify a series of variations in another autonomous vehicle's linear and/or angular velocity; and Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and If the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated. According to a 4th aspect of the invention there is provided an autonomous vehicle comprising:
Motion actuators for effecting vehicular movement; Data storage medium comprising at least one predefined movement gesture schedule with respect to time; A camera adapted to generate image data; A computer processor adapted to determine a navigation plan, and adapted to control the motion actuators to: Autonomously drive the vehicle according to the navigation plan; and Additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with one of the at least one predefined movement schedule; Process the image data to identify a series of variations in another autonomous vehicle's linear and/or angular velocity, and
Correlate the series of variations with at least one of the at least one predefined movement gesture schedule, to output a measure of correlation, and if the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated.
According to a 5th aspect of the invention there is provided a computer implemented method of communicating from a local autonomous transmitting vehicle to a remote autonomous receiving vehicle, comprising the step of controlling a computer processor of the local transmitting vehicle to: Control motion actuators of the transmitting vehicle for effecting vehicular movement; Read from a data storage medium of the transmitting vehicle,
comprising a predefined movement gesture schedule with respect to time; Determine a navigation plan; Control the motion actuators to drive the transmitting vehicle according to the navigation plan; Control the motion actuators to effect movements in the transmitting vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule.
According to a 6th aspect of the invention there is provided a computer implemented method of interpreting communications from a remote autonomous transmitting vehicle to a local autonomous receiving vehicle, comprising the step of controlling a computer processor of the local receiving vehicle to: Read from a data storage medium of the receiving vehicle, comprising a predefined movement gesture schedule with respect to time; Read data from a camera of the local autonomous receiving vehicle; Determine a navigation plan; Control the motion actuators to control the motion of the local receiving vehicle according to the navigation plan; Process the image data from the camera, to identify a series of variations in another autonomous vehicle's linear and/or angular velocity; Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and if the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated.
The term 'match' means that the similarity is sufficient to enable the authentication. Authentication does not require total certainty, but may instead require a desired level of certainty. Inclusion of any differences between the two sequences would mean that a longer sequence may be required to achieve authentication, and whilst this could be implemented and would work, it is fairly pointless unless it is desired to convey additional information during the authentication process (such as to identify the drone or to signal a warning), rather than at the end of it. The point, however, is that the sequences must match sufficiently to enable authentication to a required level of confidence, and if they do then they are considered to match.
At its simplest, for a use case which requires low confidence, the sequence could be a simple one, but preferably it is a random or pseudorandom string of gestures and/or has random or pseudorandom variations in timing. Use of sequences containing at least 10 different gestures, and/or having varied timing, is preferable to reduce the authentication time. The gestures are distinct from navigation maneuvers, do not significantly detract from the navigation path, and will have shorter durations or intervals between them compared to the navigation maneuvers. For example, gestures typically are performed at a rate of at least one per second, whereas navigation maneuvers, such as changing direction, typically are many seconds or often minutes apart when travelling to a destination. The navigation maneuvers are preferably navigation maneuvers in transit to a destination. Generally at least a segment of the navigation maneuvers have at least five times the duration or interval time between them compared to the gestures during that segment. Preferably the gestures are of a type that enables them to be distinguished from navigation maneuvers. For example a 10cm translation in any direction (e.g. one perpendicular to a direction of travel, if any) or a 10 degree rotation, followed by the reverse motion, executed in less than 0.5 seconds, can readily be distinguished from typical navigation maneuvers. Preferably the navigation maneuvers are substantially controlled so as to be distinguishable from the gestures (and preferably vice versa); for example a change of direction of travel maintained for several seconds or more will typically be evidently not a gesture. However, authentication could involve slower gestures, at the expense of authentication speed, travel efficiency, and potentially also authentication reliability, if this were desired. Generally the maneuvers need to be differentiable from the gestures by the remote drone; however, if at times the maneuvers are performed at a rate and of a nature that causes confusion, then this reduces the reliability of authentication during that period of time. It may not be essential to always be able to perform authentication reliably, and it may be acceptable for authentication to be possible only most of the time.
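A toy discriminator along these lines, using the illustrative figures from the passage above (sub-0.5-second gestures, maneuvers at five or more times that timescale), might be:

```python
# Classify an observed motion segment by its duration alone (illustrative thresholds).
def classify_segment(duration_s: float) -> str:
    if duration_s < 0.5:
        return "gesture"        # e.g. a 10 cm out-and-back executed in < 0.5 s
    if duration_s >= 2.5:       # at least five times the gesture timescale
        return "maneuver"       # e.g. a sustained change of direction of travel
    return "ambiguous"          # reduces authentication reliability in this band
```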
Claims
1. A method of communicating between two autonomous vehicles, comprising the steps of:
Providing a first autonomous vehicle and a second autonomous vehicle, both vehicles comprising:
Motion actuators for effecting vehicular movement;
Data storage medium comprising a predefined movement gesture schedule with respect to time;
A computer processor adapted to determine a navigation plan, and adapted to control the motion actuators to autonomously drive the vehicle according to the navigation plan;
The first autonomous vehicle comprises a computer processor adapted to
additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule. The second autonomous vehicle comprises:
A sensor adapted to generate image data;
A computer processor adapted to:
Process the image data to identify a series of variations in the other autonomous vehicle's linear and/or angular velocity;
Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and
If the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated; and
Wherein the predetermined movement gesture schedule with respect to time of the first vehicle matches that of the second vehicle.
2. Method of claim 1 wherein the predefined movement gesture schedule comprises angular movement gestures about at least one axis.
3. Method of claim 1 or 2 wherein the predefined movement gesture schedule
comprises a series comprising linear velocity movement gestures in at least two linear dimensions.
4. Method of claim 1, 2 or 3 wherein each of the first and second autonomous vehicles is adapted to determine the time, wherein the predefined movement gesture schedule with respect to time, is a substantially non-repeating predetermined gesture schedule movement with respect to time.
5. Method of claim 1, 2, 3 or 4 wherein the second autonomous vehicle comprises a computer processor adapted to additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule.
6. Method of claim 1, 2, 3, 4 or 5 wherein the first autonomous vehicle comprises:
A sensor adapted to generate image data;
A computer processor adapted to:
Process the image data to identify a series of variations in the other autonomous vehicle's linear and/or angular velocity, and
Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and
If the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated.
7. Apparatus comprising first and second autonomous vehicles:
Both autonomous vehicles comprising:
Motion actuators for effecting vehicular movement;
Data storage medium comprising a predefined movement gesture schedule with respect to time;
A computer processor adapted to determine a navigation plan, and adapted to control the motion actuators to autonomously drive the vehicle according to the navigation plan;
The first autonomous vehicle comprising a computer processor adapted to additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule; The second autonomous vehicle comprising:
A sensor adapted to generate image data; and
A computer processor adapted to:
Process the image data to identify a series of variations in another autonomous vehicle's linear and/or angular velocity;
Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and
If the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated; and
Wherein the predetermined movement gesture schedule with respect to time of the first vehicle matches that of the second vehicle.
8. Apparatus of claim 7 wherein the predefined movement gesture schedule comprises angular movement gestures about at least one axis.
9. Apparatus of claim 7 or 8, wherein the predefined movement gesture schedule comprises a series comprising linear velocity movement gestures in at least two linear dimensions.
10. Apparatus of claim 7, 8 or 9, wherein each of the first and second autonomous
vehicles is adapted to determine the time, wherein the predefined movement gesture schedule with respect to time, is a substantially non-repeating
predetermined gesture schedule movement with respect to time.
11. Apparatus of claim 7, 8, 9 or 10, wherein the second autonomous vehicle comprises a computer processor adapted to additionally effect movements in the vehicle's linear and/or angular velocity, so as to exhibit movement gestures in accordance with the predefined movement schedule.
12. Apparatus of claim 7, 8, 9, 10, or 11, wherein the first autonomous vehicle
comprises:
A sensor adapted to generate image data;
A computer processor adapted to:
Process the image data to identify a series of variations in the other autonomous vehicle's linear and/or angular velocity; and
Correlate the series of variations with the predefined movement gesture schedule to output a measure of correlation; and
If the measure of correlation exceeds a preset threshold, determine the other vehicle as authenticated.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GBGB1906239.7A GB201906239D0 (en) | 2019-05-03 | 2019-05-03 | Drone adapted to communicate with another drone and computer implemented method of communication |
| GB1906239.7 | 2019-05-03 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020225518A1 true WO2020225518A1 (en) | 2020-11-12 |
Family
ID=67384870
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/GB2020/000044 WO2020225518A1 (en), Ceased | Drone system and method of communication between drones | 2019-05-03 | 2020-04-28 |
Country Status (2)
| Country | Link |
|---|---|
| GB (2) | GB201906239D0 (en) |
| WO (1) | WO2020225518A1 (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160280370A1 (en) * | 2015-03-27 | 2016-09-29 | Amazon Technologies, Inc. | Influencing acceptance of messages in unmanned vehicles |
| US20160293018A1 (en) * | 2015-04-01 | 2016-10-06 | Korea University Research And Business Foundation | Method of controlling fleet of drones |
| US20170092138A1 (en) * | 2015-09-30 | 2017-03-30 | Alarm.Com Incorporated | Drone detection systems |
| WO2017161386A1 (en) * | 2016-03-17 | 2017-09-21 | Airspace Systems Inc. | System and method for aerial system discrimination and action |
| KR20170122948A (en) * | 2016-04-28 | 2017-11-07 | 고려대학교 세종산학협력단 | Method for verifying status of drone |
| US20180281946A1 (en) * | 2017-03-31 | 2018-10-04 | T-Mobile U.S.A., Inc. | Authorizing drone access to fulfillment centers |
| US20190044609A1 (en) * | 2017-08-04 | 2019-02-07 | Walmart Apollo, Llc | Systems, devices, and methods for relaying communications using autonomous drones |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10969777B2 (en) * | 2017-06-23 | 2021-04-06 | Qualcomm Incorporated | Local drone identification verification |
- 2019-05-03: priority application GBGB1906239.7A filed in GB (GB201906239D0), status Ceased
- 2020-04-28: PCT application PCT/GB2020/000044 filed (WO2020225518A1), status Ceased
- 2020-04-30: GB application GB2006358.2A filed (GB2586303B), status Active
Also Published As
| Publication number | Publication date |
|---|---|
| GB202006358D0 (en) | 2020-06-17 |
| GB2586303B (en) | 2022-02-16 |
| GB201906239D0 (en) | 2019-06-19 |
| GB2586303A (en) | 2021-02-17 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20726890; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20726890; Country of ref document: EP; Kind code of ref document: A1 |