
WO2021038667A1 - Unmanned aerial vehicle control device, unmanned aerial vehicle control system, and unmanned aerial vehicle control method - Google Patents


Info

Publication number
WO2021038667A1
WO2021038667A1 (application PCT/JP2019/033176)
Authority
WO
WIPO (PCT)
Prior art keywords
landing
position information
information
target
terminal
Prior art date
2019-08-23
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/033176
Other languages
English (en)
Japanese (ja)
Inventor
賢次 小関
好司 岸田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Trajectory Ltd
Original Assignee
Trajectory Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2019-08-23
Filing date: 2019-08-23
Publication date: 2021-03-04
Application filed by Trajectory Ltd
Priority to PCT/JP2019/033176
Publication of WO2021038667A1
Legal status: Ceased

Classifications

    • B64U40/00 — On-board mechanical arrangements for adjusting control surfaces or rotors; on-board mechanical arrangements for in-flight adjustment of the base configuration
    • B64C13/18 — Initiating means actuated automatically, e.g. responsive to gust detectors, using automatic pilot (under B64C13/00, control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers)
    • B64U70/00 — Launching, take-off or landing arrangements
    • G05D1/10 — Simultaneous control of position or course in three dimensions (under G05D1/00, control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots)
    • G08G5/00 — Traffic control systems for aircraft
    • B64U10/13 — Flying platforms (under B64U10/00, type of UAV; B64U10/10, rotorcrafts)
    • B64U2101/10 — UAVs specially adapted for particular uses or applications, for generating power to be supplied to a remote station, e.g. UAVs with solar panels
    • B64U2101/30 — UAVs specially adapted for particular uses or applications, for imaging, photography or videography
    • B64U2201/10 — UAVs characterised by their flight controls: autonomous, i.e. navigating independently from ground or air stations, e.g. using inertial navigation systems [INS]
    • B64U80/84 — Transport or storage specially adapted for UAVs, by waterborne vehicles

Definitions

  • The present disclosure relates to an unmanned aerial vehicle control device, an unmanned aerial vehicle control system, and an unmanned aerial vehicle control method.
  • In recent years, unmanned aerial vehicles (UAVs) have come into use for transporting goods. For example, Patent Document 1 discloses a method for autonomously flying a UAV and landing it on a landing platform provided on the roof of an automobile.
  • However, the moving object to which goods are to be transported (that is, the landing target) is not always equipped with a landing platform such as the one disclosed in the above document. This is particularly so when the landing target is, for example, a specific person.
  • The present disclosure has been made in view of the above problems, and its purpose is to provide a control device, a control system, and a control method for an unmanned aerial vehicle that allow goods to be transported easily by UAV even to landing targets that are not equipped with a specific platform.
  • According to one aspect of the present disclosure, there is provided an unmanned aerial vehicle control device including: a first position information acquisition unit that acquires position information of a terminal possessed by a landing target, the position information being obtained by a position information sensor of the terminal; a second position information acquisition unit that acquires position information of a flying object; a landing target position prediction unit that predicts the position of the landing target at a time point after a reference time, based on the history of the position information of the terminal before the reference time; and a landing position determination unit that determines a position at which to land the flying object with respect to the landing target, based on information relating to the predicted position of the landing target at the time point after the reference time and on the position information of the flying object.
  • According to another aspect of the present disclosure, there is provided an unmanned aerial vehicle control system including the same first position information acquisition unit, second position information acquisition unit, landing target position prediction unit, and landing position determination unit.
  • According to yet another aspect of the present disclosure, there is provided an unmanned aerial vehicle control method executed by a processor, including steps of acquiring the position information of the terminal and of the flying object, predicting the position of the landing target at a time point after the reference time, and determining a position at which to land the flying object with respect to the landing target, based on information relating to the predicted position and on the position information of the flying object.
  • The embodiments of the present disclosure have the following configurations.
  • (Item 1) An unmanned aerial vehicle control device including: a first position information acquisition unit that acquires position information of a terminal of a landing target, obtained by a position information sensor of the terminal; a second position information acquisition unit that acquires position information of a flying object; a landing target position prediction unit that predicts the position of the landing target at a time point after a reference time, based on the history of the position information of the terminal before the reference time; and a landing position determination unit that determines a position for landing the flying object with respect to the landing target, based on information relating to the predicted position of the landing target at the time point after the reference time and on the position information of the flying object.
  • (Item 2) The control device for an unmanned aerial vehicle according to item 1, wherein the position information sensor included in the terminal is a GPS sensor and/or a GNSS sensor.
  • (Item 3) The control device for an unmanned aerial vehicle according to item 1 or 2, wherein the landing target position prediction unit predicts the position of the landing target based on information about the environment around the landing target.
  • (Item 4) The control device for an unmanned aerial vehicle according to any one of items 1 to 3, wherein the landing target position prediction unit predicts the position of the landing target based on traffic information related to the movement of the landing target.
  • (Item 5) The control device for an unmanned aerial vehicle according to any one of items 1 to 4, wherein the landing position determination unit determines the position for landing the flying object with respect to the landing target so as to select a route that optimizes a flight target of the flying object.
  • (Item 6) The control device for an unmanned aerial vehicle according to item 5, wherein the landing position determination unit selects the route that optimizes the flight target based on operating conditions of the flying object.
  • (Item 7) The control device for an unmanned aerial vehicle according to item 5 or 6, wherein the optimization of the flight target includes minimizing the flight time of the flying object.
  • (Item 8) The control device for an unmanned aerial vehicle according to any one of items 5 to 7, wherein the optimization of the flight target includes minimizing the flight distance of the flying object.
  • (Item 9) The control device for an unmanned aerial vehicle according to any one of items 5 to 8, wherein the optimization of the flight target includes minimizing the energy utilization required for the flight of the flying object.
  • (Item 10) The control device for an unmanned aerial vehicle according to any one of items 1 to 9, wherein the terminal has a sensor that estimates the orientation of the terminal in the horizontal direction.
  • (Item 11) The control device for an unmanned aerial vehicle according to item 10, wherein the landing position determination unit determines, as the landing position of the flying object, a position at a predetermined distance from the position of the terminal in the horizontal direction of the terminal obtained by the sensor.
  • (Item 12) The control device for an unmanned aerial vehicle according to any one of items 1 to 11, wherein the terminal has an inertial sensor, and the landing target position prediction unit predicts the position of the landing target based on information related to the acceleration of the terminal obtained by the inertial sensor.
  • (Item 13) The control device for an unmanned aerial vehicle according to any one of items 1 to 12, wherein the terminal has a light emitter, the flying object has an imaging device, and the landing position determination unit determines the landing position of the flying object based on the light emitted by the light emitter and imaged by the imaging device of the flying object.
  • (Item 14) An unmanned aerial vehicle control system including the first position information acquisition unit, the second position information acquisition unit, the landing target position prediction unit, and the landing position determination unit described in item 1.
  • (Item 15) An unmanned aerial vehicle control method executed by a processor, including steps of acquiring the position information of the terminal and of the flying object, predicting the position of the landing target at a time point after the reference time, and determining a position for landing the flying object with respect to the landing target.
  • FIG. 1 is a diagram showing an overview of a control system 1 for an unmanned aerial vehicle according to the first embodiment of the present disclosure.
  • As shown in FIG. 1, the control system 1 is composed of a control device 10 (an example of an unmanned aerial vehicle control device), a mobile terminal 20 (an example of a terminal), and a UAV 30 (an example of a flying object).
  • In the present embodiment, the mobile terminal 20 is carried on a ship 40 (an example of a moving body).
  • The mobile terminal 20 can be a terminal such as a smartphone, smartwatch, or tablet possessed by a crew member on board the ship 40. The mobile terminal 20 is not limited to such examples, and may be, for example, a personal computer installed in the ship 40.
  • The ship 40 is one example of a moving body; the moving body may also be, for example, an automobile (including a motorcycle or the like), a bus, a truck, a work vehicle, or a manned vehicle. That is, the moving body may be any moving body other than the UAV 30 itself.
  • A moving body is in turn one example of a landing target. The landing target may also be, for example, a person or an animal possessing the mobile terminal. That is, the UAV 30 may land not only on the landing target itself but also in the vicinity of the landing target. For example, when goods are delivered to a person, the position information of the mobile terminal 20 possessed by that person can be used to determine the landing position of the UAV 30 with respect to that person.
  • The terminal may also be provided as a body separate from the landing target.
  • The control device 10 connects to the mobile terminal 20 and the UAV 30 via a network such as the Internet. The network may include local area networks (LAN), wide area networks (WAN), infrared, wireless, WiFi, point-to-point (P2P) networks, telecommunications networks, cloud communications, and the like.
  • The UAV 30 is, for example, an unmanned aerial vehicle (for example, a drone) that can fly autonomously or be remotely controlled by the control device 10.
  • Predetermined information is transmitted from the UAV 30 to the control device 10 automatically or in response to a request from the control device 10; for example, the flight position information of the UAV 30 (latitude, longitude, altitude, etc.), its flight path, battery usage and remaining amount, flight speed, flight time, acceleration, tilt, and the operating status of other devices.
  • In the present embodiment, the control device 10 acquires the position information of the mobile terminal 20 carried on the ship 40. Because the mobile terminal 20 is on board, its position can be regarded as the position of the ship 40.
  • The control device 10 continuously acquires the position information of the mobile terminal 20 to obtain a history of that position information. The history of such position information corresponds to the route R2 that the ship 40 has traveled before reaching a reference time.
  • The control device 10 predicts the position of the ship 40 at time points after the reference time (that is, the route R1) based on the route R2 up to the reference time. The control device 10 then determines the position at which the UAV 30 will land on the ship 40 based on the prediction of the route R1 and the position information of the UAV 30.
  • With the control system 1 configured in this way, goods can be transported reliably and easily to the ship 40 by the UAV 30, provided the ship 40 carries a mobile terminal 20 capable of communicating with the control device 10, even if the ship 40 has no equipment, such as a platform, capable of communicating directly with the UAV 30.
  • Each component of the control system 1 will now be described.
  • FIG. 2 is a diagram showing a hardware configuration of the control device 10.
  • The illustrated configuration is an example; other configurations may be used.
  • The control device 10 may be connected to a database (not shown) to form part of a larger system. The control device 10 may be a general-purpose computer such as a workstation or a personal computer, or may be logically realized by cloud computing.
  • The control device 10 includes at least a control unit 11, a memory 12, a storage 13, a communication unit 14, and an input/output unit 15, which are electrically connected to one another through a bus 16.
  • The control unit 11 is an arithmetic unit that controls the operation of the entire control device 10, controls the transmission and reception of data between the elements, and performs the information processing necessary for application execution, authentication processing, and the like.
  • The control unit 11 is, for example, a CPU (Central Processing Unit), and performs each kind of information processing by executing programs stored in the storage 13 and expanded into the memory 12.
  • The memory 12 includes a main memory composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary storage composed of a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • The memory 12 is used as a work area of the control unit 11 and also stores the BIOS (Basic Input/Output System) executed when the control device 10 starts, various setting information, and the like.
  • The storage 13 stores various programs such as application programs. A database storing the data used in each process may be built in the storage 13.
  • The communication unit 14 connects the control device 10 to the network and/or a blockchain network. The communication unit 14 may include a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • The input/output unit 15 comprises information input devices such as a keyboard and a mouse and output devices such as a display.
  • The bus 16 is commonly connected to each of the above elements and transmits, for example, address signals, data signals, and various control signals.
  • FIG. 3 is a diagram showing a hardware configuration of the mobile terminal 20. The illustrated configuration is an example; other configurations may be used.
  • The mobile terminal 20 includes at least a control unit 21, a memory 22, a storage 23, a communication unit 24, and an input/output unit 25. The mobile terminal 20 may further include at least one of a sensor group 26, a light emitter 27, and a camera 28. These are electrically connected to one another through a bus 29. Since the functions of the control unit 21 through the input/output unit 25 and the bus 29 are the same as those of the corresponding components of the control device 10, their description is omitted.
  • The sensor group 26 comprises the various sensors provided in the mobile terminal 20. The number and types of sensors are not particularly limited, provided there is at least one. These sensors may be built into the mobile terminal 20, attached externally, or be removable from it.
  • The sensor group 26 is composed of, for example, at least one of a positioning sensor 261, a geomagnetic sensor 262, and an inertial sensor 263, and may also include other sensors not described here.
  • The positioning sensor 261 is an example of a position information sensor, and has the function of measuring the position of the mobile terminal 20 and generating its position information. The positioning sensor 261 is, for example, a GPS sensor and/or a GNSS sensor, and may also measure the altitude of the mobile terminal 20.
  • The position information sensor is not limited to these examples; any sensor capable of obtaining the position information of the mobile terminal 20 may be used. Using a GPS sensor and/or a GNSS sensor as the positioning sensor 261 allows the position of the mobile terminal 20 to be grasped with high accuracy.
  • The geomagnetic sensor 262 is an example of a sensor that estimates the orientation of the mobile terminal 20 in the horizontal direction. It measures the heading of the mobile terminal 20 and estimates its horizontal orientation. The geomagnetic sensor 262 may be a so-called electronic compass, and may also have the function of estimating the inclination of the mobile terminal 20.
  • The inertial sensor 263 has the function of measuring the acceleration of the mobile terminal 20 and generating information related to the acceleration. The inertial sensor 263 is, for example, an acceleration sensor or a gyro sensor.
  • The light emitter 27 is a device that emits light from the mobile terminal 20, for example an LED. The light emitter 27 may be controlled by the control unit 21 so as to blink in a predetermined pattern.
  • The camera 28 is an example of a vision/image sensor, and is an imaging device provided in the mobile terminal 20. The imaging device may be an ordinary camera, an infrared camera for night vision, or the like. The images captured by the camera 28 may be image-processed by the camera 28, the control unit 21, or the like.
  • FIG. 4 is a functional block diagram of the UAV 30 according to the first embodiment of the present disclosure. For simplicity, the functional block diagram is described as if contained in a single device (the flying object), but some of its functions may be implemented in an external device (for example, the control device 10) or logically realized using cloud computing technology.
  • The flight controller 31 can have one or more processors, such as a programmable processor (for example, a central processing unit (CPU)).
  • The flight controller 31 has, and can access, a memory 311. The memory 311 stores logic, code, and/or program instructions that the flight controller 31 can execute to perform one or more steps. The memory 311 may include, for example, a separable medium such as an SD card, random access memory (RAM), or an external storage device.
  • Data acquired from an external device 35 such as the camera 351 or the sensor 352 may be transmitted directly to, and stored in, the memory 311. For example, still image and moving image data taken by the camera are recorded in internal or external memory. The external device 35 is mounted on the flying object via a gimbal 34.
  • The flight controller 31 includes a control module 312 configured to control the state of the flying object. For example, the control module 312 adjusts the spatial arrangement, velocity, and/or acceleration of the flying object, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz), by controlling the propulsion mechanism (motor 37, etc.) of the flying object via the ESC 36. The motor 37 rotates the propeller 38 to generate lift. The control module 312 can also control one or more of the states of the mounting unit and the sensors.
  • The flight controller 31 can communicate with a communication unit 33 configured to transmit and/or receive data to and from one or more external devices (for example, a transmitter/receiver (radio controller), a terminal, a display device, or another remote controller). The transmitter/receiver can use any suitable communication means, such as wired or wireless communication. In the present embodiment, since the flying object is assumed to fly autonomously, manual operation via an external device such as a radio controller can be omitted.
  • The communication unit 33 can use one or more of a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunications network, cloud communication, and the like.
  • The communication unit 33 can transmit and/or receive one or more of the data acquired by the camera and the various sensors, processing results generated by the flight controller 31, predetermined control data, user commands from a terminal or remote controller, and the like.
  • The sensors according to the present embodiment may include an inertial sensor (acceleration sensor, gyro sensor), a positioning sensor (GPS sensor), a proximity sensor (for example, lidar), a geomagnetic sensor, and/or a vision/image sensor (for example, a camera).
  • FIG. 5 is a block diagram showing the functions of the control unit 11 and the storage 13 of the control device 10 according to the first embodiment of the present disclosure. The functions and processes of the control device 10 according to the present embodiment will be described with reference to FIG. 5.
  • The control unit 11 includes a first position information acquisition unit 111, a moving body position prediction unit 112, a second position information acquisition unit 113, a landing position determination unit 114, and a flight control unit 115. The storage 13 holds a position prediction model 131, a traffic information database 132, and an environmental information database 133.
  • The first position information acquisition unit 111 has the function of acquiring the position information of the mobile terminal 20. The position information of the mobile terminal 20 is generated by the positioning sensor 261. Since the mobile terminal 20 is carried on the moving ship 40, acquiring this position information yields the position information (route) of the ship 40.
  • The position information of the mobile terminal 20 is continuously transmitted to the control device 10 via the communication unit 24 of the mobile terminal 20, and the first position information acquisition unit 111 acquires it via the communication unit 14.
  • The first position information acquisition unit 111 acquires the history of the position information of the mobile terminal 20 before a reference time. The reference time referred to here is the time serving as the basis for predicting the route of the ship 40. The reference time is not particularly limited; it may be, for example, the time at which the flight of the UAV 30 starts, or a time during the flight of the UAV 30 at which the route of the ship 40 is predicted again.
  • The first position information acquisition unit 111 outputs the acquired position information of the mobile terminal 20 to the moving body position prediction unit 112.
  • the mobile body position prediction unit 112 has a function of predicting the position of the ship 40 at a time point after the reference time based on the history of the position information of the mobile terminal 20 before the reference time. That is, the mobile body position prediction unit 112 predicts the possible route R1 of the ship 40 that has shifted to the reference time, based on the route R2 of the ship 40 that corresponds to the history of the position information of the mobile terminal 20.
  • the moving body position prediction unit 112 is an example of a landing target position prediction unit.
  • the moving body position prediction unit 112 predicts the position of the ship 40 by using, for example, the position prediction model 131 stored in the storage 13. That is, the moving body position prediction unit 112 inputs the history of the position information of the mobile terminal 20 into the position prediction model 131 as an input value, and outputs the output value output from the position prediction model 131 as the predicted position of the ship 40.
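  • Below is a minimal sketch of this prediction interface, assuming a constant-velocity extrapolation in place of the position prediction model 131 (whose internals the disclosure leaves open). All names, types, and units are illustrative, not taken from the patent.

        from dataclasses import dataclass

        @dataclass
        class Fix:
            t: float    # UNIX time [s]
            lat: float  # latitude [degrees]
            lon: float  # longitude [degrees]

        def predict_position(history: list[Fix], t_future: float) -> Fix:
            """Predict the terminal (= ship 40) position at t_future from the
            history of position fixes recorded before the reference time."""
            if len(history) < 2:
                raise ValueError("need at least two fixes to extrapolate")
            a, b = history[-2], history[-1]   # last two fixes before the reference time
            dt = b.t - a.t
            v_lat = (b.lat - a.lat) / dt      # degrees per second
            v_lon = (b.lon - a.lon) / dt
            h = t_future - b.t                # prediction horizon [s]
            return Fix(t_future, b.lat + v_lat * h, b.lon + v_lon * h)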
  • The position prediction model 131 may be based on, for example, traffic information related to the movement of the ship 40; that is, the moving body position prediction unit 112 may predict the position of the ship 40 based on such traffic information.
  • The traffic information referred to here may be, for example, the navigation history and operation data of the ship 40, or the navigation history and operation data of other vessels navigating the same port as the ship 40. The traffic information may also be macro data such as the traffic volume in the port at a given date and time, or data on the demand for using the port or a dock.
  • When the moving body is an automobile, the traffic information may be road information, congestion information, information on events or demand in the areas the automobile passes through, and the like.
  • This traffic information may be stored in the traffic information database 132 in the storage 13 of the control device 10, or may be acquired by the control device 10 from the mobile terminal 20 or an external server.
  • The position prediction model 131 may also be based on information about the environment around the ship 40; that is, the moving body position prediction unit 112 may predict the position of the ship 40 based on such environmental information.
  • The environmental information referred to here may be, for example, information on the tidal currents, wind speed, wind direction, tide level, or weather of the port in which the ship 40 navigates. The environmental information may also be geographic information on the area including the port and information on changes in the weather.
  • This environmental information may be stored in the environmental information database 133 in the storage 13 of the control device 10, or may be acquired by the control device 10 from the mobile terminal 20 or an external server.
  • By using such information, the position of the ship 40 can be predicted with higher accuracy.
  • To build the position prediction model 131, general machine learning methods (supervised or unsupervised) can be used, for example techniques such as regression analysis, predictive filters, neural networks, or deep learning. The position prediction model 131 can be constructed by training with these methods on route information and, where available, on traffic information and/or environmental information together with the actually travelled routes.
  • The moving body position prediction unit 112 may also predict the position of the ship 40 based on the information related to the acceleration of the mobile terminal 20 obtained by its inertial sensor 263. Since the mobile terminal 20 is on board, the acceleration of the mobile terminal 20 corresponds to the acceleration generated by the movement of the ship 40.
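  • As one concrete example of the "predictive filter" option named above, the following sketch runs a constant-velocity Kalman filter per horizontal axis and extrapolates the filtered state past the reference time. The matrices and noise levels are assumptions; the disclosure only names the class of technique. An acceleration measurement from the inertial sensor 263 could be folded in by extending the state vector with an acceleration component.

        import numpy as np

        class CVKalman:
            """Constant-velocity Kalman filter for one horizontal axis [m]."""

            def __init__(self, dt: float, q: float = 1e-3, r: float = 25.0):
                self.F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
                self.H = np.array([[1.0, 0.0]])             # we observe position only
                self.Q = q * np.eye(2)                      # process noise (assumed)
                self.R = np.array([[r]])                    # GNSS noise (assumed)
                self.x = np.zeros(2)                        # [position, velocity]
                self.P = np.eye(2) * 1e3                    # initial uncertainty

            def step(self, z: float) -> None:
                # predict
                self.x = self.F @ self.x
                self.P = self.F @ self.P @ self.F.T + self.Q
                # update with one position fix from the terminal
                y = z - self.H @ self.x
                S = self.H @ self.P @ self.H.T + self.R
                K = self.P @ self.H.T @ np.linalg.inv(S)
                self.x = self.x + K @ y
                self.P = (np.eye(2) - K @ self.H) @ self.P

            def predict_ahead(self, horizon_s: float) -> float:
                # extrapolate the filtered state to a time after the reference time
                return float(self.x[0] + self.x[1] * horizon_s)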
  • The moving body position prediction unit 112 outputs the prediction result for the position of the ship 40 to the landing position determination unit 114.
  • The second position information acquisition unit 113 has the function of acquiring the position information of the UAV 30. The position information of the UAV 30 is generated by, for example, a positioning sensor 313 (GPS sensor, GNSS sensor, etc.) mounted on the UAV 30, as described above, and is transmitted to the control device 10 via the communication unit 33 of the UAV 30.
  • The position information of the UAV 30 may also be acquired by other methods. For example, information relating to the position of the UAV 30 may be derived from the strength of the radio waves received from the UAV 30. Alternatively, the position of the UAV 30 may be estimated by inertial navigation based on the inertial sensor 314 and/or the geomagnetic sensor 315 of the UAV 30, and information relating to the position may be obtained from the estimation result. Information relating to the position of the UAV 30 may also be obtained using any combination of the positioning sensor 313, the inertial sensor 314, and the geomagnetic sensor 315. Further, when the UAV 30 is parked at a predetermined port or dock, the position information may be the position information of that port or dock.
  • The second position information acquisition unit 113 outputs the acquired position information of the UAV 30 to the landing position determination unit 114.
  • The landing position determination unit 114 has the function of determining the position at which to land the UAV 30 on the ship 40, based on the information relating to the predicted position of the ship 40 at time points after the reference time and on the position information of the UAV 30.
  • For example, the landing position determination unit 114 determines at least one point on the predicted route of the ship 40 (its predicted positions after the reference time) as the landing position of the UAV 30. In doing so, it determines an appropriate landing position based at least on the position of the UAV 30: for example, a position that the UAV 30, starting from its position at a given time, can reach by the scheduled landing time. The landing position determination unit 114 may also take the operating conditions of the UAV 30 into account.
  • The operating conditions include, for example, the flight speed of the UAV 30 (maximum speed, average speed, etc.), the load weight of the goods transported by the UAV 30, its flight distance and flight time, weather information such as the weather, wind direction, and wind speed in the flight area of the UAV 30, and the navigation status of other UAVs in the flight area.
  • The landing position determination unit 114 may determine the landing position of the UAV 30 so as to select a route that optimizes a flight target of the UAV 30. The flight target is, for example, the flight time from the start of flight to landing, the flight distance from the start of flight to landing, or the energy consumption of the UAV 30 required from the start of flight to landing.
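  • A simple way to realize this determination is to scan the predicted route for the earliest point the UAV can reach no later than the ship does. The sketch below assumes flat-earth coordinates in metres and a fixed cruise speed; both are simplifications not specified in the disclosure.

        import math

        def first_reachable(route, uav_xy, uav_speed, t_now):
            """route: list of (t, x, y) predicted ship positions (seconds, metres).
            Returns the first waypoint the UAV can reach by the time the ship
            arrives there, or None if no point on the route is reachable."""
            for t, x, y in route:
                dist = math.hypot(x - uav_xy[0], y - uav_xy[1])
                if t_now + dist / uav_speed <= t:  # UAV arrives before the ship
                    return (t, x, y)
            return None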
  • FIG. 6 is a schematic view showing an example of the landing position determination process by the landing position determination unit 114 according to the present embodiment.
  • The optimization of the flight target may be, for example, the minimization of the flight distance of the UAV 30. In this case, the landing position determination unit 114 may determine the landing position M2 by estimating the flight path F2 with the shortest flight distance.
  • The optimization of the flight target may also be the minimization of the flight time of the UAV 30. As shown in FIG. 6, for example, if the wind in the flight area blows from the right side to the left side of the figure, flying with the wind increases the flight speed and shortens the flight time. The landing position determination unit 114 may therefore determine the landing position M3 by estimating the flight path F3 as the route that minimizes the flight time.
  • The optimization of the flight target may also be the minimization of the energy required for the flight of the UAV 30. The energy utilization may be, for example, the battery consumption required from the start of flight of the UAV 30 to landing, and is determined by at least one of, for example, the flight time, flight speed, flight distance, and load weight of the transported goods. The landing position determination unit 114 may determine the landing position M4 by estimating the flight path F4 that minimizes the energy utilization in view of these factors.
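  • The distance, time, and energy objectives can also be combined into one weighted cost over candidate landing positions, as in the sketch below. The power model (a base draw plus a payload term) and the treatment of wind as a single along-track component are assumptions made for illustration; the patent only names the objectives.

        import math

        def landing_cost(dist_m, uav_speed, wind_along, payload_kg,
                         w_time=1.0, w_energy=0.5):
            ground_speed = max(uav_speed + wind_along, 0.1)  # tail/headwind component
            t_flight = dist_m / ground_speed                 # flight-time objective
            power_w = 150.0 + 40.0 * payload_kg              # assumed power model
            energy_kj = power_w * t_flight / 1000.0          # energy objective
            return w_time * t_flight + w_energy * energy_kj

        def best_landing(candidates, uav_xy, uav_speed, wind_along, payload_kg):
            """candidates: list of (x, y) points on the predicted route [m]."""
            return min(candidates, key=lambda p: landing_cost(
                math.hypot(p[0] - uav_xy[0], p[1] - uav_xy[1]),
                uav_speed, wind_along, payload_kg))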
  • The landing position determination unit 114 may also determine the landing position by estimating, among the flight paths the UAV 30 can fly, a flight path that allows it to land on the ship 40. It may further determine the landing position on the condition that several of the above flight targets are optimized together. Further, for example, it may determine, as the landing position, a position where the flight path of the UAV 30 intersects the predicted route of the ship 40; since the flight path of the UAV 30 then does not have to follow the movement of the ship 40, the UAV 30 can be brought close to the ship 40 efficiently.
  • In addition to the flight target of the UAV 30, the landing position determination unit 114 may choose, as the landing position, any position in the predicted route of the ship 40 that is suitable for landing. A position suitable for landing may be, for example, a position outside the area designated as a harbor, or a position where the wave height is lower than at other positions. Such a position may be determined by the landing position determination unit 114 based on information about the environment around the ship 40 (for example, geographic or weather information) or on traffic information related to the movement of the ship 40 (for example, the navigation information of other ships).
  • The landing position determination unit 114 outputs information on the determined landing position of the UAV 30 to the flight control unit 115.
  • The flight control unit 115 controls the flight of the UAV 30 based on the acquired information on the landing position. For example, the flight control unit 115 transmits the information on the landing position to the UAV 30 and controls the flight of the UAV 30 so that it flies to the landing position. The flight control unit 115 may also generate information on a flight path to the landing position and transmit it to the UAV 30, thereby controlling the UAV 30 to fly to the landing position along that path.
  • The route from the start of flight of the UAV 30 to the landing position may be set by the flight control unit 115 or by the UAV 30. In the latter case, the flight control unit 115 transmits the information about the landing position to the UAV 30, and the UAV 30 sets the route to the landing position autonomously or based on instructions from a predetermined server, which may include the control device 10. In the former case, the flight path may be set by, for example, a program related to flight paths stored in the storage 13 of the control device 10.
  • FIG. 7 is a flowchart of the flight control of the UAV 30 in the control system 1 according to the present embodiment.
  • First, the first position information acquisition unit 111 of the control device 10 acquires the position information before the reference time from the mobile terminal 20 (step SQ101). Such position information may be acquired sequentially, or the history of position information before the reference time may be acquired at once (or intermittently).
  • Next, the moving body position prediction unit 112 of the control device 10 predicts the route of the ship 40 after the reference time, that is, the position of the ship 40 at time points after the reference time (step SQ103). This prediction may be performed once for the reference time, or may be repeated while updating the reference time sequentially.
  • Next, the second position information acquisition unit 113 of the control device 10 acquires the position information of the UAV 30 (step SQ105). This position information may be acquired sequentially.
  • Next, the landing position determination unit 114 of the control device 10 determines the landing position of the UAV 30 based on the acquired route prediction information, the position information of the UAV 30, and the like (step SQ107). The landing position determination may be repeated sequentially; for example, the determined landing position may be changed even during the flight of the UAV 30.
  • Next, the flight control unit 115 of the control device 10 transmits the information on the landing position and the like to the UAV 30 (step SQ109). On receiving this information, the UAV 30 flies to the landing position based on it (step SQ111).
  • The processing of at least one of these steps may be repeated as appropriate even after the UAV 30 has started flying. This makes it possible to bring the landing position of the UAV 30 closer to a more appropriate position.
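  • One possible shape of this loop, with steps SQ101 to SQ111 repeated so the landing position can be refined during flight, is sketched below. The terminal, predictor, decider, and uav objects and their methods are placeholders for whatever transport the communication units 14, 24, and 33 actually provide.

        import time

        def control_loop(terminal, uav, predictor, decider, period_s=5.0):
            while not uav.has_landed():
                history = terminal.get_position_history()   # SQ101
                route = predictor.predict(history)          # SQ103
                uav_pos = uav.get_position()                # SQ105
                landing = decider.decide(route, uav_pos)    # SQ107
                uav.send_landing_position(landing)          # SQ109 (UAV flies: SQ111)
                time.sleep(period_s)  # repeat to refine the landing position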
  • As described above, with the control system 1 according to the present embodiment, the landing position of the UAV 30 in the navigation area of the ship 40 can be determined, and the UAV 30 can be landed at an appropriate position, even if the position of the ship 40 cannot be grasped directly. Further, by predicting the position of the moving ship 40 with high accuracy, the UAV 30 can be brought close to the ship 40 more efficiently and landed on it.
  • In the first embodiment, the landing position determination unit 114 determines the landing position from a macroscopic point of view within the area in which the moving ship 40 navigates. The landing position determination unit 114 according to the present embodiment additionally has the function of determining a more microscopic landing position on the deck of the ship 40 once the UAV 30, having approached the landing position determined from that macroscopic viewpoint, is about to land on the ship 40.
  • FIG. 8 is a diagram for explaining the process of determining the landing position of the UAV 30 on the ship 40 when the control system 1 according to the second embodiment of the present disclosure is used. FIG. 8 shows the case where the UAV 30, as described in the above embodiment, has approached the predicted position of the ship 40 at a time after the reference time and then tries to land on the deck of the ship 40.
  • The mobile terminal 20 is carried on the ship 40. In the illustrated example the mobile terminal 20 is placed on the deck, but the placement is not limited to this; the mobile terminal 20 may, for example, be held by a person on board the ship 40. Note that the mobile terminal 20 and the UAV 30 do not communicate with each other directly.
  • FIG. 9 is a diagram for explaining an example of the determination process by the landing position determination unit 114 according to the present embodiment.
  • When the mobile terminal 20 has a geomagnetic sensor 262, the landing position determination unit 114 may acquire information relating to the orientation of the mobile terminal 20 and determine the position at which to land the UAV 30 based on that information. For example, the landing position determination unit 114 may determine, as the landing position, any position extending from the mobile terminal 20 in the direction D1. Thereby, even in a situation where the mobile terminal 20 and the UAV 30 cannot communicate directly, the landing position on the deck of the ship 40 can easily be adjusted using the mobile terminal 20.
  • In this way, the UAV 30 can be landed at an appropriate position on the ship 40 even when the ship 40 is not moving.
  • FIG. 10 is a diagram for explaining another example of the determination process by the landing position determination unit 114 according to the present embodiment.
  • As shown in FIG. 10, the landing position determination unit 114 may determine, as the landing position of the UAV 30, the position L2 located a predetermined distance D2 from the position of the mobile terminal 20 in the direction the mobile terminal 20 is oriented. The predetermined distance D2 may be adjusted from, for example, the mobile terminal 20, or may be a value set in advance in the control device 10. This enables finer control of the landing position of the UAV 30.
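  • Deriving the landing position L2 of FIG. 10 amounts to offsetting the terminal's position by the distance D2 along the heading reported by the geomagnetic sensor 262. A flat-earth approximation, valid for small offsets, is sketched below; the function name and constants are illustrative.

        import math

        EARTH_R = 6_371_000.0  # mean Earth radius [m]

        def offset_position(lat_deg, lon_deg, heading_deg, d2_m):
            """Point d2_m metres from (lat, lon) along heading_deg
            (0 = north, 90 = east)."""
            brg = math.radians(heading_deg)
            dlat = d2_m * math.cos(brg) / EARTH_R
            dlon = d2_m * math.sin(brg) / (EARTH_R * math.cos(math.radians(lat_deg)))
            return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)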
  • FIGS. 11 and 12 are diagrams for explaining another example of the determination process by the landing position determination unit 114 according to the present embodiment.
  • When the mobile terminal 20 has a light emitter 27, the landing position determination unit 114 may determine the landing position of the UAV 30 based on the light 60 emitted from the light emitter 27. Specifically, the emitted light 60 is imaged by the camera 351 (an example of the imaging device) mounted on the UAV 30, the control device 10 acquires the captured image, and the landing position determination unit 114 may determine the landing position based on the image of the emitted light 60 contained in it.
  • As shown in the figure, an image P1 is obtained when the camera 351 of the UAV 30 captures the light emitted by the light emitter 27 of the mobile terminal 20. By extracting the emitted light from the image P1, the UAV 30 can grasp the position of the mobile terminal 20. Thereby, the UAV 30 can be guided, and landed at an appropriate position, even though the mobile terminal 20 and the UAV 30 are not in direct communication.
  • The light emission pattern of the light emitter 27 is not particularly limited. For example, the landing position determination unit 114 may adjust the landing position according to the light emission pattern of the light emitter 27.
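  • Locating the light emitter 27 in frames from the camera 351 can be as simple as thresholding each greyscale frame and taking the centroid of the bright pixels, accepting the detection only if the spot blinks with the expected period. The threshold and the pattern check below are assumptions; the disclosure does not specify the image processing.

        import numpy as np

        def bright_centroid(gray: np.ndarray, thresh: int = 240):
            """Return the (x, y) centroid of pixels >= thresh, or None if absent."""
            ys, xs = np.nonzero(gray >= thresh)
            if xs.size == 0:
                return None                  # emitter not visible in this frame
            return float(xs.mean()), float(ys.mean())

        def matches_blink(visibility: list, expected_period: int) -> bool:
            """visibility: per-frame booleans (emitter seen / not seen).
            Accept if the on/off runs last roughly expected_period frames."""
            runs, count = [], 1
            for a, b in zip(visibility, visibility[1:]):
                if a == b:
                    count += 1
                else:
                    runs.append(count)
                    count = 1
            runs.append(count)
            return abs(float(np.median(runs)) - expected_period) <= 1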
  • Conversely, the camera 28 of the mobile terminal 20 may capture an image of the UAV 30 and transmit the captured image to the control device 10, and the landing position determination unit 114 may determine the landing position of the UAV 30 based on that captured image. This also makes it possible to finely adjust the landing position of the UAV 30.
  • The landing position determination unit 114 may also determine the landing position of the UAV 30 based on information relating to the height of the mobile terminal 20. For example, the height of the ship 40 at sea may change significantly because of waves. By using the information related to the height of the mobile terminal 20, the landing position determination unit 114 can determine a landing position (including its height component) that follows the change in the height of the ship 40, which makes it possible to land the UAV 30 smoothly.
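  • One way to use the terminal's reported height during the final descent is to gate the descent on the deck being near the low point of its recent swing, so that touchdown does not coincide with a rising deck. The window length and margin below are assumptions for illustration.

        from collections import deque

        class DeckHeightGate:
            def __init__(self, window: int = 50, margin_m: float = 0.3):
                self.samples = deque(maxlen=window)  # recent deck heights [m]
                self.margin_m = margin_m

            def update(self, height_m: float) -> None:
                self.samples.append(height_m)

            def ok_to_descend(self) -> bool:
                if len(self.samples) < self.samples.maxlen:
                    return False             # not enough wave history yet
                # descend while the deck sits near its recent minimum
                return self.samples[-1] <= min(self.samples) + self.margin_m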
  • In this way, the landing position determination unit 114 determines a more precise landing position on the ship 40, and the information relating to the determined landing position is output to the flight control unit 115. The flight control unit 115 then controls the UAV 30 so that it flies to this precise landing position.
  • As described above, with the control system 1 according to the present embodiment, not only the landing position of the UAV 30 in the navigation area of the ship 40 but also a more precise landing position on the ship 40 itself can be determined, even when the position of the ship 40 cannot be grasped directly.
  • The devices described in the present specification may be realized as a single device, or may be realized by a plurality of devices (for example, cloud servers) partially or wholly connected by a network. For example, the control unit 11 and the storage 13 of the control device 10 may be realized by different servers connected to each other by a network.
  • The series of processes by the devices described in the present specification may be realized using software, hardware, or a combination of software and hardware. A computer program for realizing each function of the control device 10 according to the present embodiment can be created and implemented on a PC or the like, and a computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program may also be distributed via a network, without using a recording medium.
  • The flying object of the present disclosure can be expected to be used as an industrial flying object in surveying, investigation, observation, and the like. The flying object of the present disclosure can also be used in aircraft-related industries such as multicopters and drones; furthermore, the present disclosure can contribute to improving the safety of such flying objects and their flights.
  • 1 Control system; 10 Control device; 20 Mobile terminal; 30 UAV; 40 Ship; 111 First position information acquisition unit; 112 Moving body position prediction unit; 113 Second position information acquisition unit; 114 Landing position determination unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The problem addressed by the present invention is to provide a control device for an unmanned aerial vehicle with which goods can easily be transported by UAV even to a landing site that is not equipped with a specific platform. To this end, the control device for an unmanned aerial vehicle according to the present invention is provided with: a first position information acquisition unit for acquiring position information for a terminal provided at a landing site, the position information being obtained by means of a position information sensor arranged on the terminal; a second position information acquisition unit for acquiring position information for an aerial vehicle; a landing site position prediction unit for predicting the position of the landing site at a time after a reference time, on the basis of a history of the position information for the terminal up to the reference time; and a landing position determination unit for determining, on the basis of the position information for the aerial vehicle and the information concerning the predicted position of the landing site at the time after the reference time, the position where the aerial vehicle is to land at the landing site.
PCT/JP2019/033176 2019-08-23 2019-08-23 Unmanned aerial vehicle control device, unmanned aerial vehicle control system, and unmanned aerial vehicle control method Ceased WO2021038667A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/033176 WO2021038667A1 (fr) 2019-08-23 2019-08-23 Unmanned aerial vehicle control device, unmanned aerial vehicle control system, and unmanned aerial vehicle control method


Publications (1)

Publication Number Publication Date
WO2021038667A1 (fr) 2021-03-04

Family

ID=74685398

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/033176 Ceased WO2021038667A1 (fr) 2019-08-23 2019-08-23 Unmanned aerial vehicle control device, unmanned aerial vehicle control system, and unmanned aerial vehicle control method

Country Status (1)

Country Link
WO (1) WO2021038667A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240089908A1 (en) * 2020-08-20 2024-03-14 I911 International, Inc. System for intelligent first responder assignment to a water vessel
US12004119B2 (en) 2020-08-20 2024-06-04 I911 International, Inc. System for accurate location estimation of a water vessel

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007526175A (ja) * 2004-03-02 2007-09-13 ノースロップ グラマン コーポレイション 自動収集管理装置
JP2015074277A (ja) * 2013-10-07 2015-04-20 株式会社 ミックウェア 飛行体制御システム、端末装置、情報処理方法、およびプログラム
US20160122038A1 (en) * 2014-02-25 2016-05-05 Singularity University Optically assisted landing of autonomous unmanned aircraft
WO2017001384A1 (fr) * 2015-06-29 2017-01-05 Sabic Global Technologies B.V. Utilisation d'une composition d'initiateur de radical libre pour la réduction de gels dans des matériaux de polyéthylène
US20180046179A1 (en) * 2016-08-10 2018-02-15 Lg Electronics Inc. Mobile terminal and method of controlling the same
JP2018094983A (ja) * 2016-12-09 2018-06-21 Kddi株式会社 飛行装置、通報方法及びプログラム
JP2018112495A (ja) * 2017-01-12 2018-07-19 イームズロボティクス株式会社 飛行システム、飛行管理方法及び飛行プログラム
JP2019043473A (ja) * 2017-09-06 2019-03-22 Kddi株式会社 飛行装置、管理装置、撮影制御方法、及び撮影制御プログラム
JP2019067252A (ja) * 2017-10-03 2019-04-25 株式会社トプコン 経路選定装置、無人航空機、データ処理装置、経路選定処理方法および経路選定処理用プログラム
JP2019121144A (ja) * 2017-12-29 2019-07-22 井関農機株式会社 農作業支援用の飛行体および農作業支援システム


Similar Documents

Publication Publication Date Title
US11835953B2 (en) Adaptive autonomy system architecture
AU2018355071B2 (en) UAV group charging based on demand for UAV service
CN110226143B (zh) Method for a leading drone
JP6980600B2 (ja) Base device, base device control method, and base device control program
US20210109546A1 (en) Predictive landing for drone and moving vehicle
US8494697B2 (en) Methods and systems for predicting water vessel motion
US10133281B1 (en) Leading drone system
CN111295627B (zh) Underwater pilot drone system
CN110234571A (zh) Relay drone method
CN108287542A (zh) UAV and unmanned ship cooperative control system and method based on collaborative cloud control
EP4180767B1 (fr) Planification d'itinéraire pour un véhicule terrestre à travers un terrain non familier
US12148307B2 (en) Computation load distribution
US20240176367A1 (en) Uav dispatching method, server, dock apparatus, system, and storage medium
JP2021117502A (ja) Landing control device, landing control method, and program
Andrade et al. Autonomous UAV surveillance of a ship's path with MPC for maritime situational awareness
JP7679097B2 (ja) Work plan generation system
WO2021038667A1 (fr) Unmanned aerial vehicle control device, unmanned aerial vehicle control system, and unmanned aerial vehicle control method
JP7688369B1 (ja) Control system, control method, and program
JP2021118364A (ja) Communication control device, communication control method, and program
Andrade Real-time and offline path planning of Unmanned Aerial Vehicles for maritime and coastal applications
JP2025091726A (ja) Work determination system, work determination method, and program
WO2021038622A1 (fr) Control system for unmanned moving body and method for controlling unmanned moving body
Ortiz et al. A Micro Aerial Vehicle for Vessel Visual Inspection Assistance

Legal Events

Code — Description
121 — EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 19942661; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)
NENP — Non-entry into the national phase (Ref country code: JP)
122 — EP: PCT application non-entry in European phase (Ref document number: 19942661; Country of ref document: EP; Kind code of ref document: A1)