WO2025052526A1 - Imaging system using an unmanned aerial vehicle, imaging method, and program - Google Patents
Imaging system using an unmanned aerial vehicle, imaging method, and program
- Publication number
- WO2025052526A1 (PCT application PCT/JP2023/032286)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unmanned aerial
- aerial vehicle
- drone
- formation
- athlete
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C13/00—Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
- B64C13/02—Initiating means
- B64C13/16—Initiating means actuated automatically, e.g. responsive to gust detectors
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C19/00—Aircraft control not otherwise provided for
- B64C19/02—Conjoint controls
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/30—Constructional aspects of UAVs for safety, e.g. with frangible components
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
Description
- This disclosure relates to technology using unmanned aerial vehicles (UAVs) such as drones.
- Patent Document 1 describes a system and method for capturing still and/or moving scenes using a multiple camera network.
- In Patent Document 1, there is technology for controlling the positions and orientations of multiple unmanned aerial vehicles flying in formation so as to surround a subject to be photographed, such as an athlete.
- Conventionally, competitions are filmed using cameras installed at fixed positions along the course or, in events such as marathons, using cameras that travel alongside the athletes.
- With fixed cameras, the timing and range that can be filmed are limited, a large number of cameras is required, and it is difficult to follow the athletes at all times.
- With cameras that travel alongside the athletes, it is difficult to follow the athletes in events held on unpaved courses (for example, snowboarding) or events in which the athletes move at high speed (for example, motor races).
- In recent years, unmanned aerial vehicles such as drones have become more advanced, and it is now possible to film athletes from the air using manned or unmanned aircraft.
- This method of aerial photography makes it possible to film a wide area of the course, eliminating the need for multiple cameras. Even in competitions held on unpaved courses or in which the athletes move at high speed, it is possible to continuously film the athletes' movements and expressions as long as the aircraft can keep following them.
- Such an aerial photography method uses unmanned aerial vehicles to photograph the athletes and controls the formation flight of multiple unmanned aerial vehicles so that they follow the athletes, thereby enabling the acquisition of footage with preferred shooting angles and the like.
- the above-mentioned method of aerial photography using unmanned aerial vehicles also poses the following challenges. First and most importantly, safety must be ensured. It is necessary to reduce the risk of contact or collision so that unmanned aerial vehicles do not come into contact with or collide with athletes or nearby spectators. Formation flight control is required to ensure this safety. For example, even if an unmanned aerial vehicle goes into an abnormal state, it is necessary to be able to more reliably avoid contact or collision with athletes.
- the purpose of this disclosure is to provide aerial photography technology using the unmanned aerial vehicle described above that can ensure safety and improve the quality of captured footage when following and filming athletes.
- Typical embodiments of the present disclosure include a photography system, photography method, etc., and are characterized by the following configurations:
- A filming system using unmanned aerial vehicles, comprising a plurality of unmanned aerial vehicles that fly following a filming target, each of the plurality of unmanned aerial vehicles photographing the filming target with a camera to provide filmed footage, the plurality of unmanned aerial vehicles flying in formation flight positions with a predetermined positional relationship to the filming target, at least one first unmanned aerial vehicle among the plurality of unmanned aerial vehicles controlling its flight so that at least one parameter value among the relative position, relative distance, absolute position, relative speed, and absolute speed with respect to the filming target satisfies a predetermined condition, an unmanned aerial vehicle that does not satisfy the predetermined condition being determined to have a formation flight abnormality, and an evacuation action being taken to move the unmanned aerial vehicle with the formation flight abnormality away from the filming target.
- the evacuation action includes at least movement in a direction increasing the altitude.
- the first unmanned aerial vehicle acquires position information of the subject to be photographed, and controls the position of the first unmanned aerial vehicle so that the relative position or relative distance of the first unmanned aerial vehicle to the position of the subject to be photographed satisfies a target value or target range.
- An imaging system in which the first unmanned aerial vehicle acquires information on the position and moving direction of the subject to be imaged, and controls the position of the first unmanned aerial vehicle so that the relative position or relative distance of the first unmanned aerial vehicle to the position and moving direction of the subject to be imaged satisfies a target value or target range.
- An imaging system which determines that the first unmanned aerial vehicle is in the formation flight abnormality when the relative position or distance of the first unmanned aerial vehicle to the imaging subject does not satisfy a target value or target range, or when the difference between the absolute position of the first unmanned aerial vehicle and a target absolute position exceeds a predetermined value.
- An imaging system which determines that the first unmanned aerial vehicle is in the formation flight abnormality when the relative speed of the first unmanned aerial vehicle with respect to the subject to be imaged does not satisfy a target value or target range, or when the difference between the absolute speed of the first unmanned aerial vehicle and a target absolute speed exceeds a predetermined value.
- a second unmanned aerial vehicle among the plurality of unmanned aerial vehicles which is not the first unmanned aerial vehicle, acquires position information of the first unmanned aerial vehicle and controls the position of the second unmanned aerial vehicle so that the relative position or relative distance of the second unmanned aerial vehicle to the position of the first unmanned aerial vehicle satisfies a target value or target range.
- a second unmanned aerial vehicle among the plurality of unmanned aerial vehicles which is not the first unmanned aerial vehicle, acquires position information and information on the moving direction of the first unmanned aerial vehicle, and controls the position of the second unmanned aerial vehicle so that the relative position or relative distance of the second unmanned aerial vehicle to the position and moving direction of the first unmanned aerial vehicle satisfies a target value or target range.
- An imaging system according to any one of (1) to (10), which determines that the second unmanned aerial vehicle is in the formation flight abnormality when the relative speed of the second unmanned aerial vehicle with respect to the first unmanned aerial vehicle does not satisfy a target value or target range, or when the difference between the absolute speed of the second unmanned aerial vehicle and a target absolute speed exceeds a predetermined value.
- An imaging system according to any one of (1) to (11), which determines that the unmanned aerial vehicle in which a control communication anomaly or a positioning accuracy anomaly is detected is in the formation flight anomaly, or determines that the first unmanned aerial vehicle in which a positioning accuracy anomaly is detected for the subject to be imaged is in the formation flight anomaly.
- An imaging system according to any one of (1) to (12), which, when it is determined that the first unmanned aerial vehicle is experiencing the formation flight abnormality, causes the entire formation consisting of the plurality of unmanned aerial vehicles, including the first unmanned aerial vehicle, to perform the evacuation action.
- An imaging system according to any one of (1) to (13), which, when it is determined that the second unmanned aerial vehicle is experiencing the formation flight abnormality, causes the second unmanned aerial vehicle experiencing the formation flight abnormality to perform the evacuation action, or causes the entire formation consisting of the multiple unmanned aerial vehicles including the second unmanned aerial vehicle experiencing the formation flight abnormality to perform the evacuation action.
- An imaging system according to any one of (1) to (14), which controls the orientation of the camera of the unmanned aerial vehicle so that the subject is captured near the center of the image captured by the camera.
- An imaging system in which the direction of movement of the subject is ascertained from a sensor provided on the subject, or the direction of movement of the subject is ascertained from a sensor provided on the unmanned aerial vehicle, or the direction of movement of the subject is ascertained from a sensor provided in the environment in which the subject moves, or the direction of movement of the subject is ascertained from the image captured by the camera.
- A photographing system in which the formation flight positions of the multiple unmanned aerial vehicles include at least two of the following: a front position in a forward direction relative to the subject to be photographed, a rear position in a rearward direction relative to the subject to be photographed, a right position in a rightward direction relative to the subject to be photographed, a left position in a leftward direction relative to the subject to be photographed, and a directly above position in a direction directly above the subject to be photographed.
- the evacuation action includes the following movements depending on the formation flight position; in the case of the unmanned aerial vehicle in the front position, it includes movement in the direction of increasing altitude, movement to the left or right, or movement in the backward direction; in the case of the unmanned aerial vehicle in the rear position, it includes movement in the direction of increasing altitude, movement to the left or right, or movement in the backward direction; in the case of the unmanned aerial vehicle in the right position, it includes movement in the direction of increasing altitude, movement to the right, or movement in the backward direction; in the case of the unmanned aerial vehicle in the left position, it includes movement in the direction of increasing altitude, movement to the left, or movement in the backward direction; and in the case of the unmanned aerial vehicle in the directly above position, it includes movement in the direction of increasing altitude, movement to the left or movement in the backward direction.
- A photographing method using unmanned aerial vehicles, in which a plurality of unmanned aerial vehicles fly following a subject and each of the plurality of unmanned aerial vehicles photographs the subject with a camera to provide a photographed image, the plurality of unmanned aerial vehicles flying in formation flight positions with a predetermined positional relationship to the subject, the photographing method including the steps of: controlling the flight of at least one of the plurality of unmanned aerial vehicles such that at least one parameter value among the relative position, relative distance, absolute position, relative speed, and absolute speed with respect to the subject satisfies a predetermined condition; determining that an unmanned aerial vehicle that does not satisfy the predetermined condition has a formation flight abnormality; and moving the unmanned aerial vehicle with the formation flight abnormality in a direction away from the subject.
- (21) A program for causing a computer to execute processing corresponding to the imaging system of (1) or the imaging method of (20).
- a storage medium/recording medium storing the program.
- a representative embodiment of the present disclosure provides aerial photography technology using the unmanned aerial vehicle, which can ensure safety and improve the quality of the captured footage when tracking and filming athletes. Issues, configurations, effects, etc. other than those described above are described in the description of the embodiment of the invention.
- FIG. 1 shows the overall configuration of a system in the first embodiment.
- FIG. 2 is a diagram showing the configuration of a formation flight position of a formation in the first embodiment.
- FIG. 3 is a plan view from above showing an example of the formation flight positions, leader aircraft, and sub aircraft in the first embodiment.
- FIG. 4 is a side view showing the formation flight positions of the formation in the first embodiment.
- FIG. 5 is a plan view showing the formation tracking control from above in the first embodiment.
- FIG. 6 is a side view showing another example of the formation flight positions of the formation in the first embodiment.
- FIG. 7 is a plan view showing another example of the formation flight positions of the formation in the first embodiment, viewed from above.
- FIG. 8A is an explanatory diagram of the first tracking control of the formation in the first embodiment.
- FIG. 8B is an explanatory diagram of the second tracking control of the formation in the first embodiment.
- FIG. 9 is a plan view showing an example of the configuration of the leader aircraft in a formation in the first embodiment, viewed from above.
- FIG. 10 is a plan view showing additional functions as viewed from above in the first embodiment.
- FIG. 11 is a functional block diagram of an unmanned aerial vehicle (drone) in the first embodiment.
- FIG. 12 is a diagram showing an example of the appearance of an unmanned aerial vehicle (drone) in the first embodiment.
- FIG. 13 is a functional block diagram of the control device in the first embodiment.
- FIG. 14 is a diagram showing an example of the appearance of a control device in the first embodiment.
- FIG. 15 is a functional block diagram of a pre-setting terminal in the first embodiment.
- FIG. 16 is a functional block diagram of the captured image display device in the first embodiment.
- FIG. 17 is a functional block diagram of an athlete terminal in the first embodiment.
- FIG. 18A is a functional block diagram (first half) of a server in the first embodiment.
- FIG. 18B is a functional block diagram (second half) of the server in the first embodiment.
- FIG. 19 is a state transition diagram in the first embodiment.
- FIG. 20 is a flow diagram of a pre-takeoff preparation mode in the first embodiment.
- FIG. 21 is a diagram showing a first example of a standby position in the first embodiment.
- FIG. 22 is a diagram showing a second example of a standby position in the first embodiment.
- FIG. 23 is a diagram showing a third example of a standby position in the first embodiment.
- FIG. 24 is a diagram showing an example of a geofence in the first embodiment.
- FIG. 25 is a diagram showing an example of a screen display of a captured image display device in the first embodiment.
- FIG. 4 is a flow diagram of a shooting standby mode in the first embodiment.
- FIG. 4 is a flow diagram of a tracking shooting mode in the first embodiment.
- FIG. 11 is a flow chart for determining formation flight abnormalities of the leader aircraft in the first embodiment.
- FIG. 11 is a flow diagram for determining abnormalities in the formation flight of a sub-aircraft in embodiment 1.
- FIG. 4 is an explanatory diagram of a communication abnormality in the first embodiment.
- FIG. 11 is an explanatory diagram for determining the relative distance to a competitor in the first embodiment.
- FIG. 11 is an explanatory diagram of proximity determination to a group of athletes in the first embodiment.
- FIG. 4 is an explanatory diagram of remote determination in the first embodiment.
- FIG. 11 is a flow diagram of an abnormality evacuation mode in the first embodiment.
- FIG. 11 is an explanatory diagram of a first example of evacuation in an abnormality evacuation mode in the first embodiment.
- FIG. 11 is an explanatory diagram of a second example of evacuation in the abnormality evacuation mode in the first embodiment.
- FIG. 11 is an explanatory diagram of an escape action in which following is stopped in the first embodiment.
- FIG. 11 is an explanatory diagram of an escape action for maintaining following in the first embodiment.
- FIG. 11 is a diagram showing an example of a retreat action at a front position in the first embodiment.
- FIGS. 11A to 11C are diagrams showing examples of evacuation actions at the left and right positions in the first embodiment.
- FIG. 11 is a diagram showing an example of a retreat action at a rear position in the first embodiment.
- FIG. 13 is a diagram showing an example of a retreat action at a directly above position in the first embodiment.
- FIG. 11 is a flow diagram of a landing mode in the first embodiment.
- FIG. 11 is an explanatory diagram of evacuation target determination in accordance with an abnormality in the first embodiment.
- FIG. 13 is an explanatory diagram of evacuation target determination as a modified example in the first embodiment.
- FIG. 4 is an explanatory diagram showing an example of priority regarding formation flight positions in the first embodiment.
- FIG. 4 is an explanatory diagram showing an example of control using priority in the first embodiment.
- FIG. 13 is an explanatory diagram showing temporary saving and returning as a modified example in the first embodiment.
- FIG. 13 is an explanatory diagram showing a combination of evacuation and camera zoom control as a modified example in the first embodiment.
- In the description, programs, functions, processing units, and the like may be described as the subject of operations, but the hardware subject of these is a processor, or a controller, device, computer, system, or the like that includes the processor.
- The computer executes, through the processor, processing according to a program read into memory, using resources such as the memory and communication interfaces as appropriate, thereby realizing the specified functions, processing units, and the like.
- the processor is composed of semiconductor devices such as a CPU/MPU or GPU, for example. Processing is not limited to software program processing, and can also be implemented using dedicated circuits. Dedicated circuits that can be used include FPGAs, ASICs, CPLDs, etc.
- the program may be pre-installed as data on the target computer, or may be distributed as data from a program source to the target computer.
- The program source may be a program distribution server on a communication network, or a non-transitory computer-readable storage medium, such as a memory card or disk.
- the program may be composed of multiple modules.
- the computer system may be composed of multiple devices.
- the computer system may be composed of a client-server system, a cloud computing system, an IoT system, etc.
- the various data and information are composed of structures such as, for example, tables and lists, but are not limited to these. Expressions such as identification information, identifiers, IDs, names, numbers, etc. are mutually interchangeable.
- the photography system grasps the state regarding the risk of contact with the contestant using sensors, etc.
- the photography system determines that there is a formation flight abnormality when the relative position or relative distance of the unmanned aerial vehicle to the contestant falls outside a predetermined target range, or when the difference between the current absolute position of the unmanned aerial vehicle and the target absolute position exceeds a predetermined value.
- the target range is a range from a lower limit value, which is a predetermined value, to an upper limit value, which is also a predetermined value. Outside the target range means being below the lower limit value or exceeding the upper limit value.
- Formation flight abnormality is a state in which the flight control of the formation is abnormal. In other words, an abnormality occurs when the unmanned aerial vehicle is too close or too far away from the contestant.
- the filming system will determine that there is an abnormality in the formation flight if the relative speed of the unmanned aerial vehicle to the contestant falls outside a predetermined target range of speed, or if the difference between the current absolute speed and the target speed exceeds a predetermined value. In other words, it will determine that there is an abnormality when the response speed or movement speed of the unmanned aerial vehicle is insufficient.
- If the photography system detects an abnormality in the control communication between the contestant and the unmanned aerial vehicle or in the positioning accuracy of the unmanned aerial vehicle, it determines that the unmanned aerial vehicle has a formation flight abnormality. In addition, if the photography system detects an abnormality in the positioning accuracy with which the unmanned aerial vehicle measures the contestant's position, it determines that the unmanned aerial vehicle has a formation flight abnormality.
- the photography system uses sensors and the like to grasp the state regarding the risk of contact between the unmanned aerial vehicles in the formation and other unmanned aerial vehicles. When the relative position or distance between a certain unmanned aerial vehicle and other unmanned aerial vehicles falls outside the target range, the photography system determines that there is an abnormality in the formation flight of the unmanned aerial vehicle. In other words, when a certain unmanned aerial vehicle is too close or too far away from the other unmanned aerial vehicles, it is determined that there is an abnormality.
- the imaging system determines that there is an abnormality in the formation flight if the relative speed of the unmanned aerial vehicle with respect to other unmanned aerial vehicles falls outside a predetermined target range of speed, or if the difference between the current absolute speed and the target speed exceeds a predetermined value.
- If the imaging system detects an abnormality in the control communication between unmanned aerial vehicles or in the positioning accuracy of an unmanned aerial vehicle, it determines that the unmanned aerial vehicle is experiencing a formation flight abnormality.
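- The determination criteria above can be summarised, purely as an illustrative sketch and not as text from the patent, in the following Python fragment; all names such as `RangeLimits` and `check_formation_abnormality`, and all threshold values, are hypothetical.

```python
from dataclasses import dataclass
import math

@dataclass
class RangeLimits:
    lower: float  # lower limit of the target range
    upper: float  # upper limit of the target range

    def contains(self, value: float) -> bool:
        return self.lower <= value <= self.upper

def check_formation_abnormality(
    own_pos, own_vel,                  # this UAV's position (m) and velocity (m/s)
    ref_pos, ref_vel,                  # reference: contestant (leader) or leader UAV (sub)
    target_pos, target_speed,          # absolute control targets for this UAV
    dist_range: RangeLimits,           # allowed relative distance to the reference
    speed_range: RangeLimits,          # allowed relative speed to the reference
    max_pos_error: float,              # allowed |own position - target position|
    max_speed_error: float,            # allowed |own speed - target speed|
    comm_ok: bool, positioning_ok: bool,
) -> bool:
    """Return True if a formation flight abnormality should be declared."""
    rel_dist = math.dist(own_pos, ref_pos)
    rel_speed = math.dist(own_vel, ref_vel)         # magnitude of the relative velocity
    own_speed = math.hypot(*own_vel)

    too_close_or_far = not dist_range.contains(rel_dist)
    bad_rel_speed = not speed_range.contains(rel_speed)
    bad_abs_pos = math.dist(own_pos, target_pos) > max_pos_error
    bad_abs_speed = abs(own_speed - target_speed) > max_speed_error
    link_or_gnss_fault = (not comm_ok) or (not positioning_ok)

    return (too_close_or_far or bad_rel_speed or bad_abs_pos
            or bad_abs_speed or link_or_gnss_fault)
```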
- the filming system obtains information from the athlete's terminal/sensor equipped on the athlete via wireless communication.
- The information from the athlete's terminal/sensor includes the athlete's ID and GNSS positioning information as the athlete's position information.
- the filming system may obtain the athlete's ID and position information using a sensor (such as a distance sensor) equipped on the unmanned aerial vehicle.
- the filming system may obtain the athlete's ID and position information from a device equipped in the competition environment.
- the filming system may estimate the athlete's ID and position from the video captured by the camera of the unmanned aerial vehicle and camera control information (such as the position and orientation of the camera).
- the filming system may obtain information on the athlete's movement direction and speed from an athlete terminal/sensor equipped on the athlete. Alternatively, the filming system may obtain information on the athlete's movement direction and speed using a sensor (such as a distance sensor) equipped on the unmanned aerial vehicle. Alternatively, the filming system may obtain information on the athlete's movement direction and speed from a device equipped in the competition environment. Alternatively, the filming system may estimate the athlete's movement direction and speed from the footage captured by the camera of the unmanned aerial vehicle and camera control information.
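- As one possible example of how movement direction and speed might be derived when only periodic position fixes are available (the patent leaves the concrete method open), the following sketch differentiates two successive fixes in a local metric frame; the function `estimate_motion` and the 0.2 s sample interval are assumptions for illustration.

```python
import math

def estimate_motion(prev_fix, curr_fix, dt: float):
    """
    Estimate the athlete's horizontal movement direction and speed from two
    successive position fixes given in a local (x, y) frame in metres.
    Returns (heading_rad, speed_mps); heading is measured from the +X axis.
    """
    dx = curr_fix[0] - prev_fix[0]
    dy = curr_fix[1] - prev_fix[1]
    speed = math.hypot(dx, dy) / dt          # ground speed in m/s
    heading = math.atan2(dy, dx)             # movement direction in radians
    return heading, speed

# Example: two fixes 0.2 s apart
h, v = estimate_motion((0.0, 0.0), (1.0, 2.0), dt=0.2)
print(f"heading={math.degrees(h):.1f} deg, speed={v:.1f} m/s")
```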
- the image capture system controls the position of the unmanned aerial vehicle so that the relative distance between the athlete and the unmanned aerial vehicle is constant, in other words, so that the athlete is within a target range or close to a target value.
- the image capture system controls camera parameter values such as the camera direction and zoom amount so that the sense of distance and size of the athlete in the captured image is constant or appropriate, depending on the relative distance between the athlete and the unmanned aerial vehicle.
- the imaging system controls the position of the unmanned aerial vehicle (in other words, the formation flight position) so that the relative position of the unmanned aerial vehicle to the position of the athlete is constant.
- the imaging system also controls the direction of the camera of the unmanned aerial vehicle, etc., according to the relative position of the unmanned aerial vehicle.
- the filming system controls the direction of the camera of the unmanned aerial vehicle so that the athlete is captured in an optimal position in the captured image, for example near the center, depending on the relative position of the unmanned aerial vehicle.
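- A minimal sketch of the camera control described above, assuming a gimbal that accepts yaw/pitch commands in a local frame with the Z axis up; the function names `gimbal_angles_to_target` and `zoom_for_constant_size` are illustrative, not from the patent.

```python
import math

def gimbal_angles_to_target(drone_pos, target_pos):
    """
    Compute the gimbal yaw and pitch (degrees) that point the camera's optical
    axis at the target, so the subject appears near the image centre.
    Positions are (x, y, z) in a local frame with z up; yaw is measured from
    the +X axis, pitch is negative when looking down.
    """
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

def zoom_for_constant_size(rel_distance, ref_distance, ref_focal_mm):
    """Scale the focal length with distance so the athlete's apparent size stays constant."""
    return ref_focal_mm * (rel_distance / ref_distance)

# Drone hovering 10 m behind (-Y) and 5 m above the athlete at the origin
print(gimbal_angles_to_target((0, -10, 5), (0, 0, 0)))   # yaw ~90 deg, pitch ~ -26.6 deg
```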
- [Control targets for formation flight] The photography system generates a control target value for each unmanned aerial vehicle in the formation at each point in time according to its formation flight position relative to the athlete. In conventional technology it is difficult to generate such control target values for formation flight, but in the embodiment the photography system generates a control target value for each formation flight position in a relative positional relationship with the position of the athlete, who is the subject of the photography.
- the multiple unmanned aerial vehicles that make up the formation may have unmanned aerial vehicles with different roles.
- the leader vehicle is an aircraft that grasps the position of the athlete and follows it so that its relative position/relative distance to the athlete's position is within a target range.
- the sub vehicle is an aircraft that grasps the position of the leader vehicle and follows it so that its relative position/relative distance to the leader's position is within a target range.
- at least one unmanned aerial vehicle is the leader vehicle, and the vehicles other than the leader vehicle are sub vehicles. All vehicles in the formation may be leader vehicles.
- either the first following control or the second following control described below may be applied.
- the first tracking control controls the formation flight positions of multiple unmanned aerial vehicles in a relative positional relationship to the position of the competitor, and does not need to be aligned with the direction of movement of the competitor.
- At least one leader unmanned aerial vehicle among the multiple unmanned aerial vehicles in the formation acquires the position information of the competitor, and controls the position of the aircraft so that its relative position to the position of the competitor is within a target range.
- the filming system generates a target range for the leader vehicle as a control target.
- a sub-vehicle which is a second unmanned aerial vehicle other than the leader vehicle among the multiple unmanned aerial vehicles in the formation, acquires the position information of the leader vehicle, and controls the position of the aircraft so that its relative position to the position of the leader vehicle is within the target range.
- the filming system generates a target range for the sub-vehicle as a control target.
- each leader vehicle acquires the position information of the competitor, and controls the position of the aircraft so that its relative position to the position of the competitor is within the target range.
- the filming system generates a target range for each leader vehicle as a control target.
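- The first tracking control can be sketched as follows, assuming a local frame whose +Y axis is the course direction; the offsets (corresponding to distances such as d1 and d2 in the embodiment) and the function `first_tracking_targets` are illustrative values and names, not taken from the patent.

```python
def add(p, offset):
    """Element-wise addition of two (x, y, z) tuples."""
    return tuple(a + b for a, b in zip(p, offset))

# Fixed offsets in the spatial coordinate system (first tracking control:
# the formation is NOT rotated to match the competitor's movement direction).
LEADER_OFFSET = (0.0, 8.0, 4.0)        # front position La: ahead along +Y, above the competitor
SUB_OFFSETS = {
    "rear":  (0.0, -16.0, 0.0),        # Lb: behind the leader along -Y
    "right": (6.0, -8.0, 0.0),         # Lc: to the right of the competitor (+X)
    "left":  (-6.0, -8.0, 0.0),        # Ld: to the left of the competitor (-X)
}

def first_tracking_targets(competitor_pos):
    """Target positions for the leader and sub aircraft under the first tracking control."""
    leader_target = add(competitor_pos, LEADER_OFFSET)
    # Each sub aircraft positions itself relative to the leader's position.
    sub_targets = {name: add(leader_target, off) for name, off in SUB_OFFSETS.items()}
    return leader_target, sub_targets

leader, subs = first_tracking_targets((100.0, 50.0, 0.0))
print(leader)          # (100.0, 58.0, 4.0)
print(subs["rear"])    # (100.0, 42.0, 4.0)
```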
- Second tracking control controls the formation flight positions of multiple unmanned aerial vehicles in a relative positional relationship to the position and direction of movement of the competitor, and changes the formation flight position so that it rotates overall to match the direction of movement of the competitor.
- At least one leader unmanned aerial vehicle which is a first unmanned aerial vehicle among the multiple unmanned aerial vehicles in the formation, acquires position information and direction of movement information of the competitor, and controls the position of the aircraft so that its relative position to the position and direction of movement of the competitor is within a target range.
- the filming system generates a target range for the leader vehicle as a control target.
- a sub-vehicle which is a second unmanned aerial vehicle other than the leader vehicle among the multiple unmanned aerial vehicles in the formation, acquires position information and direction of movement information of the leader vehicle, and controls its position so that its relative position to the position and direction of movement of the leader vehicle is within the target range.
- the filming system generates a target range for the sub-vehicle as a control target.
- each leader vehicle acquires position information and direction of movement information of the competitor, and controls the position of the aircraft so that its relative position to the position and direction of movement of the competitor is within the target range.
- the imaging system generates a target range for each leader as a control target.
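- The second tracking control differs in that the formation offsets are expressed relative to the competitor's movement direction, so the whole formation rotates as that direction changes. The sketch below is an illustrative interpretation (the function `second_tracking_target` and its offsets are hypothetical), not the patent's own formulation.

```python
import math

def second_tracking_target(competitor_pos, movement_dir_xy, forward_m, right_m, up_m):
    """
    Target position under the second tracking control: the formation flight
    position is expressed relative to both the competitor's position and
    movement direction, so the formation rotates as the direction changes.

    competitor_pos  : (x, y, z) of the competitor in the local frame
    movement_dir_xy : (dx, dy) horizontal movement direction (need not be unit length)
    forward_m/right_m/up_m : desired offset along / to the right of / above the motion
    """
    norm = math.hypot(*movement_dir_xy)
    fx, fy = movement_dir_xy[0] / norm, movement_dir_xy[1] / norm  # unit "forward"
    rx, ry = fy, -fx                                               # unit "right" (90 deg clockwise)
    x = competitor_pos[0] + forward_m * fx + right_m * rx
    y = competitor_pos[1] + forward_m * fy + right_m * ry
    z = competitor_pos[2] + up_m
    return (x, y, z)

# Competitor at the origin moving along +Y: the "front" drone sits 8 m ahead, 4 m up.
print(second_tracking_target((0, 0, 0), (0, 1), forward_m=8, right_m=0, up_m=4))   # (0, 8, 4)
# Competitor turns to move along +X: the same drone rotates with the formation.
print(second_tracking_target((0, 0, 0), (1, 0), forward_m=8, right_m=0, up_m=4))   # (8, 0, 4)
```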
- If the filming system determines that a sub-aircraft has a formation flight abnormality due to the risk of contact, it designates at least that sub-aircraft as an evacuation target and has it take the evacuation action. In this case, the other aircraft in the formation that do not evacuate continue filming. Alternatively, the filming system may designate the entire formation, including that sub-aircraft, as the evacuation target and have the entire formation take the evacuation action.
- the formation has an escape action according to the formation flight position.
- the escape action is, in its basic concept, a movement in a direction away from the contestant who is the subject of the photograph.
- the escape action may particularly include a movement to increase altitude and ascend.
- the formation flight positions of the multiple unmanned aerial vehicles in the formation include at least two of the following: a forward position in a forward direction relative to the contestant, a rear position in a rearward direction relative to the contestant, a right position in a rightward direction relative to the contestant, a left position in a leftward direction relative to the contestant, and a directly above position in a direction directly above the contestant.
- the evacuation action may include the following movements: For a UAV in the forward position, this may include ascending, moving left or right, or moving backwards. For a UAV in the rear position, this may include ascending, moving left or right, or moving backwards. For a UAV in the right position, this may include ascending, moving right or moving backwards. For a UAV in the left position, this may include ascending, moving left or moving backwards. For a UAV directly above, this may include ascending, moving left or moving backwards.
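- The position-dependent evacuation movements listed above can be encoded, for illustration only, as a lookup table; the table `EVACUATION_MOVES`, the formation-frame convention (+Y forward along the contestant's motion, +X right, +Z up), and the 5 m step are all assumptions, not values given in the patent.

```python
# Candidate evacuation movements per formation flight position, following the
# directions listed above (all values are illustrative).
EVACUATION_MOVES = {
    "front": [("ascend", (0, 0, 1)), ("left", (-1, 0, 0)), ("right", (1, 0, 0)), ("back", (0, -1, 0))],
    "rear":  [("ascend", (0, 0, 1)), ("left", (-1, 0, 0)), ("right", (1, 0, 0)), ("back", (0, -1, 0))],
    "right": [("ascend", (0, 0, 1)), ("right", (1, 0, 0)), ("back", (0, -1, 0))],
    "left":  [("ascend", (0, 0, 1)), ("left", (-1, 0, 0)), ("back", (0, -1, 0))],
    "directly_above": [("ascend", (0, 0, 1)), ("left", (-1, 0, 0)), ("back", (0, -1, 0))],
}

def evacuation_command(position_name: str, step_m: float = 5.0):
    """
    Return a displacement command (metres, formation frame) for the evacuation
    action. The first candidate, increasing altitude, is always chosen here;
    the remaining candidates could be used when the airspace above is blocked.
    """
    name, direction = EVACUATION_MOVES[position_name][0]
    return name, tuple(step_m * c for c in direction)

print(evacuation_command("front"))     # ('ascend', (0.0, 0.0, 5.0))
print(EVACUATION_MOVES["right"][1])    # ('right', (1, 0, 0)) -> move further right, away from the contestant
```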
- the photographing system of the first embodiment of the present disclosure will be described with reference to FIG. 1 and subsequent figures.
- the photographing system of the first embodiment is a system in which, as shown in FIG. 1, FIG. 36, and the like, a plurality of drones (unmanned aerial vehicles) 1 constituting a formation 10 fly so as to follow an athlete P in a tracking photographing mode and photograph the athlete P with a camera.
- the photographing method of the first embodiment is a method having steps executed by the server 30 of the photographing system of the first embodiment and the drone 1.
- the photographing system evacuates the formation 10 in an abnormality evacuation mode.
- the evacuation is, for example, an upward movement that increases the altitude as a movement away from the athlete P.
- the server 30 in Fig. 1 is described as the main subject of the control processing, but in a modified example, the main subject may be each unmanned aerial vehicle 1.
- the server 30 grasps the state of each drone 1 based on communication with each drone 1 in the formation 10, and determines the control processing in consideration of the overall situation of the formation 10. For example, the server 30 determines which drone 1 to evacuate depending on which drone 1 is experiencing formation flight abnormalities, and controls the evacuation action.
- each drone 1 in the formation 10 implements a judgment algorithm for control processing, and while each aircraft communicates via wireless communication, each aircraft makes its own judgment and decides on the control processing.
- a drone 1 grasps the status of other drones 1 based on communication with the other drones 1, and decides on the control processing in consideration of the overall situation of the formation 10.
- the leader drone 1 may decide which drone 1 to evacuate depending on which other drone 1 is experiencing formation flight abnormalities, and control the evacuation action.
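- The choice of which aircraft to evacuate, as described above and in the summary (leader abnormality: the whole formation; sub abnormality: at least that sub, optionally the whole formation), might look like the following sketch; `select_evacuation_targets` and the policy flag are hypothetical names, not from the patent.

```python
def select_evacuation_targets(abnormal_ids, roles, evacuate_whole_on_sub_fault=False):
    """
    Decide which aircraft should perform the evacuation action.

    abnormal_ids : set of drone IDs judged to be in formation flight abnormality
    roles        : dict mapping drone ID -> "leader" or "sub"
    evacuate_whole_on_sub_fault : if True, a sub-aircraft fault also evacuates
                                  the whole formation (the alternative policy above)
    """
    everyone = set(roles)
    if any(roles[i] == "leader" for i in abnormal_ids):
        # A leader fault means the reference for the sub aircraft is unreliable:
        # evacuate the entire formation, including the leader.
        return everyone
    if abnormal_ids:
        return everyone if evacuate_whole_on_sub_fault else set(abnormal_ids)
    return set()

roles = {"1a": "leader", "1b": "sub", "1c": "sub", "1d": "sub", "1e": "sub"}
print(select_evacuation_targets({"1c"}, roles))    # {'1c'} -> the others keep filming
print(select_evacuation_targets({"1a"}, roles))    # whole formation evacuates
```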
- the formation 10 is described as flying automatically based on a preset setting.
- the present invention is not limited to this, and in a modified example, the formation 10 may be flown manually by the pilot U1 operating at least some of the drones 1 in the formation 10.
- the imaging system provides the pilot device 20 of the pilot U1 with support information for manual flight, such as control target values, the state of the drones 1 such as abnormalities, evacuation instructions, and the like.
- the leader aircraft is an aircraft that grasps the relative positional relationship with the athlete P who is the subject of the photography and flies while following the athlete P who is the subject of the photography.
- the sub aircraft is an aircraft that grasps the relative positional relationship with the leader aircraft and flies while following the leader aircraft.
- At least one leader aircraft is provided in the formation 10. All aircraft in the formation 10 may be leader aircraft (FIG. 9). Note that, if it seems strange to have multiple "leader aircraft" in the formation 10, the "leader aircraft" may be renamed to another name, such as main or primary.
- the leader aircraft may also be described as the first unmanned aerial vehicle, the sub aircraft as the second unmanned aerial vehicle, etc.
- FIG. 1 shows an overall system configuration diagram.
- the system of FIG. 1 may be described as a photography system, a drone photography system, a competition photography system, etc.
- The photography system of FIG. 1 involves an athlete P (in other words, a competitor, a player, etc.) who takes part in a competition (in other words, a race, a sport, etc.) in a competition environment 60 (in other words, a venue, a site, etc.).
- the photography system of FIG. 1 is a system including a formation 10 (in other words, a group of aircraft, a group of drones, etc.) of a plurality of unmanned aerial vehicles 1 (in other words, drones 1, etc.) that follow and photograph the athlete P.
- the photography system of FIG. 1 has a computer system 100 that is connected to the formation 10 in the competition environment 60 by communication including wireless communication.
- The computer system 100 includes a pilot device 20, which is a device used by the pilot U1, a server 30 of the operator, a pre-setting terminal 40, and a captured image display device 50, which are connected via a communication network 90.
- Components such as the pilot device 20, the server 30, the pre-setting terminal 40, and the captured image display device 50 are connected to the communication network 90.
- other devices such as a database server and an operator's management terminal may also be connected.
- The pilot device 20 has a control device 21 and a monitor device 22.
- the competition environment 60 is a competition venue or the like, and includes a competition course 61.
- An example of the competition is snowboarding, which is shown diagrammatically in the drawings.
- FIG. 1 shows a state in which an athlete P is on the course 61 during a competition.
- FIG. 1 also shows a state in which a formation 10 of multiple drones 1, five in this example, is flying near the athlete P and photographing the athlete P from the air.
- the formation 10 takes aerial photographs while following the athlete P during the competition.
- the athlete P is the subject of photography, the object to be photographed, the target object, etc.
- the athlete P is wearing an athlete terminal 70, etc.
- the subject of photography includes not only the athlete P, but also objects that move along with the athlete P. Examples of such objects include equipment such as a board, and vehicles such as a bicycle or yacht.
- the photography system in FIG. 1 also uses publicly known satellites 81 and wireless base stations 82.
- Satellite 81 transmits signals via GNSS (Global Navigation Satellite System).
- Wireless base station 82 communicates wirelessly with drone 1 and the like.
- Drone 1, athlete terminal 70, and wireless base station 82 perform positioning and the like based on the signals received from satellite 81.
- Drone 1 determines, through wireless communication with the satellite 81 and the wireless base station 82, information such as the date and time and its own position in space, for example, latitude, longitude, and altitude.
- the means of positioning is not limited to satellite 81, and if a positioning system is provided in the competition environment 60, that positioning system may be used.
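- Since the relative positions and distances used throughout the control are computed between GNSS fixes, one common approach (assumed here for illustration, not specified in the patent) is to convert latitude/longitude/altitude into a local East-North-Up frame with a flat-earth approximation, as sketched below; the function name and sample coordinates are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_378_137.0   # WGS-84 equatorial radius

def geodetic_to_local_enu(lat_deg, lon_deg, alt_m, ref_lat_deg, ref_lon_deg, ref_alt_m):
    """
    Convert a GNSS fix to local East-North-Up coordinates (metres) around a
    reference point, using a flat-earth (equirectangular) approximation that is
    adequate over the few-kilometre extent of a competition course.
    """
    d_lat = math.radians(lat_deg - ref_lat_deg)
    d_lon = math.radians(lon_deg - ref_lon_deg)
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(ref_lat_deg))
    north = d_lat * EARTH_RADIUS_M
    up = alt_m - ref_alt_m
    return east, north, up

# Drone and athlete fixes expressed in the same local frame around the course origin
print(geodetic_to_local_enu(35.6810, 139.7670, 120.0, 35.6800, 139.7660, 100.0))
```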
- the pilot U1 is a person who pilots or monitors the drone 1 by operating the control device 21 and monitor device 22.
- the control device 21 and monitor device 22 communicate with each other.
- the control device 21 pilots the drone 1 based on the operations and settings by the pilot U1. In other words, the control device 21 can instruct the flight of the drone 1.
- the monitor device 22 has a display screen, and displays a monitor screen for piloting the drone 1 on the display screen when piloting the drone 1 based on the video data from the control device 21.
- the monitor image displayed on the monitor device 22 is an image based on the video data etc. transmitted from the drone 1.
- the control device 21 and monitor device 22 communicate wirelessly with the drone 1 on-site, the athlete terminal 70, and the wireless base station 82.
- the drone 1 can also be automatically controlled based on the pre-settings made on the pre-setting terminal 40.
- the pilot U1 manually controls the drone 1 by operating the control device 21 while watching the screen of the monitor device 22.
- the control device 21 and monitor device 22 may also be configured as an integrated device.
- the pre-setting terminal 40 is a device for inputting pre-settings related to the flight of the drone 1.
- the pilot U1 inputs pre-settings to the pre-setting terminal 40.
- the pre-setting terminal 40 works in conjunction with the server 30.
- the input information at the pre-setting terminal 40 is sent to the server 30, where pre-setting processing is performed and the pre-setting information is stored.
- the pre-setting terminal 40 and the server 30 may be integrated into one.
- the pre-setting terminal 40 and the pilot device 20 may be integrated into one.
- the captured image display device 50 receives the image of the contestant P captured by the camera (FIG. 2) of the drone 1 and displays it on the display screen. Specifically, the drone 1 transmits the image data 250 of the image captured by the camera to the pilot device 20 by wireless communication, the pilot device 20 transmits the image data 250 to the server 30 by communication, and the server 30 transmits the processed image data 250 to the captured image display device 50.
- the captured image display device 50 displays the captured image on the display screen based on the received image data 250. For example, a user U2 such as a viewer or a race commentator can view the image on the captured image display device 50.
- the pilot device 20 or the server 30 may perform a predetermined process on the image data 250 from the drone 1 to create distribution image data suitable for viewing. In a modified example, the image data 250 from the drone 1 may be transmitted to the server 30 or the captured image display device 50.
- the server 30 communicates with the drones 1 of the formation 10 via the pilot device 20, and performs processing related to the specified functions characteristic of the embodiment.
- the processing of the specified functions includes controlling the flight of the formation 10 and controlling photography.
- the processing of the specified functions may be performed mainly by the server 30, or mainly by the drones 1, or may be partially performed by the pilot device 20, etc.
- the processing of the specified functions may be realized by sharing the work among the drones 1, the server 30, the pilot device 20, etc.
- The competitions, races, and sports targeted in this embodiment are, in particular, competitions with high speeds and intense movement, competitions held on unpaved courses 61, and especially competitions in which athletes P compete individually.
- Examples of such competitions include snowboarding, skiing, yachting, surfing, car racing, motorbike racing, bicycle racing, and marathons. Following and aerial photography by the formation 10 of drones 1 is effective for these competitions.
- FIG. 2 shows an example of the formation flight position of a formation 10 of multiple drones 1 relative to a contestant P who is the subject of the image.
- FIG. 2 is Example 1.
- the formation 10 has five drones 1, drones 1a, 1b, 1c, 1d, and 1e.
- FIG. 2 also shows the cameras {Ca, Cb, Cc, Cd, Ce} of each drone 1.
- Drone 1a is equipped with a camera Ca
- drone 1b is equipped with a camera Cb
- drone 1c is equipped with a camera Cc
- drone 1d is equipped with a camera Cd
- drone 1e is equipped with a camera Ce.
- contestant P is wearing a contestant terminal 70.
- the movement of the drones 1 in the formation 10 as they fly to follow the athlete P is based on the direction of movement PD1 of the athlete P or the direction of travel of the course 61.
- a course portion 61a is shown, which is a part of the course 61.
- the directions and positions of the drones 1 in the formation 10, such as forward, backward, left and right, are specified in a relative relationship according to the position of the athlete P and the direction of movement PD1.
- the front is in the same direction as the direction of movement PD1
- the rear is in the opposite direction
- left and right and up and down are specified with respect to the direction of movement PD1.
- drone 1a is placed diagonally above and forward of athlete P as the forward position
- drone 1b is placed diagonally above and rearward as the rear position.
- the diagonally above position is a higher position relative to the height of athlete P relative to the surface of course 61.
- Drone 1c is placed diagonally above and to the right of athlete P as the right position
- drone 1d is placed diagonally above and to the left as the left position.
- drone 1e is placed directly above athlete P in the vertical direction.
- The positions of the drones 1 can be restated as follows in terms of the camera shooting directions, taking the positive Y direction (+Y) as the front as the reference.
- For drone 1a at the front position, the shooting direction is diagonally downward and rearward.
- For drone 1b at the rear position, the shooting direction is diagonally downward and forward.
- For drone 1c at the right position, the shooting direction is diagonally downward and to the left.
- For drone 1d at the left position, the shooting direction is diagonally downward and to the right.
- For drone 1e at the directly above position, the shooting direction is directly downward.
- the image captured by camera Ca has an angle of view that captures athlete P from the front in the direction of movement PD1.
- the image captured by camera Cb has an angle of view that captures athlete P from behind.
- the image captured by camera Cc has an angle of view that captures athlete P from the right side.
- the image captured by camera Cd has an angle of view that captures athlete P from the left side.
- the image captured by camera Ce has an angle of view that captures athlete P and others from directly above (Figure 25 described below).
- FIG. 3 is an X-Y plan view of Example 1 of the formation flight position of the formation 10 in FIG. 2, as viewed from above in the vertical direction (+Z).
- the direction of movement PD1 of the athlete P is the Y direction (+Y).
- Drone 1a is placed at a predetermined distance d1 in front of the athlete P's position PP1 in the Y direction
- drone 1b is placed at a predetermined distance d2 behind the athlete P's position PP1 in the Y direction (-Y).
- Drone 1c is placed at a predetermined distance d3 to the right of the athlete P's position PP1 in the X direction (+X), and drone 1d is placed at a predetermined distance d4 to the left of the athlete P's position PP1 in the X direction.
- Drone 1e is placed at a predetermined distance above the athlete P's position PP1 in the Z direction (+Z).
- Each drone 1 in the formation 10 is assigned a number, ID, etc. for identification and management purposes. This can be any information that allows for identification.
- the filming system identifies and manages each drone 1.
- FIG. 4 is a Y-Z plan view of Example 1 of the formation flight position of the formation 10 in FIGS. 2 and 3, as viewed from the right side (+X) in the horizontal direction.
- the movement direction PD1 of the athlete P is slightly diagonally downward in the Y direction (+Y).
- the course portion 61a is a downhill slope, so the movement direction PD1 is diagonally downward to match the descent.
- Drones 1a, 1b, 1c, and 1d are positioned at a height h1 in the height direction (in this example, perpendicular to the road surface) from the position of the road surface of the course portion 61a or the height position of the athlete P.
- Drone 1e is positioned at a higher height h2 (h2>h1) in the height direction from the position of the road surface of the course portion 61a or the height position of the athlete P.
- In FIG. 4, the position of each drone 1 in the formation 10 is selected so that the formation is in a positional relationship parallel to the inclined road surface of the course 61.
- the position of each drone 1 in the formation 10 may be selected so that it is in a horizontal or vertical positional relationship with the athlete P, rather than being parallel to the road surface.
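- Keeping the formation parallel to an inclined road surface, as in FIG. 4, amounts to tilting the formation offsets about the lateral axis by the slope angle; the sketch below is an illustrative geometric interpretation (the function `tilt_offset_to_slope` and the sample values are assumptions, not the patent's definitions).

```python
import math

def tilt_offset_to_slope(offset, slope_deg):
    """
    Rotate a formation offset (x, y, z) about the X (lateral) axis so that the
    formation plane stays parallel to a road surface that descends along +Y at
    `slope_deg`. Forward offsets follow the downhill surface and vertical
    offsets become perpendicular to it; with slope_deg = 0 nothing changes.
    """
    x, y, z = offset
    a = math.radians(slope_deg)
    return (x,
            y * math.cos(a) + z * math.sin(a),
            -y * math.sin(a) + z * math.cos(a))

# A front offset 8 m ahead at height h1 = 4 m, on a 20-degree downhill section:
print(tilt_offset_to_slope((0.0, 8.0, 4.0), 20.0))
# -> roughly (0.0, 8.9, 1.0): ahead along the slope, 4 m above the road surface there
```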
- the direction and position can be specified in ways other than the above example.
- In a spatial coordinate system (X, Y, Z) corresponding to the competition environment 60, the direction of travel of a certain course portion 61a of the snowboarding competition course 61 is the illustrated Y direction (+Y), and the direction of travel of the athlete P is the movement direction PD1.
- the following movement direction AD1 of the formation 10 is also roughly set to the Y direction (+Y).
- the orientation of each camera is roughly maintained as shown in the figure.
- some shaking and the like occurs during flight, so the position and orientation may fluctuate, but a predetermined following control is performed, so the position and orientation are brought closer to a predetermined target state.
- the course 61 in a snowboarding competition has several turns to the left and right in the direction of travel.
- the formation 10 and the cameras may follow each turn roughly (see FIG. 8B below).
- the images captured by each camera are stabilized so that the orientation is roughly the same. This makes it easier to view the competition objectively in the captured images.
- the direction of the formation 10 may be defined based on the spatial coordinate system of the competition environment 60. If the positional relationship of the formation flight positions of the multiple drones 1 in the formation 10 is appropriate, it may not be aligned with the movement direction of the athlete P (see FIG. 8A below). For example, if a snowboarding course 61 has multiple course sections divided according to turns, the orientation of the formation 10 and the camera may be maintained in roughly the same direction in the multiple course sections.
- Alternatively, the direction of the formation 10 and the cameras may be made to track the movement direction of the athlete P in more detail. For example, even within one course portion 61a in FIG. 2, when the movement direction of the athlete P fluctuates from side to side, the direction of the formation 10 and the cameras may be made to track those fluctuations.
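- Tracking the athlete's direction more gradually, with smaller fluctuations, could for example be done by low-pass filtering the heading before it is used to orient the formation; the class `HeadingSmoother` below is a hypothetical sketch (not from the patent) that smooths the heading via its sine and cosine so the +/-180 degree wrap causes no jumps.

```python
import math

class HeadingSmoother:
    """Exponential smoothing of a heading angle, robust to the +/-180 degree wrap."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha          # 0 < alpha <= 1; smaller = smoother, slower tracking
        self._x = None              # running average of cos(heading)
        self._y = None              # running average of sin(heading)

    def update(self, heading_rad: float) -> float:
        cx, sy = math.cos(heading_rad), math.sin(heading_rad)
        if self._x is None:
            self._x, self._y = cx, sy
        else:
            self._x += self.alpha * (cx - self._x)
            self._y += self.alpha * (sy - self._y)
        return math.atan2(self._y, self._x)   # smoothed heading in radians

smoother = HeadingSmoother(alpha=0.3)
for raw in (0.0, 0.5, -0.4, 0.6, 0.1):        # noisy left/right fluctuations (radians)
    print(round(math.degrees(smoother.update(raw)), 1))
```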
- The above-mentioned rules regarding the directions and positions of the drones 1 and the cameras can be selected and set according to the characteristics of the competition and the like, for example by the formation flight position setting unit 316 in FIG. 18A described below.
- Each of the drones 1 in the formation 10 may be assigned a role, such as a leader drone or a sub drone, as described below.
- In this example, the drone 1a at the front position is set as the leader drone, and the other drones 1b, 1c, 1d, and 1e are set as sub drones. That is, in the first example, only one drone in the formation 10 is set as the leader drone. In another example, drone 1e may be set as the leader drone.
- Various settings are possible for the leader drone and the sub drone. For example, see the leader/sub drone setting unit 317 in FIG. 18A described later.
- Drone 1a, the leader drone, controls its own position based on the position and movement direction PD1 of the contestant P.
- Drones 1b, 1c, 1d, and 1e, the sub drones, control their own positions based on the position and movement direction of drone 1a, the leader drone.
- FIG. 5 shows how the leader and sub drones in FIG. 3 grasp their positional relationships and perform tracking control.
- the leader drone 1a understands the position, movement direction PD1, speed, etc. of the athlete P based on positioning and detection, and determines a forward position La as shown in the figure as its own position in a relative relationship to the position of the athlete P, etc.
- the dashed circle indicates the target range for the forward position La.
- Drone 1a determines its own forward position La as a position that is a specified distance d1 ( Figure 3) in the Y direction forward from the position PP1 of the athlete P, in line with the movement direction PD1.
- The sub drone 1b grasps the position, movement direction, and speed of the leader drone 1a based on positioning and detection, and determines its own position relative to the leader drone's position and the like as the rear position Lb shown in the figure.
- Drone 1b determines its own rear position Lb as a position that is a certain distance (d1 + d2 in Figure 3) backward (-Y) from the position of drone 1a.
- Similarly, the sub drone 1c grasps the position, movement direction, and speed of the leader drone 1a based on positioning and detection, and determines its own position relative to the leader drone's position and the like as the right position Lc shown in the figure.
- The same applies to drone 1d at the left position Ld and drone 1e at the directly above position.
- their relative relationships are shown with dashed arrows.
- the multiple drones 1 in the formation 10 can follow the contestant P and photograph him/her while maintaining a controlled, predetermined formation flight position as shown in Figures 2 to 4.
- a positional relationship that ensures sufficient distance between each drone 1 and the contestant P is also necessarily met.
- In this example, the distance from the drone 1a at the front position to the contestant P is about the same as the distance from the drone 1b at the rear position to the contestant P, but this is not restrictive, and the distance between each drone 1 and the contestant P may differ.
- the distance between the drone 1a in the front position and the contestant P may be the closest. Since it is desirable for the leader drone 1 to be able to grasp and follow the position of the contestant P with high accuracy, the position of the leader drone 1 may be the closest to the contestant P.
- FIG. 6 shows an example of the position of the formation 10 in a modified example of FIG. 4.
- FIG. 6 is a second example.
- the drones 1c and 1d on the left and right may be placed at a position of height h3, in other words, at the same altitude, relative to the height position of the athlete P, as shown in the figure.
- the drone 1e directly above is not provided, and the formation 10 is composed of four drones 1 (1a to 1d).
- the drone 1b in the rear position may also be placed at a position of height h3 relative to the height position of the athlete P, since it is unlikely to interfere with the athlete P.
- FIG. 7 shows an example of the position of the drone 1 of the formation 10 in another modified example.
- FIG. 7 is a third example.
- the drone 1f is positioned diagonally forward to the right
- the drone 1g is positioned diagonally forward to the left
- the drone 1h is positioned diagonally backward to the right
- the drone 1i is positioned diagonally backward to the left, in a plane corresponding to the road surface, based on the position of the athlete P.
- The drone 1f captures, with the camera Cf, an image with an angle of view that captures the athlete P from a right diagonal direction.
- the drone 1 may be placed in an oblique position in a plane corresponding to the road surface, avoiding the front, rear, left and right positions as seen from the athlete P in this way.
- Other possible formation flight positions include ones in which multiple drones fly in the same direction relative to the athlete but at different altitudes.
- [Tracking control] Formation flight control in the tracking photography mode described below, in other words tracking control, can be selectively applied in one of two ways: control based only on the relative positional relationship with respect to the position of the athlete P, or control based on the relative positional relationship with respect to the direction of movement of the athlete P in addition to the position of the athlete P, as described below.
- [First tracking control] FIG. 8A shows a case where the first tracking control is performed based only on the relative positional relationship with respect to the position of the athlete P.
- the first tracking control in Fig. 8A controls the formation flight positions of the multiple drones 1 in the formation 10 based on the relative positional relationship with respect to the position of the athlete P, without tracking the movement direction of the athlete P.
- The leader drone 1a determines its own front position La, maintaining a predetermined relative distance in a predetermined direction, for example the +Y direction, from the position PPa of the athlete P.
- the other sub drones 1b, 1c, and 1d determine their own positions in a predetermined positional relationship with respect to the position La of the leader drone 1a.
- drone 1b determines its rear position Lb, maintaining a predetermined relative distance in the -Y direction.
- Drone 1c determines its right position Lc, maintaining a predetermined relative distance in the +X direction.
- Drone 1d determines its left position Ld, maintaining a predetermined relative distance in the -X direction.
- the four drones follow the position of the contestant P in a formation flight position with a predetermined positional relationship, for example +Y, -Y, +X, -X, that is, a front-back, left-right positional relationship.
- the camera orientation is roughly maintained constant within the spatial coordinate system.
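- The first tracking control described above can be summarized as keeping each drone at a fixed world-frame offset from the athlete's position. The following minimal Python sketch illustrates this; the offsets, drone labels, and helper names are illustrative assumptions, not values from this disclosure.

```python
# Sketch of the first tracking control: each drone keeps a fixed offset from
# the athlete's position, expressed in the world (X, Y, Z) frame; the athlete's
# movement direction is intentionally ignored. Offsets are assumed values.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

# Assumed fixed offsets (m) relative to the athlete: front (+Y), rear (-Y),
# right (+X), left (-X), each at an assumed height.
FORMATION_OFFSETS = {
    "1a_front": Vec3(0.0, +8.0, 3.0),
    "1b_rear":  Vec3(0.0, -8.0, 3.0),
    "1c_right": Vec3(+6.0, 0.0, 3.0),
    "1d_left":  Vec3(-6.0, 0.0, 3.0),
}

def first_tracking_targets(athlete_pos: Vec3) -> dict:
    """Target position of each drone: athlete position plus a fixed world-frame offset."""
    return {
        drone: Vec3(athlete_pos.x + off.x,
                    athlete_pos.y + off.y,
                    athlete_pos.z + off.z)
        for drone, off in FORMATION_OFFSETS.items()
    }
```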
- [Second tracking control] FIG. 8B shows a case where the second tracking control is performed in accordance with the position and moving direction of the athlete P.
- the imaging system also changes the position of each aircraft in the formation 10 to match the change in the moving direction.
- the leader aircraft controls its own forward position La and the orientation of its camera to achieve a specified relative positional relationship, for example, to match the position PPa and direction of movement PDa of athlete P.
- Each sub-aircraft in the formation 10 controls its own position and the orientation of its camera to achieve a specified relative positional relationship, to match the position and direction of movement of the leader aircraft.
- the drone 1a is positioned at a forward position La that matches the movement direction PDb of the athlete P.
- the camera is oriented to capture athlete P at a specified angle of view.
- the position of the formation 10 does not have to track changes in the movement direction of the subject perfectly. If the movement direction of the subject fluctuates greatly, the direction of the formation 10 may track it more gradually, with smaller fluctuations. Also, when there is a large change in the movement direction of the subject (for example, a 180-degree turn), moving each aircraft of the formation 10 in a straight line to its target position may cause its movement path to cross that of another aircraft and result in a collision. Therefore, when the movement direction changes by more than a predetermined value in this way, each aircraft can be moved along an arc centered on the athlete so that its movement path does not cross those of the other aircraft.
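- The second tracking control and the arc-shaped repositioning described above can be sketched as follows: offsets expressed in the athlete's own frame are rotated by the movement direction, and an arc centered on the athlete is used when the heading change is large. Function names, step counts, and thresholds are illustrative assumptions.

```python
# Sketch of the second tracking control: formation offsets are given in the
# athlete's frame (forward/right) and rotate with the movement direction PD.
# For large heading changes, each drone is moved along an arc centred on the
# athlete instead of a straight line, so that paths do not cross.

import math

def rotate_offset(forward, right, heading_rad):
    """Convert an offset given in the athlete frame (forward, right)
    into a world-frame (x, y) offset for the given movement direction."""
    fx, fy = math.cos(heading_rad), math.sin(heading_rad)   # forward unit vector
    rx, ry = math.sin(heading_rad), -math.cos(heading_rad)  # right unit vector
    return forward * fx + right * rx, forward * fy + right * ry

def arc_waypoints(center, start, end, steps=10):
    """Move from start to end along an arc centred on the athlete (center);
    intended for heading changes above an assumed threshold (e.g. 90 degrees)."""
    sx, sy = start[0] - center[0], start[1] - center[1]
    ex, ey = end[0] - center[0], end[1] - center[1]
    r0, r1 = math.hypot(sx, sy), math.hypot(ex, ey)
    a0, a1 = math.atan2(sy, sx), math.atan2(ey, ex)
    da = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi  # shorter angular direction
    pts = []
    for i in range(1, steps + 1):
        t = i / steps
        r = r0 + (r1 - r0) * t
        a = a0 + da * t
        pts.append((center[0] + r * math.cos(a), center[1] + r * math.sin(a)))
    return pts
```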
- the filming system generates a control target value for controlling the formation flight to follow the athlete P at each timing.
- Each aircraft 1 controls its flight so as to approach that control target value.
- the control target value may be a target range for the position of the drone 1 as shown in Figure 5 above.
- FIG. 9 is another example of the setting of the leader and sub drones in the formation 10 as shown in FIG. 2.
- all five drones 1 (1a to 1e) in the formation 10 are set as leader drones, and there are no sub drones.
- Each of these drones 1 grasps the position, moving direction, speed, etc. of the contestant P based on positioning and detection, and determines its own position in a relative relationship according to the position of the contestant P.
- the multiple drones 1 in the formation 10 can follow the contestant P while maintaining a controlled predetermined formation flight position, as in FIG. 2, and photograph the contestant P.
- the positional relationship with a sufficient distance between each drone 1 is necessarily satisfied.
- [Additional function: judgment by sub-drone] FIG. 10 shows an example of a case where each drone 1 in the formation 10, even a sub-drone, has an additional function of checking the relative distance, etc., to ensure safety with respect to the other drones 1 and the contestant P.
- the sub-drone may check the relative distance and relative speed between the contestant P and other sub-drones, and if it is outside the target range, may determine that there is an abnormality in the formation flight.
- each drone 1 checks to ensure that the distance between itself and contestant P is greater than a specified distance, and also checks to ensure that the distance between other drones is greater than a specified distance.
- the explanation focuses on the drone 1c in the right position.
- the drone 1c which is the sub-drone, determines its own right position relative to the position of the drone 1a in the front position, which is the leader drone.
- the drone 1c controls itself so that the distance dca between it and the drone 1a is equal to or greater than a predetermined distance (the lower limit of the target range).
- the drone 1c also checks whether the distance dcp between it and the contestant P is equal to or greater than a predetermined distance.
- the drone 1c also checks whether the distance dcb between it and the drone 1b in the rear position is equal to or greater than a predetermined distance.
- if a drone 1 is less than a specified distance from the athlete P or from another drone 1, it determines that the formation flight is abnormal due to the risk of collision.
- Each of the specified distances mentioned above can be set as a control setting.
- each drone 1 may determine that formation flight is abnormal if the relative speed between the athlete P and other drones 1 does not meet specified conditions.
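- A minimal sketch of the per-drone safety check of Figure 10 follows; the distance and relative-speed thresholds are illustrative assumptions, not values defined in this disclosure.

```python
# Sketch of the per-drone safety check: every drone, including sub-drones,
# verifies that its distance to the athlete and to the other drones stays
# above configured lower limits, and flags a formation flight abnormality
# otherwise. Threshold values and names are assumed for illustration.

import math

MIN_DIST_TO_ATHLETE_M = 5.0   # assumed lower limit for dcp
MIN_DIST_TO_DRONE_M = 3.0     # assumed lower limit for dca, dcb, ...
MAX_CLOSING_SPEED_MPS = 4.0   # assumed relative-speed condition

def formation_abnormal(own_pos, athlete_pos, other_drone_positions, closing_speeds=()):
    """Return True if this drone should judge the formation flight abnormal.
    Positions are (x, y, z) tuples; closing_speeds are relative speeds toward objects."""
    if math.dist(own_pos, athlete_pos) < MIN_DIST_TO_ATHLETE_M:
        return True
    if any(math.dist(own_pos, p) < MIN_DIST_TO_DRONE_M for p in other_drone_positions):
        return True
    # optional relative-speed condition: closing too fast on any object
    return any(v > MAX_CLOSING_SPEED_MPS for v in closing_speeds)
```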
- Fig. 11 shows an example of a functional block configuration of the unmanned aerial vehicle 1, which is the drone 1 in Fig. 1.
- the unmanned aerial vehicle 1 has an image capture unit 110, a flight function unit 120, an obstacle detection unit 130, a flight state measurement unit 140, a reception state detection unit 160, and a communication unit 170.
- the unmanned aerial vehicle 100 also includes a processor 1001, a memory 1002, a battery 1003, and the like. These components are interconnected via an architecture such as a bus, and are capable of communication, input/output, and the like.
- the processor 1001 reads and writes data and information to the memory 1002, and realizes a predetermined function through program processing.
- the function is not limited to software program processing, but can also be implemented using hardware circuits such as FPGAs.
- each unit such as the image capture unit 110 is illustrated separately from the processor 1001, it may be realized using processing by the processor 1001.
- the photographing unit 110 has a photographing camera (camera) 111, a camera holding unit 112, a camera control unit 113, a holding control unit 114, a camera state acquisition unit 115, a camera abnormality detection unit 116, and an image processing unit 117.
- the photographing camera (camera) 111 corresponds to, for example, camera Ca in FIG. 2.
- the camera holding unit 112 is a drive mechanism, a so-called gimbal, that stably holds the camera 111 in a movable manner.
- the camera state acquisition unit 115 acquires parameter values such as the zoom amount and camera direction of the camera 111.
- the camera abnormality detection unit 116 detects a state in which the camera 111 is not able to capture images normally, such as a cloudy lens, as an abnormality.
- the image processing unit 117 detects the subject of the image captured by the camera 111, and adjusts parameter values such as the zoom amount of the camera 111 so that the size and distance of the subject of the image within the angle of view of the image are as constant as possible.
- the image processing unit 117 also detects the subject of the image captured by the camera 111, and adjusts parameter values such as the camera direction so that the subject of the image is as close to the center as possible within the angle of view of the image.
- the image processing unit 117 may also detect the subject of the image captured by the camera 111, and estimate the position, speed, and movement direction of the subject of the image in space.
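- The framing adjustment performed by the image processing unit 117 can be sketched as a simple proportional correction of zoom and camera direction from the detected subject bounding box; the gains and the bounding-box format below are illustrative assumptions.

```python
# Sketch of keeping the subject at a roughly constant size and near the image
# centre: derive a zoom correction from the subject height ratio and pan/tilt
# corrections from the offset of the subject centre. Gains are assumed values.

def framing_corrections(bbox, image_w, image_h,
                        target_height_ratio=0.4, k_zoom=1.0, k_pan=0.002, k_tilt=0.002):
    """bbox = (x_min, y_min, x_max, y_max) in pixels.
    Returns (zoom_delta, pan_delta, tilt_delta) as small correction terms."""
    x_min, y_min, x_max, y_max = bbox
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    subject_h = max(y_max - y_min, 1)

    # zoom in if the subject appears smaller than the target ratio, zoom out if larger
    zoom_delta = k_zoom * (target_height_ratio - subject_h / image_h)

    # pan/tilt toward the subject so that it approaches the image centre
    pan_delta = k_pan * (cx - image_w / 2)
    tilt_delta = k_tilt * (cy - image_h / 2)
    return zoom_delta, pan_delta, tilt_delta
```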
- the flight function unit 120 has a motor 121, a propeller 122, a flight control unit 123, and a flight abnormality detection unit 124.
- the motor 121 drives the propeller 122.
- the flight control unit 123 controls the motor 121 based on control commands from the pilot device 21. This controls flight using the propeller 122.
- the flight abnormality detection unit 124 detects abnormalities in the flight function from the behavior of the motor 121, etc.
- the obstacle detection unit 130 has an obstacle detection sensor 131 and a detection processing unit 132.
- the obstacle detection sensor 131 is a type of sensor for detecting obstacles for the drone 1, and can be realized, for example, by an optical camera, a ToF sensor (ToF: Time of Flight), LiDAR (Light Detection and Ranging), or an IR sensor (IR: Infrared).
- the detection processing unit 132 detects the athlete P, other drones 1, and other obstacles based on sensing information from the obstacle detection sensor 131.
- This photography system may calculate and detect the positions and movement directions of a certain unmanned aerial vehicle (also referred to as the vehicle itself) based on information on the position and movement direction of the athlete P or other vehicles, and on distance information between the vehicle itself and the athlete P or other vehicles detected based on the obstacle detection sensor 131, etc.
- the flight condition measurement unit 140 has an aircraft position measurement unit 141, an aircraft orientation measurement unit 142, and an aircraft speed measurement unit 143.
- the flight condition measurement unit 140 measures the flight condition of the aircraft.
- the aircraft position measurement unit 141 measures the position of the aircraft in space based on signals from satellites 81 and communications with wireless base stations 82.
- the aircraft orientation measurement unit 142 measures the orientation of the aircraft.
- the aircraft speed measurement unit 143 measures the speed of the aircraft.
- the reception status detection unit 160 has a control communication anomaly detection unit 161 and a positioning status anomaly detection unit 162.
- the control communication anomaly detection unit 161 detects anomalies in control communication. If an abnormality occurs in the control communication of a leader aircraft or a sub aircraft in the formation 10, the control communication anomaly detection unit 161 determines that there is an abnormality in the formation flight control of the leader aircraft or the sub aircraft. An abnormality in control communication occurs when communication is interrupted or delayed, etc.
- the positioning status abnormality detection unit 162 detects abnormalities in the positioning of the unmanned aerial vehicle 1. If the positioning accuracy of the position of the leader or sub aircraft in the formation 10 deteriorates, the positioning status abnormality detection unit 162 determines that the formation flight control of the leader or sub aircraft is abnormal. The positioning status abnormality detection unit 162 also detects abnormalities in the positioning of the athlete P who is the subject of the photograph. If the positioning accuracy of the position of the athlete P deteriorates, the positioning status abnormality detection unit 162 determines that the formation flight control of the leader aircraft is abnormal. Examples of cases where the positioning accuracy has deteriorated include weak GNSS signal strength from satellite 81, multipath of the satellite signal, etc.
- the communication unit 170 is a part that implements various communication interfaces, such as a communication interface with the flight control device 21, a communication interface with other aircraft 100, and a communication interface with the athlete terminal 70.
- [Appearance of the drone] FIG. 12 shows an example of the external configuration of the drone 1.
- a camera 111 is held via a camera holding part 112 on the underside of a main body 1000.
- An obstacle detection sensor 131 is installed on the front side of the main body 1000.
- a plurality of motors 121 and propellers 122 are provided on the side of the main body 1000.
- the obstacle detection sensor 131 can be realized, for example, by an optical camera, a ToF sensor, LiDAR, or an IR sensor.
- the obstacle detection sensor 131 may also be configured as a stereo camera with two optical cameras that are mounted facing forward of the drone 1's body.
- the obstacle detection sensor 131 may not only be a sensor that detects the forward direction mounted on the front surface of the main body 1000, but also a sensor that can detect directions other than the forward direction, for example, may be mounted on all six surfaces of the main body 1000.
- the propeller 122 and the motor 121 constitute a thrust generating device.
- the thrust generating devices may be provided as four thrust generating devices at four locations on the aircraft as in the illustrated example, or multiple thrust generating devices may be provided at multiple locations.
- [Pilot device] the control device 21, which serves as the pilot device, includes a display unit 210, an input unit 220, a communication unit 230, and the like. The input unit 220 has an aircraft position operation unit 221, an aircraft attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight command input unit 225, and a power supply input unit 226.
- the abnormality status display unit 211 displays the results of the determination made by this photography system regarding abnormalities in the formation flight control and abnormalities in the reception status.
- the flight status display unit 212 displays the flight status of multiple drones 1 in the formation 10.
- the flight status displayed may include flight mode, or the relative distance and relative speed to the athlete P, or the relative distance and relative speed between each drone 1.
- Flight modes may include the tracking shooting mode in FIG. 19 described below, and the emergency evacuation mode.
- the athlete information display unit 213 displays the captured video of the athlete P and athlete information estimated from the unmanned aerial vehicle control information. Examples of athlete information include speed and distance traveled.
- the flight command input unit 225 has the following functions.
- the flight command input unit 225 can switch between manual and automatic control.
- the flight command input unit 225 can also input a command to switch from the follow-up photography mode in FIG. 19 to the landing mode.
- the follow-up photography mode is a mode in which the formation 10 follows and photographs the contestant P.
- the landing mode is a mode in which the formation 10 lands. In the landing mode, the flight command input unit 225 can also specify landing operations such as landing permission, hovering standby, soft landing on the spot, and propeller stop.
- the communication unit 230 is connected to the unmanned aerial vehicle 1, the athlete terminal 70 (the athlete P side device), the server 30, the pre-setting terminal 40, and the captured image display device 50 via communication lines, and is equipped with communication interfaces corresponding to them.
- the control device 21 is connected to the monitor device 22 via the communication unit 230.
- the monitor device 22 is equipped with functions such as a display unit 240 for displaying a monitor screen based on video data 250 from the control device 21.
- FIG. 14 shows an example of the external configuration of the control device 21.
- a display as a display unit 210 is connected to the main body 2100, and control information is displayed on the display.
- the main body 2100 is provided with a right stick 2101, a left stick 2102, a right lever 2103, a left lever 2104, and the like.
- the right operation of the right stick 2101 corresponds to the right movement of the drone 1
- the left operation corresponds to the left movement of the drone 1
- the up operation corresponds to the upward movement of the drone 1
- the down operation corresponds to the downward movement of the drone 1.
- the right operation of the left stick 2102 corresponds to the right turn of the drone 1
- the left operation corresponds to the left turn of the drone 1
- the up operation corresponds to the forward movement of the drone 1
- the down operation corresponds to the backward movement of the drone 1.
- the right lever 2103 corresponds to, for example, panning the camera
- the left lever 2104 corresponds to, for example, zooming the camera.
- the control device 21 transmits a signal corresponding to the operation to the drone 1.
- the drone 1 controls its operation according to the received signal.
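- The stick and lever assignments described above can be summarized as a simple mapping from operations to drone commands; the command names below are illustrative assumptions rather than an actual drone API.

```python
# Sketch of the stick/lever mapping of the control device 21 described above.

STICK_MAP = {
    ("right_stick", "right"): "move_right",
    ("right_stick", "left"):  "move_left",
    ("right_stick", "up"):    "ascend",
    ("right_stick", "down"):  "descend",
    ("left_stick", "right"):  "turn_right",
    ("left_stick", "left"):   "turn_left",
    ("left_stick", "up"):     "move_forward",
    ("left_stick", "down"):   "move_backward",
    ("right_lever", None):    "camera_pan",
    ("left_lever", None):     "camera_zoom",
}

def operation_to_command(control, direction=None):
    """Translate a control operation into the signal sent to the drone 1.
    e.g. operation_to_command("right_stick", "up") -> "ascend"."""
    return STICK_MAP[(control, direction)]
```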
- [Pre-setting terminal] FIG. 15 shows an example of a functional block configuration of the pre-setting terminal 40.
- the pre-setting terminal 40 includes a display unit 410, an input unit 420, a communication unit 430, etc.
- the pre-setting terminal 40 also includes a processor 4001, a memory 4002, a battery 4003, etc.
- the display unit 410 has a setting information display unit 411 and an error display unit 412.
- the setting information display unit 411 displays information input by each input unit of the input unit 420. If the input formation flight position or standby position (described later) is set to a position that deviates from the safety range set for each sport being photographed, the error display unit 412 determines that a position setting error has occurred and displays the details of the error.
- the input section 420 has a subject competition information input section 421, a subject athlete information input section 422, a course/geofence input section 423, a standby position input section 424, a takeoff position/landing position input section 425, a formation flight position input section 426, a leader/sub aircraft input section 427, an evacuation input section 428, a priority input section 429, etc.
- the subject competition information input unit 421 inputs information about the competition to be photographed.
- the subject athlete information input unit 422 inputs information about the athlete P to be photographed.
- the course/geofence input unit 423 inputs the competition course 61 and the corresponding geofence.
- the geofence is information for restricting the space in which the unmanned aerial vehicle 1 can fly (Figure 24).
- the standby position input unit 424 inputs the standby position described below ( Figure 21, etc.).
- the takeoff position/landing position input unit 425 inputs the takeoff position and landing position described below.
- the formation flight position input unit 426 inputs flight positions and conditions of drones 1 in formation 10 as shown in the example of FIG. 2.
- the input information in other words setting information, includes information such as the target position and distance in each axial direction relative to the position of athlete P, and an allowable range (e.g. upper and lower limits) centered on the target position.
- the axial directions here refer to the forward/backward axes, left/right axes, and up/down axes as relative directions and relative coordinate systems.
- for example, it is necessary to ensure a distance of d1 or more in the forward direction from the position PP1 of athlete P, and a height of h1 or more in the vertical direction from the position of athlete P.
- the conditions may include judgment conditions for determining the risk of contact between drones 1 or between drone 1 and athlete P.
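- The setting information handled by the formation flight position input unit 426 can be sketched as the following data structure, with a target and allowable range per axis plus contact-risk thresholds; the field names and example values are illustrative assumptions.

```python
# Sketch of formation flight position setting information: a target offset per
# axis relative to the athlete, an allowable range around it, and contact-risk
# thresholds. All names and numbers are assumed for illustration.

from dataclasses import dataclass

@dataclass
class AxisSetting:
    target: float   # target distance on this axis (m)
    lower: float    # lower limit of the allowable range (m)
    upper: float    # upper limit of the allowable range (m)

@dataclass
class FormationPositionSetting:
    forward: AxisSetting      # forward/backward axis relative to the athlete
    right: AxisSetting        # left/right axis
    up: AxisSetting           # up/down axis
    min_dist_athlete: float   # contact-risk threshold to the athlete (m)
    min_dist_drone: float     # contact-risk threshold between drones (m)

# e.g. a front position: at least d1 ahead of and h1 above the athlete
front_position = FormationPositionSetting(
    forward=AxisSetting(target=8.0, lower=5.0, upper=12.0),
    right=AxisSetting(target=0.0, lower=-1.0, upper=1.0),
    up=AxisSetting(target=3.0, lower=2.0, upper=5.0),
    min_dist_athlete=5.0,
    min_dist_drone=3.0,
)
```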
- the leader/sub aircraft input unit 427 inputs the settings of the leader aircraft and sub aircraft as in the example of Figure 3. At least one aircraft in the formation 10 must be set as a leader aircraft. It is not permitted to set all aircraft in the formation 10 as sub aircraft. All aircraft in the formation 10 may be set as leader aircraft as in the example of Figure 9.
- the priority input unit 429 inputs priority settings, which will be described later. Priorities can be set for formation flight positions. Evacuation control using priorities is possible.
- the communication unit 430 is connected to the operator device 20, server 30, captured image display device 50, etc. via communication lines.
- [Captured image display device] the communication unit 530 of the captured image display device 50 is connected to the operator device 20, server 30, setting terminal 40, etc. via communication lines.
- the captured image display device 50 is a device intended for general viewers and race commentators as user U2.
- user U2 can view the camera image that he or she wants to view by displaying it enlarged as the main camera image ( Figure 25).
- the athlete terminal 70 comprises a status acquisition unit 710, a photographing unit 720, an abnormality detection unit 730, a communication unit 740, etc.
- the athlete terminal 70 also comprises a processor 7001, a memory 7002, a battery 7003, etc.
- the athlete terminal 70 is installed on the athlete P or on the vehicle or equipment used by the athlete P.
- the athlete terminal 70 may be installed on the athlete P's clothes, hat, helmet, board, bicycle, yacht, etc.
- the status acquisition unit 710 is realized by various sensors, etc.
- the status acquisition unit 710 has a position measurement unit 711, a speed measurement unit 712, a movement direction measurement unit 713, etc.
- the position measurement unit 711 measures the position of the athlete P using GNSS/GPS, etc.
- the speed measurement unit 712 measures the speed of the athlete P.
- the movement direction measurement unit 713 measures the movement direction of the athlete P (for example, the movement direction PD1 in Figure 2).
- an inertial measurement unit (IMU) that can measure angular velocity and acceleration with high accuracy and a transponder can be used to acquire the current position, speed, movement direction, etc.
- in the case of a transponder, a transmitter and a reader are installed on the ground side and on the athlete P side, respectively, with the ground-side equipment installed at predetermined intervals along the course 61. This makes it possible to detect the position, speed, movement direction, etc. of the athlete P.
- the photographing unit 720 has an on-board camera 721.
- the on-board camera 721 photographs images from the viewpoint of the athlete P looking at the surroundings (e.g., the direction of travel). There may be cases where the photographing unit 720 is not provided.
- the abnormality detection unit 730 has a positioning status abnormality detection unit 731.
- the positioning status abnormality detection unit 731 detects abnormalities related to the status acquisition by the status acquisition unit 710, such as abnormalities in positioning by the position measurement unit 711.
- the athlete terminal 70 may have a function to detect surrounding drones 1 in the formation 10 if it is desired to further increase safety. For example, the athlete terminal 70 may determine that an abnormality has occurred when the distance to the surrounding drone 1 falls below a predetermined distance. By providing the results of this determination and detection to the drone 1 and server 30, it is possible to control measures to reduce the risk of collision.
- the athlete terminal 70 is not essential, but if provided, more advanced control becomes possible.
- [Server] FIGS. 18A and 18B show an example of a functional block configuration of the server 30.
- Fig. 18A shows the first half
- Fig. 18B shows the second half.
- the server 30 includes a pre-takeoff setting unit 310, an aircraft state information acquisition unit 320, a standby abnormality determination unit 330, a competition information acquisition unit 340, an athlete information acquisition unit 350, a control target command unit 360, an abnormality determination unit 370, an evacuation action determination unit 380, a distribution video control unit 390, a communication unit 395, and the like.
- the server 30 also includes a processor 3001, a memory 3002, a battery 3003, and the like.
- the pre-takeoff setting unit 310 is a part that performs settings on the server 30 side that correspond to the pre-settings on the pre-setting terminal 40, and this setting is performed based on the input information of the pre-setting terminal 40.
- the pre-takeoff setting unit 310 acquires information input from each input unit of the input unit 420 of the pre-setting terminal 40, and sets each condition, etc.
- the pre-takeoff setting unit 310 includes a shooting target competition setting unit 311, a shooting target athlete setting unit 312, a course/geofence setting unit 313, a standby position setting unit 314, a takeoff position/landing position setting unit 315, a formation flight position setting unit 316, a leader/sub aircraft setting unit 317, an evacuation setting unit 318, a priority setting unit 319, a setting error determination unit 310A, etc.
- the subject sport setting unit 311 sets the subject sport to be photographed in response to the input information of the subject sport information input unit 421 in FIG. 15.
- the subject athlete setting unit 312 sets the subject athlete to be photographed in response to the input information of the subject athlete information input unit 422.
- the course/geofence setting unit 313 sets the course and geofence in response to the input information of the course/geofence input unit 423.
- the standby position setting unit 314 sets the standby position in response to the input information of the standby position input unit 424.
- the takeoff position/landing position setting unit 315 sets the takeoff position and landing position in response to the input information of the takeoff position/landing position input unit 425.
- the formation flight position setting unit 316 sets the formation flight position in response to the input information of the formation flight position input unit 426.
- the leader/sub aircraft setting unit 317 sets the leader aircraft and sub aircraft in response to the input information of the leader/sub aircraft input unit 427.
- the evacuation setting unit 318 sets the evacuation action in response to the input information in the evacuation input unit 428.
- the priority setting unit 319 sets the priority in response to the input information in the priority input unit 429.
- the setting error determination unit 310A determines that a position setting error has occurred when the formation flight position, standby position, etc. input from the pre-setting terminal 40 is set to a position that deviates from the safety range set for each sport being photographed, and causes the error display unit 412 of the pre-setting terminal 40 to display the details of the error.
- the aircraft status information acquisition unit 320 is a part that acquires the status of the aircraft 1 in the formation 10.
- the aircraft status information acquisition unit 320 has an aircraft flight status acquisition unit 321, a reception status acquisition unit 322, an obstacle information acquisition unit 323, etc.
- the aircraft flight status acquisition unit 321 acquires each piece of measurement information from the drone 1 by the flight status measurement unit 140 ( Figure 11) of the drone 1. Examples of each piece of measurement information include position, direction, and speed.
- the obstacle information acquisition unit 323 acquires from the drone 1 obstacle detection information by the obstacle detection unit 130 of the drone 1.
- the obstacle detection information includes the contestant P, other aircraft, other obstacles, etc.
- An obstacle is any object that is an obstacle to the flight of the drone 1.
- the reception status acquisition unit 322 acquires the reception status of the drone 1's control signal and positioning signal from the drone 1's reception status detection unit 160.
- the standby abnormality determination unit 330 determines that an abnormality exists when abnormality information is acquired from the drone 1 side.
- the standby abnormality determination unit 330 is used particularly for abnormality determination in the standby state.
- the standby abnormality determination unit 330 acquires abnormality information from the flight function unit 120 or flight abnormality detection unit 124, camera abnormality detection unit 116, control communication abnormality detection unit 161, and positioning status abnormality detection unit 162 as functional units on the drone 1 side ( Figure 11).
- the competition information acquisition unit 340 acquires competition/race information, such as the start/end of a competition/race, from an external device.
- the athlete information acquisition unit 350 acquires information about the athlete P.
- the athlete information acquisition unit 350 has a movement state acquisition unit 351 and a movement prediction unit 352.
- the movement state acquisition unit 351 acquires information such as the current position, speed, and movement direction of the athlete P from the athlete terminal 70. Instead of acquiring information from the athlete terminal 70, the movement state acquisition unit 351 may estimate the position, speed, movement direction, etc. of the athlete P from information such as the image captured by the drone 1, the position of the drone 1, and the camera direction.
- the control target command unit 360 has a command generation unit 361, a command execution unit 362, a command display unit 363, etc.
- the command generation unit 361 generates commands such as target values for the positions (relative positions or absolute positions) of the leader and sub aircraft of the formation 10, target values for the speeds, etc.
- the command execution unit 362 controls the operation of the drones 1 in the formation 10 in accordance with the generated commands.
- the command display unit 363 transmits command information to the control device 21 for display.
- the abnormality determination unit 370 determines various abnormalities, which will be described later.
- the abnormality determination unit 370 determines an abnormal state during tracking shooting mode.
- the abnormality determination unit 370 determines abnormalities in the leader unit and the sub unit. A detailed flow of abnormality determination will be described later.
- the abnormality determination unit 370 has a relative distance abnormality determination unit 371, a relative speed abnormality determination unit 372, a reception abnormality determination unit 373, a target value tracking abnormality determination unit 374, a geofence deviation determination unit 375, and the like.
- the evacuation action determination unit 380 determines the evacuation action to be taken when an abnormality occurs in the leader aircraft of the formation 10, the evacuation action to be taken when an abnormality occurs in the sub aircraft, the evacuation action to be taken for each formation flight position, etc. A detailed flow of the evacuation determination will be described later.
- the distribution image control unit 390 controls the distribution and display of the images captured by the drones 1 of the formation 10 (image data 250 in FIG. 1) to the captured image display device 50.
- [State transition diagram] FIG. 19 is a state transition diagram of the present photography system.
- the present photography system has several modes for controlling the flight of the formation 10, etc.
- the photography system switches between modes as appropriate.
- the photography system has the following modes: pre-takeoff preparation mode M1, takeoff mode M2, photography standby mode M3, follow photography mode M4, emergency evacuation mode M5, and landing mode M6.
- the pre-takeoff preparation mode M1 is a preparation mode before the formation 10 takes off.
- the formation 10 takes off from a predetermined takeoff point.
- the takeoff mode M2 is the mode during which the formation 10 takes off.
- the photography standby mode M3 is a state of preparation for photography before the start of a competition/race. From the photography standby mode M3, there is a transition to the follow photography mode M4 after the start of the competition/race (transition s34). Furthermore, from the photography standby mode M3, there is a transition to the abnormality evacuation mode M5 in response to the detection of an abnormality (transition s35).
- the following photography mode M4 is a mode in which the formation 10 is placed in a predetermined formation flight position (e.g., Figure 2) and flies following the athlete P, who is the subject of photography, while photographing the athlete P. From the following photography mode M4, the mode transitions to the landing mode M6 upon the end of the competition/race (transition s46). Also, from the following photography mode M4, the mode transitions to the emergency evacuation mode M5 in response to the detection of an abnormality (transition s45). Also, from the following photography mode M4, it is possible to transition to the photography standby mode M3 (transition s43).
- Landing mode M6 is a mode in which the aircraft lands at a predetermined landing site, and is the mode in which the formation 10 lands.
- the emergency evacuation mode M5 is a mode in which a predetermined evacuation action is performed if an abnormality is detected in the tracking shooting mode M4 or the shooting standby mode M3. From the emergency evacuation mode M5, there is a transition to the landing mode M6 (transition s56). Also, from the emergency evacuation mode M5, there is a transition back to the tracking shooting mode M4 (transition s54). Also, from the emergency evacuation mode M5, there is a transition back to the shooting standby mode M3 (transition s53).
- a standard state transition is as follows. Before the start of a competition/race, the system starts in pre-takeoff preparation mode M1, transitions to takeoff mode M2, and transitions from takeoff mode M2 to photography standby mode M3. After the start of a competition/race, the system transitions from photography standby mode M3 to follow photography mode M4, and photography is performed by following the contestant P during the competition/race. If there is no formation flight abnormality as an abnormality, the system transitions from follow photography mode M4 to landing mode M6 after the end of the competition/race. If an abnormality is detected in photography standby mode M3, the system transitions to abnormality evacuation mode M5, and then transitions to landing mode M6.
- if an abnormality is detected in following photography mode M4, the system transitions to abnormality evacuation mode M5, and then transitions to landing mode M6. In other words, if a formation flight abnormality is detected, evacuation is performed for the aircraft 1 in which the abnormality was detected, or for the formation 10 including it, and the system then transitions to landing mode M6.
- a photography standby mode M3 may be set up even after the start of the competition/race.
- in photography standby mode M3, the drone 1 flies following the athlete P, who is the subject of photography, at a certain distance, but does not photograph the athlete.
- the drone 1 flies close to the athlete P, who is the subject of photography, and transitions to the following photography mode M4 (transition s34) after setting up a predetermined formation flight position, and begins photographing the athlete P.
- the drone 1 flies away from the athlete P and returns to photography standby mode M3 (transition s43).
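- The mode transitions of FIG. 19 described above can be sketched as a small table-driven state machine; the mode and transition labels follow the description, while the event names are illustrative assumptions.

```python
# Sketch of the mode transitions of FIG. 19. Transition comments reference the
# transitions named in the text (s34, s35, s43, s45, s46, s53, s54, s56);
# the event names themselves are assumed for illustration.

TRANSITIONS = {
    ("M1_PRE_TAKEOFF", "preparation_done"): "M2_TAKEOFF",
    ("M2_TAKEOFF", "takeoff_done"):         "M3_STANDBY",
    ("M3_STANDBY", "race_started"):         "M4_FOLLOW_SHOOT",   # s34
    ("M3_STANDBY", "abnormality"):          "M5_EVACUATION",     # s35
    ("M4_FOLLOW_SHOOT", "race_ended"):      "M6_LANDING",        # s46
    ("M4_FOLLOW_SHOOT", "abnormality"):     "M5_EVACUATION",     # s45
    ("M4_FOLLOW_SHOOT", "pause_shooting"):  "M3_STANDBY",        # s43
    ("M5_EVACUATION", "land"):              "M6_LANDING",        # s56
    ("M5_EVACUATION", "recovered_follow"):  "M4_FOLLOW_SHOOT",   # s54
    ("M5_EVACUATION", "recovered_standby"): "M3_STANDBY",        # s53
}

def next_mode(mode, event):
    """Return the next mode, or stay in the current mode if no transition matches."""
    return TRANSITIONS.get((mode, event), mode)
```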
- Step S101 the photographing system sets a photographing target sport.
- the photographing target sport is, for example, a sport selected from snowboarding, skiing, yachting, surfing, automobile racing, motorbike racing, bicycle racing, marathon racing, and the like.
- step S102 the imaging system sets the athlete to be photographed.
- the imaging system sets the identification information of the athlete P and the identification information of the athlete terminal 70 that includes the sensor worn by the athlete P.
- step S103 the photography system sets the formation flight position in the follow photography mode M4.
- the formation flight position is, for example, as shown in Figures 2 to 4.
- the photography system also sets that standby position ( Figure 21, etc.).
- the photography system sets the leader and sub aircraft of the formation 10.
- the photography system sets at least one leader aircraft, and sets the other aircraft of the formation 10 as sub aircraft.
- the leader aircraft is an aircraft that controls its relative position with respect to the competitor P, as described above.
- the sub aircraft is an aircraft that controls its relative position with respect to the leader aircraft, as described above. Since the leader aircraft targets the relative position and distance with respect to the competitor P, it is preferable to set the aircraft in the formation 10 that flies closest to the competitor P as the leader aircraft.
- step S105 the filming system sets the competition course 61 and a geofence ( Figure 24).
- the filming system sets the geofence to prevent the unmanned aerial vehicle 1 from flying outside the flight area where it can fly safely.
- the filming system sets the geofence manually or automatically based on position information such as the terrain, immovable obstacles, and spectator positions.
- step S106 the filming system sets a landing position where the unmanned aerial vehicle 1 can land safely.
- the filming system sets the landing position manually or automatically based on position information such as the terrain, immovable obstacles, and spectator positions.
- FIG. 21 is an explanatory diagram of the standby position of the formation 10 corresponding to the photography standby mode M3.
- a first example of the standby position is shown in a schematic perspective view.
- the athlete P is at the start point 62 of the course 61.
- the direction of travel immediately after the start is the Y direction (+Y).
- the standby positions of the drones 1 (1a, 1b, 1c, 1d, 1e) of the formation 10 are set at a position that secures a predetermined distance ds1 backward (-Y) from the start point 62, which is the position of the athlete P, in terms of the Y direction position, set at the same position as the position of the athlete P in terms of the X direction position, and set at a position that secures a predetermined distance ds2 upward (+Z) from the position of the athlete P (here, the road surface) in terms of the Z direction position.
- the five drones 1 that make up the formation 10 are arranged in series in the Z direction, which is the vertical direction. This is not limiting, and other standby positions can be set; for example, the drones may be arranged in series in the horizontal direction, or they may be arranged with a reduced distance between the drones 1 while maintaining the positional relationship of the formation flight position in the tracking shooting mode as shown in Figure 2.
- Figure 22 shows a second example of a standby position.
- the standby position is established at a position a predetermined distance forward in the direction of travel from the position of athlete P at starting point 62, and a predetermined distance above that position.
- the standby position of the formation 10 is established at a predetermined distance ds3, for example 10 m, forward (+Y) from the starting point 62, and a predetermined distance ds4, for example 10 m, vertically upward (+Z).
- FIG. 23 shows a third example of a standby position.
- the standby position is established at a position that is a predetermined distance above the position of the contestant P at the starting point 62.
- the standby position of the formation 10 is established at a predetermined distance ds5, for example 10 m, vertically above (+Z) the starting point 62.
- the formation flight positions of each drone 1 (1a-1e) are established with this standby position as the reference position. In the case of this standby position, as in the case of FIG. 22, a formation flight position in the tracking photography mode can be smoothly formed around the contestant P after the start of the competition.
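- The first standby-position example (FIG. 21) can be sketched as follows, with the drones stacked vertically at a point behind and above the start point 62; the spacing and default distances are illustrative assumptions.

```python
# Sketch of the Figure 21 standby arrangement: the five drones are arranged in
# series in the vertical direction at a point ds1 behind (-Y) and ds2 above
# (+Z) the start point. Default values are assumed for illustration.

def standby_positions_vertical(start_point, ds1=10.0, ds2=5.0, spacing=2.0, n_drones=5):
    """start_point = (x, y, z) of the athlete P at the start point 62.
    Returns one (x, y, z) standby position per drone, stacked in Z."""
    x0, y0, z0 = start_point
    base = (x0, y0 - ds1, z0 + ds2)   # ds1 backward (-Y), ds2 upward (+Z)
    return [(base[0], base[1], base[2] + i * spacing) for i in range(n_drones)]
```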
- evacuation control is performed at least in the tracking shooting mode M4, but this is not limiting, and evacuation control may also be applied to the shooting standby mode M3 in a similar manner.
- as in FIG. 21, when multiple drones 1 are arranged in series in the shooting standby mode M3, if a state in which a sufficient distance between the drones 1 cannot be secured is determined to be abnormal, an evacuation action may be performed. In this case, a transition s35 occurs from the shooting standby mode M3 to the abnormality evacuation mode M5 in FIG. 19.
- FIG. 24 is an explanatory diagram of the setting of a geofence for the course 61.
- the geofence is set as a spatial region at a position covering the latitude, longitude, and altitude of the course 61 along the topography of the course 61.
- a geofence may be set using a width and height for the course 61 represented by a line.
- the geofence is set so as to avoid spaces that the drone 1 should not enter, such as forests, rocks, cliffs, and watersides; when such a space exists along the course 61, the geofence is set so as to exclude that area.
- a geofence is set in the sky so that the height distance from the road surface is equal to or greater than a predetermined height distance 2402 in the vertical direction.
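- A geofence check along the course 61 could be sketched as follows, modelling the allowed space as a corridor around the course centre line between a minimum height above the road surface and a maximum altitude; the corridor model and the numeric limits are illustrative assumptions.

```python
# Sketch of a geofence check: the drone must stay within an assumed half-width
# of the course centre line and within an assumed height band above the road.

import math

def inside_geofence(drone_xy, drone_height_above_road,
                    course_points, half_width=15.0,
                    min_height=3.0, max_height=50.0):
    """course_points: list of (x, y) samples of the course 61 centre line."""
    if not (min_height <= drone_height_above_road <= max_height):
        return False
    nearest = min(math.dist(drone_xy, p) for p in course_points)
    return nearest <= half_width
```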
- FIG. 25 shows an example of the screen display of the captured image on the captured image display device 50.
- the captured image of the selected camera is displayed as the main camera image 2501.
- the main camera image 2501 corresponds to the main camera image display unit 511 (FIG. 16) described above.
- the captured image of the selected camera is an image captured by the camera Cc of the drone 1c located on the right side of the formation 10 in FIG. 2, and is an image with an angle of view capturing the right side of the athlete P on the course 61 in the direction of travel (image of "Right Angle" No. 3).
- the drone 1d located on the left side is also captured in the background.
- a sub-camera image 2502 is displayed that corresponds to the other camera image display section 512 described above.
- the sub-camera image 2502 displays four camera images in a small area as images from the cameras of the drones 1 (1a, 1b, 1d, 1e) in the formation 10, excluding the image from the camera Cc of the drone 1c located on the right.
- an enlarged view of the sub-camera image 2502 is shown at the bottom of FIG. 25.
- a user U2 such as a viewer, can select a desired sub-camera image from the sub-camera image 2502 by operating the remote control of the captured image display device 50, operating the cursor, inputting a number designation, touching, or the like.
- the aforementioned main camera switching unit 521 switches so that the selected sub-camera image is displayed in the main camera image 2501. For example, when the image at "Front Angle" No. 1 is selected, the image from camera Ca of the front drone 1a is displayed as the main camera image 2501.
- Fig. 26 shows a flow in the photography standby mode M3.
- the main processing entity is the server 30, but it may also be each drone 1 of the formation 10.
- step S301 the photography system causes all drones 1 in the formation 10 to take off from a takeoff point (e.g., takeoff point 63 in FIG. 21) in takeoff mode M2 and move them to a predetermined standby position.
- examples of the predetermined standby position include the one shown in the aforementioned FIG. 21. This standby position may be the same as the formation flight position in the tracking photography mode M4, or it may be a different position.
- step S302 the photography system checks the status of the drone 1 in the standby position and determines whether any drone 1 has been detected as having an abnormality. If no drone 1 has been detected as having an abnormality (NO), the system proceeds to step S303, and if any drone 1 has been detected as having an abnormality (YES), the system proceeds to step S307.
- step S303 the filming system notifies the pilot device 21 that all drones 1 in the formation 10 have completed their standby state. This notification allows the pilot U1 to recognize that all drones 1 in the formation 10 have completed their standby state.
- step S304 the filming system checks whether the competition has started based on the acquired competition information, and if it has not started (NO), in step S305, the filming system has all drones 1 wait at their standby positions, and returns to step S302, repeating the same process. If the competition has started (YES), in step S306, the filming system transitions all drones 1 in the formation 10 to tracking filming mode M4.
- step S307 the filming system notifies the piloting device 21 that an abnormality has been detected in the drone 1.
- step S308 the filming system checks whether the drone 1 in which the abnormality was detected is the leader drone. If the leader drone is abnormal (YES), in step S309, the filming system transitions all drones 1 in the formation 10 to abnormality evacuation mode M5. The control in the above step S309 is such that if the leader drone is abnormal, all drones in the formation 10, including the sub drones, are determined to be targets for evacuation, since the leader drone, which has the important role of determining its positional relationship with the athlete P, is abnormal.
- step S310 the photography system transitions the abnormal sub-drone to landing mode M6 via abnormality evacuation mode M5 ( Figure 19).
- the control in step S310 described above determines only the abnormal sub-drone as the evacuation target: a sub-drone determines its own position relative to the leader drone, so even if an abnormal sub-drone leaves the formation 10, the other drones can continue to follow and photograph the athlete P.
- step S311 the photography system checks whether the formation flight position of the abnormal sub-aircraft in step S310 is a predetermined position, for example, the front or rear position in Figure 2.
- this predetermined position is a position with a high priority among the formation flight positions. If this predetermined position is met (YES), in step S312, the photography system reassigns other drones 1 in other positions in the formation 10 (for example, left/right positions or directly above) to the formation flight position from which the abnormal sub-aircraft was evacuated. In other words, the formation flight position is changed or updated.
- the photography system reassigns a normal aircraft from another position to the high-priority position in order to ensure photography at that high-priority position. This allows photography to continue from the camera in the high-priority position (a sketch of this reassignment is shown after this flow). Details of the control using priority in step S311 will be described later.
- step S313 the photography system checks whether the competition has started based on the acquired competition information, and if it has not started (NO), in step S314, it waits at the standby position, and returns to step S302, and repeats the same process. If the competition has started (YES), in step S315, the photography system transitions the drones 1 in the formation 10 that are not malfunctioning, i.e., the drones remaining in the standby position, to the tracking photography mode M4.
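- The priority-based reassignment of steps S311 to S312 referenced above can be sketched as follows; the priority values and position names are illustrative assumptions.

```python
# Sketch of priority-based reassignment: when the evacuated drone occupied a
# high-priority formation flight position, a normal drone from the lowest-
# priority occupied position is moved into the vacancy.

# higher number = higher priority (assumed ordering)
POSITION_PRIORITY = {"front": 4, "rear": 3, "right": 2, "left": 1, "above": 0}

def reassign_position(vacated_position, remaining, priority=POSITION_PRIORITY):
    """remaining: dict drone_id -> current position of the normal drones.
    Returns (moved_drone_id, new_assignment) or (None, remaining) if no move is needed."""
    if not remaining or vacated_position in set(remaining.values()):
        return None, remaining
    # candidate donor: the drone currently holding the lowest-priority position
    donor = min(remaining, key=lambda d: priority[remaining[d]])
    if priority[vacated_position] <= priority[remaining[donor]]:
        return None, remaining   # the vacancy does not outrank any occupied position
    new_assignment = dict(remaining)
    new_assignment[donor] = vacated_position
    return donor, new_assignment
```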
- Fig. 27 shows a flow of the follow-up photography mode M4.
- when the photography system enters the follow-up photography mode M4, it executes the flow of Fig. 27. Note that the athlete P is moving with the start of the competition.
- step S401 the photography system moves the drones 1 of the formation 10 that are in the standby position in photography standby mode M3 to the formation flight position (e.g., Figures 2 to 4) in tracking photography mode M4.
- step S402 the filming system acquires information and status such as the ID, absolute position, speed, and direction of movement of athlete P.
- the drone 1 and server 30 grasp this information by acquiring information from the athlete terminal 70 or the like.
- the filming system acquires various pieces of information about athlete P from the status acquisition unit 710 of the athlete terminal 70 (which includes a sensor attached to athlete P) or from an external device.
- other acquisition methods include acquisition from sensors such as the obstacle detection unit 130 of the drone 1 and estimation from images captured by the camera of the drone 1. Examples of grasping the position and other information of athlete P will be described later.
- step S403 the imaging system predicts the future position of athlete P (e.g., at the next time point) based on the information and state acquired in step S402.
- step S404 the filming system generates and updates control target values related to the formation flight position and speed of the formation 10 to follow the competitor P, based on the information and state predicted in step S403.
- the target values generated in step S404 include position (at least one of relative distance, relative position, and absolute position) and speed.
- for the leader aircraft, a position where the relative distance or relative position from the competitor P satisfies a specified condition is generated as the target value (Figure 5, etc.).
- for the sub aircraft, a position where the relative distance or relative position from the leader aircraft satisfies a specified condition is generated as the target value.
- step S405 the photography system acquires the absolute positions of each of the drones 1, the leader and sub drones in the formation 10.
- the drones 1 and the server 30 obtain their absolute positions by positioning or the like.
- the absolute positions in the global coordinate system may be calculated based on relative position information that uses a pre-installed wireless base station 82 ( Figure 1) as the reference point.
- step S406 the photography system acquires the relative position or distance of the leader aircraft with respect to the position of athlete P.
- step S407 the photography system acquires the relative position or distance of the sub aircraft with respect to the position of the leader aircraft.
- step S408 the imaging system generates and updates control commands for each of the leader and sub drones 1 in the formation 10 based on the information acquired up to step S407. These control commands correspond to feedback control commands based on the difference between the target value and the current value.
- step S409 the photography system acquires information indicating the communication status of each drone 1 between the leader drone and the sub drone in the formation 10.
- a drone 1 monitors the communication status with the other drones 1 and detects any abnormalities.
- the server 30 also monitors the communication status with each drone 1 through the control device 21 and detects any abnormalities.
- the photography system acquires information on the control communication status and positioning status acquired by the control communication abnormality detection unit 161 and positioning status abnormality detection unit 162 in FIG. 11 for each of the leader drone and the sub drone.
- step S410 the photography system determines whether there are any formation flight anomalies for each of the leader and sub drones 1 in the formation 10 based on the information acquired in step S409. Details will be described later.
- Step S411 is a branch depending on the result of the judgment in step S410. If it is judged that there is an abnormality in the formation flight (YES), the process proceeds to step S412. If it is not judged that there is an abnormality in the formation flight (NO), the process proceeds to step S413. In step S412, the photography system transitions to the abnormality evacuation mode M5.
- step S413 it is confirmed whether the competition/race has ended, for example whether the contestant P has reached the finish line. If it has not ended (NO), the process proceeds to step S414, and if it has ended (YES), the process proceeds to step S415.
- the competition information acquisition unit 340 (Fig. 18A) of the server 30 mentioned above acquires and grasps information on the status of the competition/race, such as competition start information and competition end information, and this information can be used to confirm the start and end of the competition/race.
- step S414 the tracking photography mode M4 continues, and in this case, the same process is repeated from step S402.
- step S415 the photography system transitions to landing mode M6 or photography standby mode M3. For example, if the next competition by the next athlete continues, the photography system may transition to photography standby mode M3.
- FIG. 28 shows an example of the flow of the formation flight abnormality judgment in step S410 in FIG. 27.
- FIG. 28 shows an example of the formation flight abnormality judgment process flow of the leader aircraft.
- the judgment priority is different from the priority of the formation flight position.
- the judgment priority is a first priority that considers the safety of the athlete P and judges the collision avoidance between the athlete P and the drone 1.
- the judgment priority is a second priority that considers the safety of spectators around the course 61 and judges the deviation avoidance of the drone 1 from the range of the geofence of the course 61.
- the judgment priority is a third priority that considers the safety of the drone 1 and judges the collision avoidance between the drones 1.
- there are five judgment steps, steps S2801 to S2805, and the judgment conditions for each step are illustrated as "condition A1" to "condition A5".
- Condition A1 has the highest judgment priority.
- step S2801 as condition A1, the filming system judges whether the communication state of the leader aircraft of formation 10 is worse than a predetermined level, such as the aforementioned communication anomaly.
- if there is an abnormality in the control communication of the leader aircraft, for example if the control communication is interrupted or delayed, it is determined that there is a communication anomaly. Also, if an abnormality is detected in the positioning of the leader aircraft or of contestant P, for example if the positioning accuracy has deteriorated, it is determined that there is a positioning anomaly in the leader aircraft. Communication anomalies will be described in more detail later. These cases correspond to an abnormality detection result (YES) in the judgment of condition A1 in step S2801. If there is a communication anomaly in the leader aircraft (YES), the system proceeds to step S2806.
- step S2802 as condition A2, the photography system judges whether the relative position or relative distance of the leader aircraft to athlete P is abnormal.
- Condition A2 is judged as abnormal, for example, when the relative position or relative distance deviates from a predetermined range (the target range), that is, when it falls below the lower limit or exceeds the upper limit.
- Condition A2 can also be expressed as follows.
- the photography system judges whether the state of the leader aircraft deviates from the control target value by a predetermined value or more. For example, if the difference between the current position of the leader aircraft and the target position exceeds a predetermined value, it is judged as abnormal.
- the target position is defined as a relative position from the position of athlete P, or is defined as an absolute position on course 61.
- the target speed is defined as a relative speed from the speed of athlete P, or is defined as an absolute speed on course 61. The judgment of condition A2 will be described later. If the relative position/relative distance is abnormal, which satisfies condition A2 (YES), proceed to step S2806.
- condition A2 in step S2802 is not limited to only the "athlete" set as the subject of photography, but if there are multiple athletes running in the vicinity of the athlete as a group, the judgment may be made on the group of athletes.
- the relative distance to be judged is the relative distance to any one of the athletes in the group, and if the relative distance falls below the lower limit of a specified range, it is judged to be an abnormality. Judgments regarding the group of athletes will be described later.
- In step S2803, as condition A3, the photography system determines whether the current position of the leader aircraft is outside the range of the geofence. If it is outside the range, it is determined to be an abnormality, and the process proceeds to step S2806.
- In step S2804, as condition A4, the photography system determines whether the relative speed of the leader aircraft with respect to the athlete P is abnormal. For example, if the relative speed of the leader aircraft to the athlete P exceeds a predetermined value, it is determined to be abnormal. If the relative speed is abnormal (YES), the process proceeds to step S2806.
- In step S2805, as condition A5, the photography system determines, corresponding to the additional function (FIG. 10), whether the relative position, relative distance, and relative speed of the leader aircraft with respect to the positions and speeds of the surrounding sub-aircraft are within a specified range. If they are outside the specified range, it is determined that there is an abnormality, and the process proceeds to step S2806. For example, if the relative distance of the leader aircraft to a sub-aircraft is less than a specified value, it is determined that there is an abnormality. For example, in FIG. 10, even if the relative distance of the leader aircraft to the athlete P is within the target range, depending on the positions of the leader aircraft and the surrounding sub-aircraft, the relative distance between a sub-aircraft and the leader aircraft may become too small.
- In step S2806, the photography system judges that the leader aircraft has a formation flight abnormality. If the judgment results for all of the above conditions A1 to A5 indicate no abnormality, then in step S2807 the photography system judges that the leader aircraft has no formation flight abnormality. A sketch of this prioritized judgment follows below.
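- As a non-authoritative illustration only, the prioritized judgment of conditions A1 to A5 can be pictured as a check that returns the first abnormality found in priority order. The structure and names below (e.g. `LeaderState`, `comm_anomaly`, the threshold parameters) are assumptions for illustration and do not appear in the embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LeaderState:
    comm_anomaly: bool           # condition A1: control-communication or positioning anomaly
    distance_to_athlete: float   # condition A2: relative distance to the athlete P [m]
    inside_geofence: bool        # condition A3: current position within the course geofence
    relative_speed: float        # condition A4: relative speed to the athlete P [m/s]
    min_distance_to_subs: float  # condition A5: closest distance to any sub-aircraft [m]

def judge_leader_abnormality(s: LeaderState, dp_min: float, dp_max: float,
                             v_max: float, d_sub_min: float) -> Optional[str]:
    """Check conditions A1 to A5 in descending judgment priority; return the first hit."""
    if s.comm_anomaly:
        return "A1: communication/positioning anomaly"
    if not (dp_min <= s.distance_to_athlete <= dp_max):
        return "A2: relative distance outside the target range"
    if not s.inside_geofence:
        return "A3: outside the geofence"
    if abs(s.relative_speed) > v_max:
        return "A4: relative speed abnormal"
    if s.min_distance_to_subs < d_sub_min:
        return "A5: too close to a sub-aircraft"
    return None  # corresponds to step S2807: no formation flight abnormality
```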
- FIG. 29 shows an example of the flow for determining a formation flight abnormality of a sub-aircraft, as an example of the formation flight abnormality determination of step S410 in FIG. 27.
- As condition B1, the photography system determines that an abnormality has occurred if the communication state of the sub-aircraft is poor, that is, if there is a communication or positioning anomaly, and proceeds to step S2907.
- In step S2902, as condition B2, the photography system determines that an abnormality has occurred if the relative position or relative distance of the current position of the sub-aircraft with respect to the athlete P deviates from a predetermined range, and proceeds to step S2907. This determination includes the judgment on the athlete group, similar to the leader aircraft flow described above.
- In step S2903, as condition B3, the imaging system determines that an abnormality has occurred if the current position of the sub-aircraft deviates from the geofence range, and proceeds to step S2907.
- In step S2904, as condition B4, the imaging system determines that an abnormality has occurred if the relative distance or relative position of the sub-aircraft with respect to the leader aircraft deviates from a specified range, and proceeds to step S2907.
- In step S2906, as condition B6 and corresponding to the additional function (FIG. 10), the imaging system determines that there is an abnormality if the relative distance, relative position, or relative speed of the target sub-aircraft with respect to the other sub-aircraft in the formation 10 is outside a specified range, and proceeds to step S2907.
- In step S2907, the photography system judges that the sub-aircraft has a formation flight abnormality. If the judgment results for all of the above conditions B1 to B6 indicate no abnormality, then in step S2908 the photography system judges that the sub-aircraft has no formation flight abnormality.
- Formation flight abnormalities are abnormalities related to flying in the specified positional relationship with respect to the athlete P during the flight of the formation 10 in the tracking photography mode M4. They correspond to a state in which it is determined that there is a risk of contact or collision between the athlete P and a drone 1, or between the drones 1. Formation flight abnormalities include, for example, a state in which the position or speed of the unmanned aerial vehicle 1 does not satisfy specified conditions such as a specified range (e.g., a target range). In addition, for the leader aircraft, which acquires positioning information from the athlete terminal 70 of the athlete P, if the positioning performance of the athlete terminal 70 deteriorates, the corresponding leader aircraft is deemed to be abnormal.
- [Understanding the athlete's position, etc.] FIG. 30 is an explanatory diagram of technical means for grasping the position, speed, direction, etc. of the athlete P.
- FIG. 30 illustrates several technical means by which the drone 1a in the forward position grasps the position, speed, and direction of the athlete P. The athlete P has a position, speed, and direction P101. The athlete terminal 70 of the athlete P obtains position information by GNSS positioning and obtains information such as speed and direction from its sensors.
- drone 1a can grasp the position, speed, and direction P101 of athlete P by obtaining the ID and GNSS positioning information from athlete terminal 70 based on wireless communication with athlete terminal 70 of athlete P. Also, as described above, a device (transponder, etc.) installed in the competition environment 60 may measure the position of athlete P, and drone 1 and server 30 may obtain information from that device via communication.
- The drone 1a can also measure the athlete P using its various onboard sensors to determine the position, speed, and direction P101 of the athlete P.
- the drone 1a can also estimate and grasp the position, speed, and direction P101 of the athlete P based on known image analysis processing, etc., from the image of the athlete P captured by the camera Ca at a front angle of view and the control information of the camera Ca (e.g., orientation, zoom amount, etc.).
- The drone 1a can perform tracking control of its formation flight position according to the position, speed, and direction P101 of the athlete P grasped by the above technical means. Based on this, the filming system can generate, as control target values, target ranges for the position, speed, and direction of the drone 1a (including the camera Ca) in a relative positional relationship with the athlete P, as sketched below.
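- As one possible way to picture such control target values, the following sketch places the forward drone a fixed distance ahead of the athlete along the athlete's movement direction and checks whether the drone is inside a tolerance around that target. The offsets, tolerance, and function names are hypothetical assumptions, not values from the embodiment.

```python
import math

def target_position_for_front_drone(athlete_xy, heading_rad, offset_ahead=4.0, altitude=3.0):
    """Place the forward drone a fixed distance ahead of the athlete, along the
    athlete's movement direction, at a fixed altitude (local course frame, metres)."""
    ax, ay = athlete_xy
    return (ax + offset_ahead * math.cos(heading_rad),
            ay + offset_ahead * math.sin(heading_rad),
            altitude)

def within_target_range(drone_xyz, target_xyz, tolerance=1.0):
    """True if the drone is within +/- tolerance of the control target position."""
    return math.dist(drone_xyz, target_xyz) <= tolerance
```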
- The leader aircraft identifies the individual athlete P to be photographed and follows him or her. For this purpose, the athlete terminal 70 has an ID (athlete ID) that can identify the athlete P. The leader aircraft may obtain the athlete ID and the like from the athlete terminal 70, or may estimate the athlete ID from an analysis of the contents of the camera image.
- FIG. 31 is an explanatory diagram of an example of a communication anomaly.
- communication anomalies are broadly classified into control communication anomalies and positioning accuracy anomalies.
- Control communication anomalies include anomalies in control communication with other unmanned aerial vehicles 1 and anomalies in control communication with the athlete terminal 70.
- Positioning accuracy anomalies include anomalies in the positioning of the unmanned aerial vehicle 1 and anomalies in the positioning of the athlete terminal 70.
- Control communication anomalies include, for example, communication interruptions and communication delays.
- Positioning accuracy anomalies include a decrease in positioning accuracy, a decrease in the strength of the GNSS signal, the occurrence of multipath, and the like.
- In the example of FIG. 31, the drone 1a is the leader aircraft and the other drones are sub-aircraft. If the control communication between the leader drone 1a and the drone 1c is abnormal (e.g., communication from the drone 1c is interrupted), the drone 1c is determined to be abnormal. If the control communication between the drone 1d and the leader drone 1a is abnormal (e.g., communication from the drone 1a is interrupted), the drone 1d is determined to be abnormal. If the positioning accuracy of the leader drone 1a is lower than a predetermined value, the drone 1a is determined to be abnormal. Likewise, if the positioning of the athlete terminal 70 of the athlete P associated with the drone 1a is abnormal, the drone 1a is determined to be abnormal. If the positioning accuracy of the sub-drone 1b is lower than a predetermined value, the drone 1b is determined to be abnormal.
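- The classification of communication anomalies described above can be pictured with the following sketch, which flags a control communication anomaly when a link has been silent for too long and a positioning anomaly when the GNSS accuracy degrades. The anomaly categories, thresholds, and function names are illustrative assumptions, not part of the embodiment.

```python
from enum import Enum, auto

class CommAnomaly(Enum):
    CONTROL_COMM = auto()   # interruption or delay of control communication
    POSITIONING = auto()    # degraded positioning accuracy (drone or athlete terminal 70)

def check_comm_anomalies(last_rx_age_s: float, link_timeout_s: float,
                         gnss_accuracy_m: float, accuracy_limit_m: float):
    """Flag anomalies for one link/receiver pair based on silence time and GNSS accuracy."""
    anomalies = []
    if last_rx_age_s > link_timeout_s:
        anomalies.append(CommAnomaly.CONTROL_COMM)
    if gnss_accuracy_m > accuracy_limit_m:
        anomalies.append(CommAnomaly.POSITIONING)
    return anomalies
```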
- Fig. 32 is an explanatory diagram of the determination of the relative distance to the contestant P as a determination of formation flight abnormality.
- This corresponds to the determinations of step S2802 in FIG. 28 and step S2902 in FIG. 29.
- This example shows a case where the relative distance between the drone 1a in the front position and the contestant P is determined on an XY plane view.
- This example also corresponds to Fig. 5.
- In FIG. 32, the target control range for the position of the drone 1a in the forward position, relative to the position PP1 and movement direction PD1 of the athlete P1, can be expressed as dpmin ≤ dp ≤ dpmax, where dp is the relative distance between the position of the drone 1a and the position PP1 of the athlete P1, dpmin is the lower limit of the target range, and dpmax is the upper limit. The values depend on the competition; in one example, they might be 3 m ≤ dp ≤ 5 m or 10 m ≤ dp ≤ 15 m.
- The above target range can also be expressed as follows: if the target position is cp and the allowable range around it is ±w, the target range is the range from cp - w to cp + w.
- The aforementioned tracking control (FIG. 8B, etc.) controls, for example, the position of the drone 1a so that it stays within such a target range. If the drone falls outside the target range, in other words, if it cannot maintain its position within the target range, a formation flight abnormality is determined as described above.
- The determination using the above target range covers both the case where the relative distance dp is less than the lower limit dpmin (dp < dpmin) and the case where it is greater than the upper limit dpmax (dp > dpmax). The determination of whether dp < dpmin is also referred to as a proximity determination, and the determination of whether dp > dpmax is also referred to as a remote determination. At least the proximity determination is performed, and it is more preferable to also perform the remote determination.
- A target range can similarly be used to determine the relative speed of the drone 1 with respect to the speed of the athlete P. For example, if the speed of the athlete P is Vp and the speed of the drone 1a is Va, a target range is generated for the relative speed (Va - Vp) of the drone 1a. If the relative speed (Va - Vp) falls outside the target range, a formation flight abnormality is determined. A sketch of these range checks follows below.
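- A minimal sketch of these target-range checks, assuming lower/upper limits dpmin and dpmax for the relative distance and a corresponding range for the relative speed (Va - Vp), is given below; the function names are hypothetical.

```python
from typing import Optional

def distance_anomaly(dp: float, dp_min: float, dp_max: float) -> Optional[str]:
    """Return 'proximity' if dp < dp_min, 'remote' if dp > dp_max, otherwise None."""
    if dp < dp_min:
        return "proximity"
    if dp > dp_max:
        return "remote"
    return None

def relative_speed_anomaly(va: float, vp: float, dv_min: float, dv_max: float) -> bool:
    """True if the relative speed (Va - Vp) is outside its target range."""
    return not (dv_min <= va - vp <= dv_max)

# Example with one of the illustrative ranges above (3 m <= dp <= 5 m)
assert distance_anomaly(2.4, 3.0, 5.0) == "proximity"
assert distance_anomaly(4.0, 3.0, 5.0) is None
```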
- The leader aircraft may follow the athlete group, make the abnormality judgment with respect to the athlete group, and be controlled to move away from the athlete group. For example, in the judgment of condition A2 in step S2802, if the relative distance from the leader aircraft, the drone 1a, to any one of the athletes P in the athlete group becomes smaller than the lower limit of the target range, a proximity abnormality is determined.
- Figure 33 shows an example of proximity determination with a group of athletes.
- In FIG. 33, the athletes P are athletes P1, P2, P3, P4, etc. running in a competition such as a marathon. The formation 10 initially targets the athlete P1 for filming, but other athletes P2 to P4 then come close to the athlete P1. The drone 1a, the leader, is filming rearward (-Y) with the camera Ca, and the athletes P2 to P4 are also captured within the filming range of the camera Ca. In this case, the athletes P1 to P4 are treated as an athlete group PG, and the athlete group PG is captured in the image of the camera Ca.
- Let dp be the relative distance from the drone 1a to an athlete P; the relative distance to the athlete P1 is dp1 and the relative distance to the athlete P2 is dp2. The target range for the relative distance dp is dpmin ≤ dp ≤ dpmax, as in FIG. 32. Normally, if the detected relative distance dp falls outside this target range, for example if the relative distance dp1 falls below the lower limit dpmin in the proximity determination, it is determined that the drone is too close to the athlete P and a proximity abnormality has occurred.
- When the athlete group PG is considered, the drone 1a uses the closest relative distance, dp2 in this example, to determine whether it is too close to the athlete group PG. In other words, if the relative distance dp2 falls below the lower limit dpmin, a proximity abnormality is likewise determined. A sketch of this group-based check follows below.
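- The group-based proximity determination can be sketched as taking the minimum distance from the drone to any athlete in the group PG and comparing it with the lower limit dpmin; the coordinate representation and function name below are assumptions for illustration.

```python
import math
from typing import Iterable, Tuple

def group_proximity_anomaly(drone_xy: Tuple[float, float],
                            group_xy: Iterable[Tuple[float, float]],
                            dp_min: float) -> bool:
    """True if the closest athlete in the group PG is nearer than the lower limit dp_min."""
    closest = min(math.dist(drone_xy, p) for p in group_xy)
    return closest < dp_min

# Example: the nearest athlete is 2.6 m away while dp_min is 3.0 m -> proximity abnormality
print(group_proximity_anomaly((0.0, 0.0), [(4.5, 0.0), (2.6, 0.0)], 3.0))  # True
```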
- FIG. 34 shows an example of remote determination.
- Consider the drone 1b, the sub-aircraft at the rear position, and let dy be its relative distance to an athlete P. The drone 1b is a distance dy1 behind (-Y) the athlete P1, who is the subject of the photograph, and another athlete P2 is traveling behind the athlete P1. If the drone 1b falls too far behind the athlete P1, the relative distance dy2 to the other athlete P2 becomes small, increasing the risk of collision, and the risk of collision with another, similar formation 10 (for example, its leading drone) also increases. Therefore, both the proximity determination with respect to the athlete group and the remote determination described above may be applied, and the remote determination can likewise be applied to the athlete group.
- In step S501 of the flow in FIG. 35, the photography system checks whether the aircraft with the formation flight abnormality is the leader aircraft or a sub-aircraft. If it is the leader aircraft, the system proceeds to step S502; if it is a sub-aircraft, the system proceeds to step S507.
- In step S502, the imaging system decides to evacuate all aircraft in the formation 10, including the leader and sub-aircraft, and notifies the control device 21 of this evacuation.
- In step S503, as an example of an evacuation action, the photography system raises every drone 1 in the formation 10 to be evacuated, including the leader and sub-aircraft, to a predetermined altitude; that is, it raises each drone from its pre-evacuation altitude by at least a predetermined altitude difference.
- In step S504, the filming system checks whether the relative distance of the leader aircraft from the athlete P is greater than a predetermined value, in other words, whether the evacuation action has created sufficient separation. If the distance is not sufficient (NO), the evacuation is insufficient, so in step S505 the filming system makes the aircraft wait in the air, for example, and then returns to step S504. If the distance is sufficient (YES), the evacuation is sufficient, so in step S506 the filming system transitions the aircraft to the landing mode M6.
- In step S507, the imaging system decides to evacuate only the sub-aircraft with the abnormality, and notifies the control device 21 of this evacuation.
- In step S508, as an example of an evacuation action, the imaging system raises the sub-aircraft to be evacuated to a predetermined altitude.
- In step S509, the filming system checks whether the relative distance of the sub-aircraft to the leader aircraft is greater than a predetermined value, in other words, whether the evacuation action has created sufficient separation. If the sub-aircraft has not moved sufficiently away from the leader aircraft (NO), the evacuation is insufficient, so in step S510 the filming system makes the aircraft hover and wait, and then returns to step S509. If the sub-aircraft has moved sufficiently away (YES), the evacuation is sufficient, so in step S511 the filming system transitions the aircraft to the landing mode M6. With this control, the sub-aircraft lands after its relative distance from the leader aircraft, which continues to follow the athlete P, has become sufficiently large.
- the evacuation actions of steps S503 and S508 are, as a general rule, actions in which the drone 1 is moved in a direction away from the athlete P, and in this embodiment, a specific example is an altitude increase operation.
- the details of the evacuation action can be specified in various ways.
- the altitude increase operation as the evacuation action of steps S503 and S508 can be either an automatic control operation or a manual operation by the pilot U1.
- As described above, if there is an abnormality only in a sub-aircraft of the formation 10, only the abnormal sub-aircraft is evacuated in step S507, but this is not a limitation; all aircraft, including the abnormal sub-aircraft and the leader aircraft, may be evacuated instead. When only the abnormal sub-aircraft is evacuated, the remaining drones 1 can continue filming without evacuating; when all aircraft are evacuated, safety is emphasized. In other modified examples, all sub-aircraft may be evacuated while leaving some drones 1, for example the leader aircraft, in place. A sketch of this evacuation-target decision follows below.
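- A minimal sketch of the evacuation-target decision of steps S501, S502, and S507, under the default rule (leader abnormal: evacuate the whole formation; sub-aircraft abnormal: evacuate only that aircraft), is shown below; the identifiers used are hypothetical.

```python
from typing import List

def decide_evacuation_targets(abnormal_id: str, leader_id: str,
                              formation_ids: List[str]) -> List[str]:
    """Step S502: evacuate the whole formation when the leader is abnormal.
    Step S507: otherwise evacuate only the abnormal sub-aircraft."""
    if abnormal_id == leader_id:
        return list(formation_ids)
    return [abnormal_id]

# Examples with hypothetical identifiers
print(decide_evacuation_targets("1a", "1a", ["1a", "1b", "1c", "1d", "1e"]))  # whole formation
print(decide_evacuation_targets("1c", "1a", ["1a", "1b", "1c", "1d", "1e"]))  # ['1c']
```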
- FIG. 36 is an explanatory diagram of an example of an evacuation action in the abnormality evacuation mode M5, and is a YZ plane view seen from the right side with respect to the movement direction PD1 of the athlete P.
- State (1) is a state in which, in the tracking shooting mode M4, the formation 10 follows the athlete P and moves while filming him or her in the formation flight position configuration shown in FIG. 2 to FIG. 4 above. If an abnormality is detected in the leader drone 1a, it is determined that the entire formation 10 is to be evacuated according to FIG. 35; if an abnormality is detected in a sub-drone, it is determined that only that sub-drone is to be evacuated. This example shows a case in which the entire formation 10 is evacuated.
- State (2) shows the state in which the entire formation 10 has retreated from state (1) by increasing altitude and moving upward (+Z). Each drone 1 of the formation 10 has moved upward by a distance DH1 from its position in state (1). Although "upward" here refers to the vertical direction (+Z direction), it is not limited to this and may instead be the direction perpendicular to the road surface of the course 61. The relative distance dap is the relative distance between the leader drone 1a and the athlete P at the time of the evacuation. Note that this evacuation, which is an upward movement in altitude, is an action in which the drones stop following the athlete P (described below).
- State (3) shows a state in which the formation 10 transitions from state (2) to landing mode M6 after the relative distance dap becomes a predetermined value or greater, and the formation 10 lands at the set landing point 64.
- Note that the formation's evacuation operation is not limited to this; it is sufficient as long as the flight at least increases the relative distance between the athlete and the formation, and a landing operation is not necessarily required.
- FIG. 37 shows another example of the evacuation action, in which only one sub-vehicle in the formation 10 that has a formation flight abnormality is evacuated.
- State (1) is a state in which the drones 1 of the formation 10 are following the athlete P in the tracking shooting mode M4, and the sub-aircraft, the drone 1c at the right position, has an abnormality.
- State (2) shows a case in which only the drone 1c is evacuated upward (+Z) by a distance DH2 in the abnormality evacuation mode M5. This evacuation, which is an upward movement in altitude, is an operation in which the drone stops following the athlete P. The other drones in the formation 10 continue to follow the athlete P in the tracking shooting mode M4. The relative distance dcp is the relative distance between the evacuated drone 1c and the athlete P.
- State (3) shows a state in which the drone 1c lands at the landing site 64 in the landing mode M6 after the relative distance dcp has become sufficiently large.
- [Evacuation action types] FIGS. 38A and 38B are explanatory diagrams showing types of evacuation actions. Two types are shown, according to whether the drone stops following the athlete P or continues following the athlete P during the evacuation action. In other words, the difference between these two types is whether the evacuation is defined as an absolute movement independent of the movement of the athlete P or as a relative movement that follows the movement of the athlete P. Either type of evacuation action may be applied.
- Figure 38A shows an example of an evacuation action to stop following the contestant P in a Y-Z plane view.
- Since the sub-drone 1c on the right has an abnormality, only that drone 1c is evacuated.
- the drone 1c performs a movement 3801 in which it ascends upward (+Z) to a specified altitude.
- the drone 1c stops following the contestant P and simply moves vertically (+Z) in space.
- the contestant P and the other drones 1 in the formation 10 are moving in the movement direction PD1 (+Y). Therefore, the distance between the ascending drone 1c and the contestant P and other drones 1 gradually increases over time.
- the drone 1c may come to a stop by hovering at the upper position Lc1.
- the distance dcp is the distance between the ascending drone 1c and the contestant P.
- drone 1c at upper position Lc1 may perform movement 3802, such as backward (-Y), which is opposite to the movement direction PD1.
- Movement 3801 and movement 3802 are both movements away from contestant P and have a low risk of contact with other drones 1 in the formation 10.
- The route is selected so that there are no other drones 1 on the evacuation movement path of the drone 1c, or so that the relative distance between the drone 1c and the other drones 1 is sufficiently large. In this way, the evacuation action may be a movement that stops following the athlete P.
- Figure 38B shows an example of an evacuation action that maintains following the athlete P.
- Since the sub-drone 1c at the right position has an abnormality, only that drone 1c is evacuated. The drone 1c performs a movement 3803 upward and away from the athlete P while maintaining its relative relationship to the athlete P, that is, its positional relationship diagonally above and to the right of the athlete P.
- the movement 3803 maintains following the athlete P, that is, includes a component of movement in the movement direction PD1 (+Y), and moves diagonally upward forward (+Y, +Z) in space.
- the athlete P and the other drones 1 in the formation 10 are moving in the movement direction PD1.
- the distance dcp is the distance between the ascending drone 1c and the athlete P. Drone 1c may continue moving in the +Y direction to maintain tracking so as to remain at position Lc2 above the moving athlete P.
- drone 1c at upper position Lc2 may perform movement 3804, such as backward (-Y), which is opposite to the movement direction PD1.
- Movement 3804 may be performed by reducing speed in the forward/backward direction (Y direction) or setting it to zero (hovering), so as to be relatively behind contestant P.
- Movement 3803 and movement 3804 are both movements away from contestant P and have a low risk of contact with other drones 1 in the formation 10.
- the route is selected so that there are no other drones 1 on drone 1c's evacuation movement route, or so that the relative distance between drone 1c and other drones 1 is sufficiently large.
- In this way, the evacuation action may be a movement that maintains following of the athlete P. In either case, an appropriate evacuation route, etc. is controlled so that collisions do not occur between the evacuating drones and the drones 1 that do not evacuate. A sketch of the two evacuation types follows below.
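- The difference between the two evacuation types of FIGS. 38A and 38B can be sketched as the computation of the evacuation target position, either as an absolute climb (stop following) or as a climb that keeps the offset to the moving athlete (keep following). The vector representation and parameters below are illustrative assumptions.

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def evacuation_target(drone_pos: Vec3, athlete_vel: Vec3, climb: float,
                      keep_following: bool, dt: float) -> Vec3:
    """Target position after dt seconds for the two evacuation types:
    absolute climb (FIG. 38A, stop following) or relative climb (FIG. 38B, keep following)."""
    x, y, z = drone_pos
    if keep_following:
        vx, vy, _ = athlete_vel        # keep the horizontal offset to the moving athlete
        return (x + vx * dt, y + vy * dt, z + climb)
    return (x, y, z + climb)           # climb in place; the athlete pulls away over time
```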
- [Evacuation action for each formation flight position] The above-mentioned evacuation action can be preset for each formation flight position using the formation flight position input unit 426 (FIG. 15) and the formation flight position setting unit 316 (FIG. 18A) described above.
- An example of the evacuation action for each formation flight position for the formation 10 as shown in FIG. 2 is shown below.
- FIG. 39 shows an example of an evacuation action of the drone 1a in the front position.
- the drone 1a in the front position performs a movement 3901 in which it increases its altitude and rises upward (+Z). After the movement 3901, it may also move left or right in the left-right direction (X direction). From the left or right position, it may also move further backward (-Y). In another example, after the upward movement 3901, it may move backward (-Y) 3902. However, it is necessary to select an appropriate altitude, etc. so as not to collide with the drone 1e directly above. In another example, the drone 1a in the front position may move left or right away from the contestant P without increasing its altitude, or may move further backward.
- FIG. 39 illustrates these examples of evacuation. All of the evacuation movement routes are designed to reduce the risk of contact with other drones 1.
- FIG. 40 shows an example of the evacuation action of the drones 1c and 1d at the right and left positions. First, the drones 1c and 1d perform an upward (+Z) movement 3903, after which they may move backward (-Y). Alternatively, the drone 1c may move to the right, away from the athlete P and the formation 10, and the drone 1d may move to the left, away from the athlete P and the formation 10, after which they may move backward (-Y). The drones 1c and 1d at the right and left may also move to the right or left away from the athlete P without increasing their altitude, and after that lateral movement they may move backward.
- FIG. 41 shows an example of an evacuation action of the drone 1b in the rear position. The evacuation action of the drone 1b begins with a movement 3904 in which the altitude is increased and the drone 1b rises upward (+Z). After that, the drone 1b may be moved backward (-Y), and it may further be moved to the right or left after the movement 3904. In other examples, the drone 1b may be moved backward (-Y) without increasing the altitude, or may be moved to the right or left. FIG. 41 illustrates all of these examples.
- FIG. 42 shows an example of an evacuation action of the drone 1e located directly above. The evacuation action of the drone 1e may first be to increase its altitude with an upward movement 3905. Alternatively, if its altitude is high enough from the start, it may move backward (-Y) with a movement 3906 without rising. The evacuation action may also be to move to the left or right, away from the athlete P. FIG. 42 illustrates all of these examples.
- The evacuation shown as a backward (-Y) movement in FIG. 40 corresponds to the movement 3802 in the case of FIG. 38A and to the movement 3804 in the case of FIG. 38B. Such an evacuation movement may be an absolute movement that stops following the athlete P, as in FIG. 38A, or a relative movement that maintains following of the athlete P, as in FIG. 38B. In the latter case, the speed in the forward/backward direction (Y direction) is reduced or set to zero, so that the drone moves backward (-Y) relative to the athlete P.
- FIG. 43 shows a control flow of the landing mode M6.
- the photography system accepts the input of a landing action in the control device 21.
- the server 30 or the drone 1 transmits an instruction or notification to the control device 21 to transition to the landing mode M6 based on the completion of the above-mentioned evacuation action.
- the control device 21 waits for the input of a landing action by the pilot U1 (in other words, the input of a landing operation instruction) according to the instruction.
- landing actions such as landing permission, hovering standby, soft landing on the spot, and propeller stop can be specified by an input operation by the pilot U1.
- In step S602, the filming system checks whether there is a landing action input. If there is a landing action input (YES), in step S603 the filming system executes the landing action corresponding to the input; for example, as shown in FIG. 36 (3), it lands the formation 10 at the landing site 64 in the landing mode M6. If there is no landing action input (NO), in step S604 the filming system makes the formation 10 wait by hovering in the post-evacuation state (FIG. 36 (2)) for a predetermined period of time. Then, in step S605, after the predetermined period has elapsed, the filming system automatically transitions the formation 10 to the landing mode M6 and lands it at the landing site 64 even without an input.
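- A minimal sketch of the landing-mode flow of FIG. 43 (steps S602 to S605), assuming hypothetical callbacks for the pilot input, hovering, and landing execution and an arbitrary timeout value, is shown below.

```python
import time
from typing import Callable, Optional

def landing_mode(get_landing_input: Callable[[], Optional[str]],
                 execute: Callable[[str], None],
                 hover: Callable[[], None],
                 timeout_s: float = 60.0, poll_s: float = 1.0) -> None:
    """Wait for a pilot landing action; hover while waiting; land automatically on timeout."""
    waited = 0.0
    while waited < timeout_s:
        action = get_landing_input()   # e.g. 'land', 'hover', 'soft_land', 'stop_props'
        if action is not None:         # step S602 YES -> step S603
            execute(action)
            return
        hover()                        # step S604: hover standby
        time.sleep(poll_s)
        waited += poll_s
    execute("land")                    # step S605: automatic landing after the timeout
```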
- FIG. 44 is an explanatory diagram showing an example of determining an evacuation target based on an abnormal state in step S502 or step S507 of the flow in Fig. 35.
- a specific example of which aircraft in the formation 10 is determined to be abnormal and which aircraft is determined to be an evacuation target is shown.
- drone 1a is assumed to be the leader aircraft and the other aircraft are assumed to be sub-aircraft.
- Example 1: First, the drone 1a, the leader aircraft, performs the abnormality determination under the specified conditions. For example, if the drone 1a determines that it is itself abnormal, it determines that the entire formation 10, including the drone 1a, is to be evacuated.
- Example 2: If, as a result of the abnormality determination, the leader drone 1a determines that another drone, for example the drone 1c, is abnormal, it determines that at least the drone 1c is to be evacuated.
- Example 3: The drone 1c, a sub-aircraft, also performs the abnormality determination under the specified conditions. For example, if the drone 1c determines that it is itself abnormal, it determines that at least it is itself to be evacuated.
- Example 4: The drone 1d, a sub-aircraft, also performs the abnormality determination under the specified conditions. For example, if the drone 1d determines that the drone 1a, the leader aircraft among the other drones, is abnormal, it determines that the entire formation 10, including the drone 1a, is to be evacuated.
- FIG. 45 shows an example of a judgment between sub-machines when deciding an evacuation target as a modified example.
- In this modified example, other sub-aircraft in the vicinity of the abnormal sub-aircraft may also be decided as evacuation targets. The photography system checks the relative distance between the abnormal drone 1c and the other drones 1 in its vicinity; for example, it checks the distance dcb between the drone 1c and the drone 1b at the rear position. If the distance dcb is less than a predetermined value, it is judged that there is a contact risk, and as a result the photography system may also decide to evacuate the drone 1b as a precaution.
- FIG. 46 shows an example of the priority regarding the formation flight position and the corresponding camera position and shooting angle of view.
- the drone 1a in the forward position La obtains an image 4601 of the contestant P with a front angle of view using the rearward facing camera Ca.
- the drone 1b in the rearward position Lb obtains an image 4602 of the contestant P with a rearward angle of view using the forward facing camera Cb.
- the drone 1c in the right position Lc obtains an image 4603 of the contestant P with a right side angle of view using the left facing camera Cc.
- the drone 1d in the left position Ld obtains an image 4604 of the contestant P with a left side angle of view using the right facing camera Cd.
- the drone 1e (FIG. 2) in the directly above position Le obtains an image of the contestant P with a bird's-eye view angle using the downward facing camera Ce.
- The photography system can set a priority for each formation flight position using the above-mentioned priority setting unit 319 (FIG. 18A) or the like.
- Priority is, in other words, importance.
- In this example, a first priority p1 is set for the image 4601 at the forward position La with a front angle of view; a second priority p2 for the image 4602 at the rear position Lb with a rear angle of view; a third priority p3 for the image 4603 at the right position Lc with a right-side angle of view; a fourth priority p4 for the image 4604 at the left position Ld with a left-side angle of view; and a fifth priority p5 for the image at the directly-above position Le with an overhead angle of view. Priority p1 is the highest.
- the priority can be freely set, taking into consideration what angle of view is important for the competition.
- the angle of view here corresponds to the orientation, position, size, etc. of the contestant P.
- a priority can be set for the formation flight position that can capture footage with a specified angle of view. Note that the priority can be set independently of the settings for the leader and sub aircraft.
- In this example, a different priority is set for each position without overlap, but this is not a limitation; the same priority may be set for multiple positions. Also, in this example a priority is set for every position, but this is not a limitation either; a priority may be set for only some of the positions.
- FIG. 47 shows an example of reassignment of formation flight positions during evacuation as an example of control using the priorities of FIG. 46; in other words, it shows a control example of deploying an alternative drone 1 at the evacuated position. This corresponds to a specific example of the control of step S311 in FIG. 26.
- Fig. 26 shows an example of control in the shooting standby mode M3, but control using priority can also be applied to the tracking shooting mode M4 in the same way.
- In FIG. 47, the five drones 1, similar to those in FIG. 2, are assigned priorities as in FIG. 46. The photography system may also set, for each priority, whether photography needs to be continued. For example, a high priority of p2 or above is set as a priority at which photography should be continued as much as possible, and a low priority of p3 or below is set as a priority at which photography does not necessarily need to be continued. In other words, if a formation flight abnormality occurs at a certain position and the aircraft at that position is to be evacuated, and the priority of that position is p1 or p2, a replacement aircraft is reassigned from another position so that photography at the high-priority position can continue.
- In state (1), it is determined that the drone 1b at the rear position Lb with priority p2 has a formation flight abnormality. The drone 1b, a sub-aircraft, is determined to be the evacuation target.
- the evacuated drone 1b then lands in landing mode M6.
- the imaging system reassigns one sub-machine selected from other normal sub-machines with priority p3 or lower to the rear position Lb.
- the candidate sub-machines with priority p3 or lower are drone 1c in the right position Lc, drone 1d in the left position Ld, and drone 1e in the directly above position Le.
- the imaging system selects drone 1e in the directly above position Le with priority p5, which has the lowest priority, and the selected drone 1e is assigned to the rear position Lb. In other words, the imaging system moves drone 1e from the directly above position Le to the rear position Lb.
- In the tracking imaging mode M4, the imaging system then causes the camera Ce of the drone 1e to face forward and capture images with a rear angle of view.
- In another example of FIG. 47, it is assumed that the leader drone 1a with priority p1 is determined to have a formation flight abnormality. In the basic control described above, the entire formation 10 would be evacuated in this case, but in this example only the drone 1a in the forward position is evacuated and a replacement drone 1 is deployed to the forward position.
- the filming system selects one normal sub-machine from the other drones 1 with priority p2 or lower, and deploys it in the forward position La.
- For example, the drone 1e with priority p5 is selected and deployed to the forward position La. This allows the camera Ce of the drone 1e to continue shooting video with a front angle of view from the forward position La; in other words, video with a front angle of view can continue to be provided to the viewer.
- When the leader aircraft is evacuated as described above and another sub-aircraft is reassigned to the evacuated position as a replacement, a role change is performed to make that sub-aircraft the new leader aircraft. Since at least one aircraft in the formation 10 must be a leader aircraft, when the only leader aircraft is evacuated, a new leader aircraft must be set in its place. In the embodiment, the multiple drones 1 in the formation 10 are assumed to have similar or equivalent performance, so there is no problem in terms of performance even if a sub-aircraft becomes the new leader aircraft. When the formation 10 has only one leader aircraft, or when an aircraft with specific performance is used as the leader aircraft, the above-described evacuation and replacement of the leader aircraft may be disallowed; when there are two or more leader aircraft in the formation 10, it may be allowed. A sketch of the priority-based reassignment follows below.
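- The priority-based reassignment described above can be pictured as follows: if the evacuated position has a high priority (p1 or p2), the drone at the occupied position with the lowest priority is selected as the replacement. The priority map, identifiers, and threshold below are assumptions for illustration.

```python
from typing import Dict, Optional

# Hypothetical priority map for FIG. 46 (lower number = higher priority)
PRIORITY = {"front": 1, "rear": 2, "right": 3, "left": 4, "above": 5}

def select_replacement(evacuated_pos: str, occupied: Dict[str, str],
                       continue_threshold: int = 2) -> Optional[str]:
    """Reassign the drone from the lowest-priority occupied position when the
    evacuated position is important enough (priority p1 or p2) to keep filming."""
    if PRIORITY[evacuated_pos] > continue_threshold:
        return None                       # low-priority position: no replacement needed
    candidates = {p: d for p, d in occupied.items() if p != evacuated_pos}
    if not candidates:
        return None
    donor = max(candidates, key=lambda p: PRIORITY[p])
    return candidates[donor]              # this drone moves to the evacuated position

# Example: the rear position (p2) is evacuated -> the directly-above drone (p5) replaces it
print(select_replacement("rear", {"front": "1a", "right": "1c", "left": "1d", "above": "1e"}))  # '1e'
```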
- After the above-mentioned evacuation action of the abnormality evacuation mode M5, the basic control is to transition to the landing mode M6 (transition s56 in FIG. 19), but this is not a limitation. As a modified example, the abnormality evacuation mode M5 may be controlled to return to the tracking photography mode M4 (transition s54 in FIG. 19).
- Figure 48 shows an example of temporary evacuation and return control in a modified example.
- In state (1) of FIG. 48, the drone 1c of the formation 10 at the right position Lc1 is determined to be a target for temporary evacuation when a formation flight abnormality occurs.
- the abnormality in this case may be a minor abnormality such as a communication failure, or a state in which the degree of abnormality is not very high.
- State (2) indicates the evacuation action. The filming system moves the drone 1c away from the athlete P and the formation 10, for example to a position Lc2 on the outer right (+X); the position Lc2 is farther from the athlete P than the position Lc1. This temporary evacuation maintains following of the athlete P (similar to FIG. 38B) and includes a movement component in the movement direction PD1 (+Y). After the movement, the drone 1c at the position Lc2 follows the athlete P and the formation 10 while maintaining a certain relative distance, and filming is stopped.
- the filming system monitors the state of drone 1c at position Lc2.
- State (3) shows a case where the drone 1c is subsequently returned to its original position Lc1 in the tracking shooting mode M4 when its condition, such as a communication problem, improves.
- the shooting system determines that drone 1c is in a state where there are no formation flight abnormalities, and moves it from position Lc2 to position Lc1.
- Position Lc1 after the return is a position where drone 1c follows contestant P.
- Drone 1c can continue photographing contestant P from position Lc1 after the return.
- In this example, the temporary evacuation position is the outer right position Lc2, but this is not a limitation; similar to the evacuation action examples described above, the temporary evacuation position can be set in various positional relationships depending on the formation flight position.
- the temporary evacuation position may be a position at a higher altitude.
- Although the above control example covers temporary evacuation of only one drone 1, temporary evacuation of multiple drones 1 is similarly possible. A sketch of this temporary-evacuation-and-return control follows below.
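- The temporary-evacuation-and-return control of FIG. 48 can be pictured as a small state machine that moves the drone to the temporary evacuation position while a minor anomaly persists and returns it to its original position once the anomaly clears; the state names and monitoring interface below are assumptions.

```python
from enum import Enum, auto

class TempEvacState(Enum):
    TRACKING = auto()        # tracking shooting mode M4 at the original position Lc1
    TEMP_EVACUATED = auto()  # temporarily evacuated to position Lc2, filming stopped

def monitor_step(state: TempEvacState, minor_anomaly: bool) -> TempEvacState:
    """One monitoring cycle of the temporary-evacuation-and-return control."""
    if state is TempEvacState.TRACKING and minor_anomaly:
        return TempEvacState.TEMP_EVACUATED   # move out to Lc2 while still following
    if state is TempEvacState.TEMP_EVACUATED and not minor_anomaly:
        return TempEvacState.TRACKING         # condition recovered: return to Lc1, resume filming
    return state
```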
- FIG. 49 shows a control example in which the retraction action and the camera zoom control are combined as a modified example.
- In FIG. 49, the drone 1c at the right position Lc1, which is a sub-aircraft, is photographing the athlete P from the right side with the camera Cc facing left. The shooting system then evacuates the drone 1c; the evacuation action of the drone 1c is, for example, a movement 4900 to the outer right (+X), which is a direction away from the athlete P.
- FIG. 49 also shows a schematic diagram 4901 of the right-side angle of view captured by the camera Cc at the position Lc1 before the evacuation, a schematic diagram 4902 of the right-side angle of view captured by the camera Cc at the position Lc2 after the evacuation when no particular control is applied, and a schematic diagram 4903 of the right-side angle of view captured by the camera Cc at the position Lc2 after the evacuation when the additional camera zoom control is applied.
- Before the evacuation, the orientation and zoom amount of the camera Cc are controlled so that the position and size of the athlete P in the camera image are optimal, as in the captured image 4901. If no particular control is applied during the evacuation, the orientation of and the distance from the camera Cc to the athlete P change, so the position and size of the athlete P in the image are no longer optimal, as in the captured image 4902.
- the distance between the camera Cc and the athlete P increases, so the athlete P in the image becomes smaller.
- the athlete P is captured in a position that is off-center in the image.
- Therefore, in this modified example, the filming system additionally applies camera zoom control during the evacuation. The filming system adjusts the orientation and zoom amount of the camera Cc so that the position and size of the athlete P captured in the image of the camera Cc remain as favorable as possible even during the evacuation movement 4900; that is, it adjusts them in accordance with the evacuation action of the drone 1c, namely the evacuation direction, distance, speed, and so on. This makes it possible to provide favorable footage in which the athlete P is captured at a favorable size near the center of the image during and after the evacuation, as in the captured image 4903.
- In this way, filming can be continued, where possible, while the drone 1 is being evacuated. Even if the same image as that from the original position cannot be obtained from the post-evacuation position, it is useful to obtain an image whose content is as close as possible. A sketch of such zoom compensation follows below.
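- The zoom compensation described above can be sketched, under a simple pinhole-camera assumption, as scaling the focal length in proportion to the camera-to-athlete distance and re-pointing the camera toward the athlete; the function names and numeric example are illustrative only.

```python
import math

def compensate_zoom(focal_length_mm: float, dist_before_m: float, dist_after_m: float) -> float:
    """Keep the athlete's apparent size roughly constant by scaling the focal length
    in proportion to the camera-to-athlete distance (pinhole-camera approximation)."""
    return focal_length_mm * (dist_after_m / dist_before_m)

def pan_toward(drone_xy, athlete_xy) -> float:
    """Yaw angle [rad] that points the camera from the drone toward the athlete (XY plane)."""
    return math.atan2(athlete_xy[1] - drone_xy[1], athlete_xy[0] - drone_xy[0])

# Example: moving from 4 m to 8 m away from the athlete doubles the required focal length
print(compensate_zoom(24.0, 4.0, 8.0))  # 48.0
```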
- the imaging system and imaging method of the first embodiment can provide a technique for ensuring safety and improving the quality of the captured image when a plurality of unmanned aerial vehicles 1 are used to follow and capture an athlete P.
- the formation flight control described above can reduce the risk of contact or collision between the unmanned aerial vehicles 1 and people such as the athlete P and surrounding spectators.
- the quality of the image captured by the unmanned aerial vehicle 1 can be improved as much as possible while ensuring safety. For example, it is possible to obtain an image with a preferable imaging angle of view, etc., so that the distance of the unmanned aerial vehicle 1 to the athlete P can be stabilized and the movements of the athlete P can be well understood.
- Components can be added, deleted, or replaced, except for essential components. Unless otherwise specified, each component may be singular or plural. Combinations of the embodiment with the modified examples are also possible.
- the program of the embodiment is a program for causing a computer (e.g., server 30, drone 1, pilot device 20, or other device) to execute processing by the imaging system of the embodiment.
- the program of the embodiment may be provided stored in a storage medium/recording medium.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Remote Sensing (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention relates to a technique and the like for ensuring safety when following and imaging athletes, with respect to aerial photography techniques using unmanned aerial vehicles. The imaging system according to the invention comprises a plurality of unmanned aerial vehicles (1) constituting a formation (10) that flies while following an athlete P who is the imaging target. Each unmanned aerial vehicle (1) images the imaging target with a camera and provides a captured image. The plurality of unmanned aerial vehicles (1) fly in formation flight positions having a prescribed positional relationship with respect to the imaging target. At least one of the plurality of unmanned aerial vehicles (1) controls the flight of the unmanned aerial vehicles such that a parameter value for at least one of a relative position, a relative distance, an absolute position, a relative speed, and an absolute speed with respect to the imaging target satisfies a prescribed condition. Any unmanned aerial vehicle (1) that does not satisfy the prescribed condition is determined to have a formation flight abnormality, and the unmanned aerial vehicle (1) having the formation flight abnormality is moved in a direction away from the imaging target.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/032286 WO2025052526A1 (fr) | 2023-09-04 | 2023-09-04 | Système d'imagerie utilisant un aéronef sans pilote, procédé d'imagerie et programme |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/032286 WO2025052526A1 (fr) | 2023-09-04 | 2023-09-04 | Système d'imagerie utilisant un aéronef sans pilote, procédé d'imagerie et programme |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025052526A1 true WO2025052526A1 (fr) | 2025-03-13 |
Family
ID=94923246
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/032286 Pending WO2025052526A1 (fr) | 2023-09-04 | 2023-09-04 | Système d'imagerie utilisant un aéronef sans pilote, procédé d'imagerie et programme |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025052526A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180213208A1 (en) * | 2017-01-25 | 2018-07-26 | Samsung Electronics Co., Ltd. | Method and apparatus for determining stereoscopic multimedia information |
| JP2019089361A (ja) * | 2017-11-10 | 2019-06-13 | 中国電力株式会社 | 無人飛行体の制御方法 |
| US20190253626A1 (en) * | 2016-10-27 | 2019-08-15 | Autel Robotics Co., Ltd. | Target tracking method and aircraft |
| WO2020032262A1 (fr) * | 2018-08-09 | 2020-02-13 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Aéronef sans pilote et système de livraison |
| JP6885485B2 (ja) * | 2016-02-03 | 2021-06-16 | ソニーグループ株式会社 | 複数カメラネットワークを利用して静止シーン及び/又は移動シーンを取り込むためのシステム及び方法 |
-
2023
- 2023-09-04 WO PCT/JP2023/032286 patent/WO2025052526A1/fr active Pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6885485B2 (ja) * | 2016-02-03 | 2021-06-16 | ソニーグループ株式会社 | 複数カメラネットワークを利用して静止シーン及び/又は移動シーンを取り込むためのシステム及び方法 |
| US20190253626A1 (en) * | 2016-10-27 | 2019-08-15 | Autel Robotics Co., Ltd. | Target tracking method and aircraft |
| US20180213208A1 (en) * | 2017-01-25 | 2018-07-26 | Samsung Electronics Co., Ltd. | Method and apparatus for determining stereoscopic multimedia information |
| JP2019089361A (ja) * | 2017-11-10 | 2019-06-13 | 中国電力株式会社 | 無人飛行体の制御方法 |
| WO2020032262A1 (fr) * | 2018-08-09 | 2020-02-13 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Aéronef sans pilote et système de livraison |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12416918B2 (en) | Unmanned aerial image capture platform | |
| US11644832B2 (en) | User interaction paradigms for a flying digital assistant | |
| US10187580B1 (en) | Action camera system for unmanned aerial vehicle | |
| US11401045B2 (en) | Camera ball turret having high bandwidth data transmission to external image processor | |
| US20180246507A1 (en) | Magic wand interface and other user interaction paradigms for a flying digital assistant | |
| US11490056B2 (en) | Drone system and method of capturing image of vehicle by drone | |
| JPWO2018088037A1 (ja) | 可動型撮像装置の制御装置、可動型撮像装置の制御方法及びプログラム | |
| US12210358B2 (en) | Autonomous orbiting method and device and uav | |
| WO2018081952A1 (fr) | Systèmes et procédés de commande de hauteur d'un objet mobile | |
| CN113795805A (zh) | 无人机的飞行控制方法和无人机 | |
| CN106094876A (zh) | 一种无人机目标锁定系统及其方法 | |
| CN109688323A (zh) | 无人机视觉跟踪系统及其控制方法 | |
| CN112334853A (zh) | 航线调整方法、地面端设备、无人机、系统和存储介质 | |
| WO2020110401A1 (fr) | Véhicule aérien sans pilote, procédé de traitement d'informations et programme | |
| CN114697554A (zh) | 无人机的拍摄方法、系统、终端设备及存储介质 | |
| CN105676862A (zh) | 一种飞行装置控制系统及控制方法 | |
| JP2019185406A (ja) | 制御プログラム、制御方法および制御装置 | |
| US20220129017A1 (en) | Flight body, information processing method, and program | |
| WO2025052526A1 (fr) | Système d'imagerie utilisant un aéronef sans pilote, procédé d'imagerie et programme | |
| JP2006270404A (ja) | 撮影制御装置、撮影制御方法および撮影制御プログラム | |
| EP3919374B1 (fr) | Procédé de capture d'image | |
| WO2018198317A1 (fr) | Système, procédé et programme de photographie aérienne de véhicule aérien sans pilote | |
| CN117716313A (zh) | 无人飞行器及其控制方法、系统和存储介质 | |
| WO2025046866A1 (fr) | Système de commande de dispositif de photographie mobile | |
| WO2024166318A1 (fr) | Système d'imagerie, procédé d'imagerie et programme d'imagerie |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23951430 Country of ref document: EP Kind code of ref document: A1 |