
WO2024069789A1 - Aerial imaging system, aerial imaging method, and aerial imaging program - Google Patents


Info

Publication number
WO2024069789A1
WO2024069789A1 (PCT application PCT/JP2022/036120)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
photographing
shooting
event
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/036120
Other languages
French (fr)
Japanese (ja)
Inventor
望 三浦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Reddotdrone Japan Co Ltd
Reddotdronejapan
Drone IPLab Inc
Original Assignee
Reddotdrone Japan Co Ltd
Reddotdronejapan
Drone IPLab Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Reddotdrone Japan Co Ltd, Reddotdronejapan, and Drone IPLab Inc
Priority claimed from PCT/JP2022/036120 (WO2024069789A1)
Priority claimed from JP2024548908A (JPWO2024069789A1)
Publication of WO2024069789A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C39/00 Aircraft not otherwise provided for
    • B64C39/02 Aircraft not otherwise provided for characterised by special use
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras

Definitions

  • the present invention relates to an aerial photography system, an aerial photography method, and an aerial photography program.
  • Patent Document 1 discloses a camera viewpoint display system that detects the aircraft's position and nose direction, as well as the pan and tilt angles of a camera device mounted on the aircraft, calculates the camera viewpoint from each of these pieces of data, and displays the viewpoint on a map on a monitor screen. With this system, an operator controls the aircraft's position and attitude, as well as the camera's shooting direction, while grasping the aircraft's position and heading from a ground station.
  • Patent document 2 discloses a technology that automatically controls the position and shooting direction so that a specific object is tracked and photographed by multiple UAVs.
  • the present invention was made in consideration of the above problems, and aims to provide an aerial photography system that reduces the labor required for photography and enables appropriate photography according to the conditions of the subject.
  • an aerial photography system comprises a moving body that flies over a target area, a camera mounted on the moving body that photographs the target area, an event detection unit that detects an event based on an image captured by the camera or an input from an external system, and a photography condition determination unit that determines photography conditions including at least one of a target photography position and a target photography direction of the moving body according to the detected event.
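The event-detection-then-condition-determination pipeline described above can be sketched as a small lookup. Everything here is illustrative: the event types, the `ShootingConditions` fields, and all numeric values are invented, not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto

class EventType(Enum):
    """Hypothetical event types detected from camera images or an external system."""
    KICKOFF = auto()
    FOUL = auto()
    GOAL = auto()

@dataclass
class ShootingConditions:
    """Target shooting conditions for one moving body (drone)."""
    target_position: tuple[float, float, float]  # x, y, altitude [m]
    target_direction: float                      # heading [deg]
    zoom: float                                  # zoom magnification

def determine_shooting_conditions(event: EventType) -> ShootingConditions:
    """Map a detected event type to shooting conditions (invented example values)."""
    table = {
        EventType.KICKOFF: ShootingConditions((0.0, 0.0, 15.0), 0.0, 1.0),
        EventType.FOUL:    ShootingConditions((10.0, 5.0, 10.0), 90.0, 3.0),
        EventType.GOAL:    ShootingConditions((45.0, 0.0, 8.0), 180.0, 2.0),
    }
    return table[event]
```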
  • the shooting condition determination unit may determine the shooting conditions according to the type of the detected event.
  • the target shooting direction may be achieved by controlling at least one of the nose direction of the moving body and the angle of the camera relative to the moving body.
  • the shooting conditions may include a target zoom amount for the camera.
  • the aerial photography system may further include a controller that accepts input of the shooting conditions by a user; the shooting condition determination unit may determine the shooting conditions based on the input via the controller if the event detection unit has not detected an event, and may determine the shooting conditions based on the event if the event detection unit detects an event.
  • the shooting condition determination unit may instead determine the shooting conditions based on an operation received via the controller even when the event detection unit detects an event.
  • the aerial photography system may include a plurality of the moving bodies, and photograph a single target area by flying the plurality of moving bodies simultaneously over the target area, and the photography condition determination unit may determine different photography conditions for each of the plurality of moving bodies.
  • the shooting condition determination unit may set, for multiple moving bodies flying simultaneously, shooting conditions for shooting the same shooting range from different target shooting positions, or shooting conditions for shooting an area including the same shooting range with different zoom amounts.
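One way to realize "the same shooting range from different target positions with different zooms" for multiple drones is to place them on a circle around the range, each facing inward with its own zoom. The placement policy and every value below are illustrative assumptions, not the patent's method.

```python
import math

def assign_multi_drone_conditions(center, n_drones, radius=12.0, base_zoom=1.0):
    """Distribute n_drones around one shooting range: each drone gets a
    different position on a circle, a heading facing the center, and a
    different zoom amount."""
    conditions = []
    for i in range(n_drones):
        angle = 2 * math.pi * i / n_drones
        position = (center[0] + radius * math.cos(angle),
                    center[1] + radius * math.sin(angle),
                    10.0)  # common target altitude [m], an assumed value
        heading = math.degrees(angle + math.pi) % 360.0  # point back at the center
        conditions.append({"position": position,
                           "heading": heading,
                           "zoom": base_zoom * (i + 1)})
    return conditions
```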
  • the shooting condition determination unit may determine the shooting conditions according to the predicted trajectory of the ball, which the event detection unit outputs as an event detection result.
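In the simplest case, ball-trajectory prediction could use a drag-free ballistic model. A real event detection unit would estimate the launch state from images and might model drag and spin, so treat this purely as a sketch.

```python
import math

def predict_landing_point(p0, v0, g=9.81):
    """Predict the ground landing point (z = 0) of a ball launched at position
    p0 = (x, y, z) with velocity v0 = (vx, vy, vz), ignoring air drag."""
    x0, y0, z0 = p0
    vx, vy, vz = v0
    # Positive root of z0 + vz*t - g*t^2/2 = 0 gives the time of impact.
    t = (vz + math.sqrt(vz * vz + 2.0 * g * z0)) / g
    return (x0 + vx * t, y0 + vy * t)
```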
  • when an event indicating that a foul has occurred in a competition held in the target area is detected, the shooting condition determination unit may determine the shooting conditions so that the shooting range covers the ball used in the competition or the vicinity of the referee's position.
  • the aerial photography system may include a plurality of moving bodies and photograph a single target area by flying them simultaneously over the target area; when an event indicating that a foul has occurred in the competition is detected, the photography condition determination unit may determine the photography conditions so that the plurality of moving bodies photograph the ball or the vicinity of the referee's position with different target photography positions, target photography directions, or zoom amounts.
  • the system may include a flight path generation unit that generates a flight path for the moving body, and the flight path generation unit may automatically generate the flight path to the target shooting position that is determined based on the event detected from the captured image.
  • the flight path generation unit may generate the flight path within a court set inside the target area, may generate the flight path to the target shooting position by connecting a plurality of preset shooting positions, and may change which shooting positions are connected depending on the detection status of the event.
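Generating a flight path by connecting preset shooting positions can be sketched as a breadth-first search over an adjacency graph of those positions. The labels match the shooting positions named later in the text, but the connectivity between them is invented for illustration.

```python
from collections import deque

# Hypothetical adjacency between preset shooting positions.
ADJACENCY = {
    "L101": ["L102", "L104", "L208", "L210"],
    "L102": ["L101", "L103", "L207"],
    "L103": ["L102", "L206"],
    "L104": ["L101", "L105", "L212"],
    "L105": ["L104", "L211"],
    "L206": ["L103", "L207"],
    "L207": ["L102", "L206", "L208"],
    "L208": ["L101", "L207"],
    "L210": ["L101"],
    "L211": ["L105", "L212"],
    "L212": ["L104", "L211"],
}

def generate_flight_path(start, goal, adjacency=ADJACENCY):
    """Return the shortest chain of preset shooting positions from start to
    goal (BFS), or None if the goal is unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```

Changing which positions may be connected (e.g., disabling on-court nodes outside the on-court flight mode) amounts to swapping in a different adjacency table.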
  • an aerial photography method includes an event detection step for detecting an event based on an image captured by a camera photographing a target area or an input from an external system, and a photography condition determination step for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera according to the detected event.
  • an aerial photography program causes a computer to execute an event detection command for detecting an event based on an image acquired by a camera photographing a target area or an input from an external system, and a photography condition determination command for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera in accordance with the detected event.
  • the computer program may be provided by being stored on various data-readable recording media, or may be provided so as to be downloadable via a network such as the Internet.
  • the present invention reduces the labor required for photography and enables appropriate photography according to the subject's circumstances.
  • FIG. 1 is an overall configuration diagram of an aerial photography system according to an embodiment of the present invention.
  • FIG. 2 is a simplified external perspective view of the drone according to the embodiment.
  • FIG. 3 is a functional configuration diagram of the drone according to the embodiment.
  • (a) is a simplified front view of the exterior of the control device of the embodiment;
  • (b) is a schematic diagram showing the direction in which the drone moves or turns in response to input from the control device.
  • A functional configuration diagram of the control device according to the embodiment.
  • A functional configuration diagram of a server according to the embodiment.
  • FIG. 7 is a schematic diagram showing an example of shooting positions of the drone set in advance in the shooting target field where the drone flies.
  • A schematic state transition diagram showing the transition of flight modes of the drone.
  • A schematic state transition diagram showing the state transitions of the drone depending on the aircraft state of the drone.
  • A schematic state transition diagram showing the state transitions of the drone according to the aircraft behavior state of the drone.
  • FIG. 11 is a schematic state transition diagram showing state transitions of a game in a stadium as an example of a field to be photographed.
  • A schematic state transition diagram showing the state transition of the offensive and defensive states in the stadium.
  • A table showing an example of the correspondence between the game state in the stadium and the shooting range, camera position, shooting direction, and zoom amount captured by the drone.
  • A schematic diagram showing possible photographing positions and the flight paths along which the photographing position can transition.
  • FIG. 15 is a flowchart of the control executed during flight of the drone.
  • FIG. 16 is a flowchart showing control of flight restrictions in the drone (details of S1002 in FIG. 15).
  • A flowchart showing flight mode switching control of the drone (details of S1010 in FIG. 15).
  • A diagram showing a first example of a screen displayed on a terminal of the aerial photography system.
  • A diagram showing a second example of a screen displayed on the terminal of the aerial photography system.
  • A diagram showing a third example of a screen displayed on the terminal of the aerial photography system.
  • A diagram showing a fourth example of a screen displayed on the terminal of the aerial photography system.
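The figure list above mentions a table relating game state to shooting range, camera position, shooting direction, and zoom amount. A minimal dictionary-based sketch of such a correspondence, with entirely invented states and values:

```python
# Hypothetical correspondence between game state and shooting parameters;
# the position labels reuse the preset shooting-position names from the text,
# but every pairing and value here is an invented example.
GAME_STATE_TABLE = {
    "before_kickoff": {"shooting_range": "center circle", "position": "L101",
                       "direction": "toward center", "zoom": 1.0},
    "in_play":        {"shooting_range": "around the ball", "position": "L207",
                       "direction": "toward ball", "zoom": 2.0},
    "foul":           {"shooting_range": "referee vicinity", "position": "L206",
                       "direction": "toward referee", "zoom": 3.0},
    "out_of_play":    {"shooting_range": "full court", "position": "L103",
                       "direction": "toward far goal", "zoom": 0.8},
}

def shooting_parameters(game_state):
    """Look up the shooting parameters for a game state, falling back to a
    wide full-court view for unknown states."""
    return GAME_STATE_TABLE.get(game_state, GAME_STATE_TABLE["out_of_play"])
```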
  • FIG. 1 is an overall configuration diagram of an aerial photography system 1 (hereinafter also referred to as "system 1") according to one embodiment of the present invention.
  • System 1 uses a drone 100 (an example of a moving body) to take aerial photographs of a competition taking place at a stadium F (FIG. 7) (an example of a target area) or an event taking place at an event venue.
  • a single system 1 may include multiple drones 100. In this case, system 1 can photograph a single stadium F by flying multiple drones simultaneously over the stadium F.
  • the system 1 mainly includes a controller 200 that allows the pilot to operate the drone 100, a server 300 that manages the flight and photography of the drone 100, an external input device 600, and an external system 700.
  • the drone 100 and the controller 200 are connected to each other via wireless communication (which may include communication via a base station 800).
  • the controller 200 and the server 300 are connected to each other via a communication network 400 such as an internet line.
  • the drone 100 acquires satellite signals from an artificial satellite 500 to determine its own position, etc.
  • the external input device 600 is a device, separate from the controller 200, capable of transmitting and receiving information to and from the system 1, and is composed of a mobile terminal such as a smartphone or tablet terminal.
  • the external input device 600 can be operated, for example, by the manager, coach, bench players, referee, or court equipment personnel of the competition taking place at the stadium F.
  • the external input device 600 has, for example, a function for receiving an emergency command to suspend filming, and the drone 100 performs an emergency evacuation based on that command.
  • the external input device 600 may also receive an input to switch the flight mode of the drone 100.
  • the external input device 600 may be equipped with a display device and may display information similar to that displayed on the display unit 201 of the controller 200.
  • the external input device 600 may acquire event information that occurs during the competition. The event information is referred to when the user of the external input device 600 makes an input to switch the flight mode of the drone 100.
  • the external system 700 may be any system configured separately from the system 1. For example, systems such as a court facility system, a match management system, and a referee support system may be applied as systems deployed in relation to the competition held at the stadium F, and systems such as a weather observation system or an earthquake observation system deployed independently of the competition may also be applied. Multiple external systems 700 may be connected to the system 1. The system 1 may receive an emergency command to stop filming or a command to switch the flight mode of the drone 100 from the various external systems 700. In addition, the various external systems 700 may acquire event information that occurs during the competition.
  • the court facility system, which is an example of the external system 700, may, for example, obtain the brightness of the captured image from the system 1 and control the illuminance adjustment or blinking of the lighting in the stadium F.
  • the court facility system may also receive a request for a lighting illuminance level from the system 1 and control the illuminance adjustment or blinking accordingly.
  • the configuration of system 1 is not limited to that shown in FIG. 1, and the drone 100, the controller 200, the server 300, and the base station 800 may each be connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line.
  • the drone 100 may communicate wirelessly with the communication network 400 directly, using a communication method such as LTE, without going through the controller 200. In that case the drone 100, the controller 200, and the base station 800 need not communicate with each other directly; it is sufficient that each can connect to the communication network 400 from its own location. This configuration is therefore suitable for cases where the drone 100 and the controller 200 are in remote locations (for example, when a pilot operates the drone remotely).
  • the drone 100, the controller 200, the base station 800, and the server 300 are each connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line, and the drone 100 and the base station 800 may be communicatively connected to the communication network 400 by satellite communication via an artificial satellite 500.
  • multiple servers 300 may be connected to one drone 100 via multiple communication networks 400, i.e., the system may be made redundant.
  • in the configurations above, the drone 100 can be controlled even when the controller 200 is in a remote location, making the system suitable for remote operation; however, the system is not limited to this, and can also be applied to visual flight in which the pilot manually controls the drone 100 while watching it.
  • the device described in the above embodiment may be realized as a single device, or may be realized by multiple devices (e.g., drone 100, controller 200, cloud server 300) that are partially or completely connected by a communication network 400.
  • each functional unit and memory unit of server 300 may be realized by being implemented in different servers 300, drones 100, and controllers 200 that are connected to each other by the communication network 400.
  • Fig. 2 is a simplified external perspective view of the drone 100 of this embodiment.
  • Fig. 3 is a functional configuration diagram of the drone 100 of this embodiment. As described above, the drone 100 takes aerial photographs of competitions held in the stadium F (Fig. 7) and events held in the event venue.
  • drone refers to any flying object that has the ability to autonomously control its attitude, regardless of the power source (electricity, prime mover, etc.), control method (wireless or wired, and fully autonomous or partially manual, etc.), and whether manned or unmanned.
  • Drones are also sometimes referred to as UAVs (Unmanned Aerial Vehicles), flying objects, multicopters, RPAS (Remotely Piloted Aircraft Systems), or UAS (Unmanned Aircraft Systems).
  • the exterior of the drone 100 is mainly composed of a housing 101 and multiple propellers 122.
  • the housing 101 is, for example, a roughly rectangular parallelepiped, but may have any shape.
  • Rod-shaped connecting parts 102 extending laterally are connected to the left and right sides of the housing 101.
  • the other ends of the connecting parts 102 are respectively connected to propellers 122 and motors 121 that rotate the propellers 122.
  • the motors 121 are, for example, electric motors.
  • the propellers 122 may be composed of a single propeller, or may be composed of multiple propellers arranged coaxially.
  • the number and shape of the blades of each propeller are not particularly limited.
  • a propeller guard (not shown) may be provided on the outside of the propeller 122 to prevent the propeller from interfering with obstacles.
  • a photographing camera 141 is held by a camera holder 142 below the housing 101.
  • an obstacle detection camera 131 is disposed on the front surface of the housing 101.
  • the obstacle detection camera 131 is a so-called dual camera consisting of two cameras that form a pair.
  • the obstacle detection camera 131 is disposed so as to capture an image in front of the drone 100.
  • the obstacle detection camera 131 may be disposed not only on the front surface but also on all surfaces of the housing 101, for example, on six surfaces in the case of a housing 101 that is a substantially rectangular parallelepiped.
  • the drone 100 is equipped with an alarm device 250 that alerts people around the drone 100 to the presence of the drone 100.
  • the alarm device 250 has, for example, a warning light 251 and a speaker 252.
  • the warning light 251 is provided for each propeller 122 or motor 121, and is disposed, for example, on each side of multiple motors 121.
  • the warning light 251 may be disposed along the cylindrical side of the motor 121 so that it can be seen from all directions in addition to the front.
  • the speaker 252 outputs an alarm sound and is provided in the housing 101 of the drone 100.
  • the speaker 252 is provided, for example, on the underside of the housing 101, and transmits the alarm sound downwards of the drone 100.
  • the drone 100 is equipped with an arithmetic device such as a CPU (Central Processing Unit) for executing information processing, and storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and thereby has the following functional blocks: a measurement unit 110, a flight function unit 120, an obstacle detection unit 130, an imaging unit 140, and a communication unit 150.
  • the measurement unit 110 is a functional unit that measures information related to the drone 100 or its surroundings.
  • the measurement unit 110 has, for example, a position measurement unit 111, a direction measurement unit 112, an altitude measurement unit 113, and a speed measurement unit 114.
  • the measurement unit 110 may also include various sensors that acquire information such as temperature, air pressure, wind speed, and acceleration.
  • the position measurement unit 111 receives signals from the artificial satellites 500 and measures the position (absolute position) of the aircraft based on the signals.
  • the position measurement unit 111 measures its current position using, for example, GNSS (Global Navigation Satellite System), GPS (Global Positioning System), etc., but is not limited to this.
  • RTK-GNSS (Real Time Kinematic GNSS), which uses correction information from a fixed reference station, may also be used for higher-accuracy positioning.
  • the position information includes at least two-dimensional coordinate information in a planar view (e.g., latitude, longitude), and preferably includes three-dimensional coordinate information including altitude information.
  • the base station 800, which provides reference-point information for fixed stations used in relative positioning such as RTK, is connected to the drone 100 and the controller 200 so as to be able to communicate wirelessly with them, making it possible to measure the position of the drone 100 with greater accuracy.
  • the base station 800 can be omitted, or the accuracy of the position coordinate estimation of the base station 800 or the drone 100 can be further improved.
  • the direction measurement unit 112 measures the direction of the aircraft (nose direction, heading direction).
  • the direction measurement unit 112 is composed of, for example, a geomagnetic sensor, compass, or the like that measures the nose direction (heading direction) of the drone 100 by measuring geomagnetism.
  • the altitude measurement unit 113 measures the altitude above the ground (hereinafter also referred to as "flight altitude") as the distance from the ground below the drone 100 (vertically downward).
  • the speed measurement unit 114 detects the flight speed of the drone 100.
  • the speed measurement unit 114 may measure the speed using a known inertial sensor, such as an accelerometer or gyro sensor.
  • Flight function unit 120 is a mechanism and function unit that causes the drone 100 to fly, and generates thrust in the airframe for lifting the drone 100 and moving it in a desired direction. As shown in Figures 2 and 3, the flight function unit 120 has a plurality of motors 121, a plurality of propellers 122, and a flight control unit 123.
  • the flight control unit 123 independently controls the multiple motors 121 to rotate each propeller 122, causing the drone 100 to perform operations such as takeoff, forward movement, turning, and landing; it controls the attitude angle and flight operations of the drone 100 from takeoff, through flight, until landing.
  • the flight control unit 123 has a processing unit, also called a flight controller.
  • the processing unit can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU), MPU, or DSP).
  • the processing unit has access to a memory (storage unit).
  • the memory stores logic, code, and/or program instructions that the processing unit can execute to perform one or more steps.
  • the memory may include, for example, a separable medium such as an SD card or RAM, or an external storage device.
  • Various data acquired by the measurement unit 110, as well as video or still image data captured by the imaging camera 141, may be transmitted directly to and stored in the memory. This data may also be recorded in an external memory.
  • the processing unit includes a control module configured to control the state of the drone 100.
  • the control module controls the flight function unit 120 (thrust generating unit) of the drone 100 to adjust the spatial arrangement, attitude angle, angular velocity, angular acceleration, angular jerk, and/or acceleration of the drone 100, which has six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).
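The six-degree-of-freedom control described above is far beyond a short sketch, but the translational part of a position loop can be illustrated with a saturated proportional controller. The gain, time step, and speed limit below are arbitrary assumed values.

```python
import math

def position_control_step(pos, target, dt=0.1, kp=0.8, v_max=5.0):
    """One control step moving pos toward target with a proportional law,
    clamping the commanded speed to v_max. A toy 3-DOF translational sketch
    of what a flight controller's position loop does."""
    error = [t - p for p, t in zip(pos, target)]
    dist = math.sqrt(sum(e * e for e in error))
    if dist < 1e-9:
        return tuple(target)
    speed = min(kp * dist, v_max)   # proportional command, saturated
    step = min(speed * dt, dist)    # never overshoot the target in one step
    return tuple(p + e / dist * step for p, e in zip(pos, error))
```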
  • the flight control unit 123 can control the flight of the drone 100 based on control signals from the controller 200 or based on a preset autonomous flight program.
  • the flight control unit 123 can also control the flight of the drone 100 by controlling the motor 121 based on various information such as the field to be photographed, flight permitted/prohibited areas, information on the corresponding flight geofences, map information including two-dimensional or three-dimensional map data, the current position information of the drone 100, attitude information (heading information), speed information, and acceleration information, and any combination of these.
  • the "shooting target field" or "target area" refers to a two-dimensional location (for example, the stadium F) that is the subject of shooting.
  • FIG. 7 is a schematic diagram showing an example of the stadium F, which is an example of a field to be photographed by the drone, viewed from above.
  • the stadium F is composed of a court F100, which is roughly rectangular and defined by, for example, straight outer edges, and an outer court area F200, a predetermined area surrounding the outer edge of the court F100.
  • the outer edge of the court F100 is composed of mutually opposing goal lines F110a, F110b and mutually opposing touch lines F111a, F111b that are connected at roughly right angles.
  • the connection points of the goal lines F110a, F110b and the touch lines F111a, F111b are the corners F112a, F113a, F112b, F113b.
  • Goals F120a, F120b are provided approximately in the center of the pair of goal lines F110a, F110b.
  • Penalty areas F130a, F130b are defined in specific areas inside the court F100 adjacent to the goals F120a, F120b, and penalty lines F140a, F140b are drawn on the outer edges of the penalty areas.
  • a halfway line F150 is drawn in the center of the court F100, connecting the midpoints of a pair of touchlines F111a, F111b and dividing the court F100 into approximately equal parts.
  • the halfway line F150 is approximately parallel to the goal lines F110a, F110b.
  • goal lines F110a, F110b, touchlines F111a, F111b, penalty lines F140a, F140b, and halfway line F150 are required by the rules for players to play the game, and therefore all of these lines are generally drawn in a manner that allows them to be seen, but the technical scope of the present invention is not limited to this.
  • a soccer stadium is used as an example, but the sports that are photographed by the system of the present invention are not limited to soccer, and include any type of sports, such as tennis.
  • the subject of the photography is not limited to sports, and the system can also be applied to other events (concerts, ceremonies, etc.).
  • the shooting positions L101-L105, L206-L215 may be two-dimensional coordinates on a plane, or may be three-dimensional coordinate information that also defines the height at the corresponding positions.
  • the flight height of the drone 100 may be manually controllable based on input from the controller 200.
  • Photographing positions L101 to L105 are defined, for example, on the touchline F111b, at approximately equal intervals along the touchline F111b.
  • photographing position L101 is a point located in a range including the intersection of the halfway line F150 and the touchline F111b and slightly outside the court F100.
  • Photographing positions L103 and L105 are points near the corners F112a or F112b on both sides of the touchline F111b.
  • Photographing positions L102 and L104 are points between photographing positions L103 and L105 and photographing position L101. Note that the above positions are merely examples; any appropriate positions may be used.
  • Photographing positions L206 to L215 are points defined within the court F100.
  • photographing positions L206 and L211 are points near the center of the penalty lines F140a and F140b on a line parallel to the goal lines F110a and F110b, and are so-called goal-front photographing positions.
  • Photographing positions L207 and L212 are photographing positions closer to the touchline F111a or F111b and closer to the halfway line F150 than photographing positions L206 and L211. More specifically, for example, photographing positions L207 and L212 are points on an imaginary line segment connecting photographing position L101 and goals F120a and F120b, for example, points approximately in the center of the imaginary line segment.
  • Photographing positions L209 and L215 are points that are linearly symmetrical to photographing positions L207 and L212.
  • Shooting position L208 is a point between shooting position L207 and the halfway line F150, shooting position L210 is a point between shooting position L209 and the halfway line F150, shooting position L213 is a point between shooting position L212 and the halfway line F150, and shooting position L214 is a point between shooting position L215 and the halfway line F150.
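The text notes that some positions (e.g., L209 and L215) are symmetric counterparts of others (L207 and L212). Assuming a court coordinate frame with x running from one goal line (x = 0) to the other over a 105 m pitch (an assumed standard length; the patent does not fix dimensions), mirrored positions could be derived as:

```python
def mirror_across_halfway_line(position, court_length=105.0):
    """Reflect a shooting position across the halfway line to obtain its
    symmetric counterpart on the other half of the court. Assumes court
    coordinates with x from one goal line (0) to the other (court_length)."""
    x, y = position
    return (court_length - x, y)
```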
  • an evacuation point H200 is set to which the drone 100 is to be evacuated if an abnormality or malfunction of the drone 100 or the system 1 is detected.
  • the abnormality referred to here is an abnormality related to the stability of the aerial movement of the drone 100.
  • the abnormality includes, for example, a case where the calculation load associated with the operation control (behavior control, shooting control, etc.) of the drone 100 exceeds a load threshold.
  • the abnormality may include a transient abnormality related to the environment, such as a case where the measured value of the behavior control value (e.g. speed) of the drone 100 exceeds an allowable value due to the influence of a strong wind or the like.
  • the evacuation point H200 is set at a point different from the shooting positions L101 to L105 and L206 to L215; in this embodiment, it is set outside and along the touchline F111a. There may be multiple evacuation points H200, and in this embodiment there are three (H210, H220, H230).
  • the evacuation point H220 is set near the extension of the halfway line F150.
  • the evacuation points H210 and H230 are set closer to the goals F120a and F120b than the shooting positions L206 and L211.
  • the evacuation points H210 and H230 are set at the ends of an area partitioned by a geofence G200, which will be described later, for example.
  • At the evacuation point H200, the drone 100 is replaced or the battery installed in the drone 100 is replaced.
  • a geofence indicates a virtual boundary line that divides an area, and in particular, the geofence in this embodiment indicates a fence that is the boundary line between a flight-permitted area, where the drone 100 is permitted to fly or move, and a no-fly area.
  • a geofence is a boundary line that divides an area that extends three-dimensionally, including in plane and height. If a moving object such as the drone 100 comes into contact with a geofence, flight or movement is restricted to prevent the aircraft from flying outside the flight-permitted area.
  • the boundary line of the geofence in the height direction may include an upper limit and a lower limit.
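A geofence as described above, with planar bounds plus upper and lower altitude limits, amounts to a three-dimensional containment check. The sketch below assumes a simple axis-aligned box; the actual boundary shape and values are not specified in the embodiment.

```python
# Sketch (hypothetical box-shaped geofence): a 3D flight-permitted volume with
# planar bounds and upper/lower altitude limits; a point outside the volume is
# in the no-fly area.

from dataclasses import dataclass

@dataclass
class Geofence:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    alt_lower: float
    alt_upper: float

    def contains(self, x, y, alt):
        """True if (x, y, alt) lies inside the flight-permitted volume."""
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.alt_lower <= alt <= self.alt_upper)

g = Geofence(-52.5, 52.5, -34.0, 34.0, alt_lower=20.0, alt_upper=50.0)
print(g.contains(0.0, 0.0, 30.0))  # True: inside the permitted volume
print(g.contains(0.0, 0.0, 10.0))  # False: below the lower altitude limit
```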
  • the geofences G100, G200 that are applied to whether or not flight is permitted are switched according to the control of the system 1 while the drone 100 is flying.
  • the number of geofences G100, G200 depicted in the figure is two, but the number is arbitrary, and specifically may be three or more.
  • the geofence G100 is an area that includes the shooting positions L101 to L105, and defines an area that includes the touchline F111b and the area nearby. In other words, the geofence G100 is defined near the outer edge of the court F100, and a portion of it extends into the outer court area F200.
  • the geofence G100 is a geofence that is primarily applied in the outer edge flight mode M102, which will be described later.
  • the geofence G200 is an area that includes the shooting positions L206 to L215, and is set at least inside the court F100. This geofence G200 is a geofence that is primarily applied in the on-court flight mode M105, which will be described later.
  • the areas defined by the multiple geofences G100, G200 at least partially contact or overlap each other.
  • the areas defined by the multiple geofences G100, G200 also overlap in the height direction.
  • the heights of the multiple geofences G100, G200 may differ from each other. Specifically, the lower limit of the altitude of the geofence G200 set inside the stadium F is set higher than the lower limit of the altitude of the geofence G100 set on the outer edge of the stadium F.
  • the obstacle detection unit 130 is a functional unit that detects obstacles around the drone 100.
  • the obstacles may include, for example, people such as players, objects, animals such as birds, fixed equipment, and the ball.
  • the obstacle detection unit 130 measures the position, speed vector, and the like of an obstacle located, for example, below the drone 100 based on the acquired image.
  • the obstacle detection unit 130 includes, for example, an obstacle detection camera 131, a ToF (Time of Flight) sensor 132, and a laser sensor 133.
  • the ToF sensor 132 measures the time it takes for a laser pulse emitted from the sensor to return to the light receiving element in the sensor, and measures the distance to an object by converting this time into distance.
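The time-to-distance conversion performed by the ToF sensor 132 follows directly from the description: the pulse travels to the object and back, so the distance is the speed of light times half the round-trip time. The function name below is illustrative.

```python
# The ToF conversion described above: distance = c * (round-trip time) / 2,
# since the measured time covers the path to the object and back.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Distance in meters for a measured round-trip time in seconds."""
    return C * round_trip_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
print(round(tof_distance(20e-9), 3))  # 2.998
```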
  • the laser sensor 133 uses, for example, the LiDAR (Light Detection And Ranging) method to shine light such as near-infrared light, visible light, or ultraviolet light on the target object and measure the distance by capturing the reflected light with an optical sensor.
  • FIG. 2 shows the obstacle detection camera 131 positioned facing forward, but the type, position and number of the camera 131, ToF sensor 132 and laser sensor 133 are arbitrary. The ToF sensor 132 or laser sensor 133 may be positioned instead of the camera 131, or the ToF sensor 132 or laser sensor 133 may be provided on all six surfaces of the housing 101, i.e., the front, back, top, bottom and both sides.
  • the photographing unit 140 is a functional unit that photographs images of a competition in the stadium F (FIG. 7) or an event at an event venue, and includes a photographing camera 141, a camera holding unit 142, and a photographing control unit 143. As shown in FIG. 2, the photographing camera 141 (imaging device) is disposed at the bottom of the main body of the drone 100, and outputs image data related to a peripheral image photographed around the drone 100.
  • the photographing camera 141 is a video camera (color camera) that photographs moving images.
  • the moving images may include audio data acquired by a microphone (not shown). In addition to or instead of this, the photographing camera 141 may also be configured to photograph still images.
  • the orientation of the photographic camera 141 (the attitude of the photographic camera 141 relative to the housing 101 of the drone 100) can be adjusted by a camera actuator (not shown) built into the camera holding unit 142.
  • the photographic camera 141 may have an automatic control function for parameters such as exposure, contrast, or ISO.
  • the camera holding unit 142 may have a so-called gimbal control mechanism that suppresses the transmission of shaking or vibration of the aircraft to the photographic camera 141.
  • the photographic control unit 143 controls the photographic camera 141 and the camera holding unit 142 to adjust the orientation of the photographic camera 141, the photographic magnification (zoom amount), the camera's photographic conditions, etc.
  • Image data acquired by the photographic camera 141 can be transmitted to the memory unit of the drone 100 itself, the controller 200, the server 300, etc.
  • the communication unit 150 is capable of radio wave communication via the communication network 400 and includes, for example, a radio wave communication module.
  • the communication unit 150 is capable of communication with the controller 200 and the like via the communication network 400 (including the wireless base station 800).
  • FIG. 4 is a front view of the appearance of the controller 200 of this embodiment in a simplified manner.
  • FIG. 5 is a functional configuration diagram of the controller 200 of this embodiment.
  • the controller 200 is a mobile information terminal that controls the drone 100 by the operation of the pilot and displays information received from the drone 100 (e.g., position, altitude, remaining battery level, camera image, etc.).
  • the flight state (altitude, attitude, etc.) of the drone 100 may be remotely controlled by the pilot via the controller 200, or may be autonomously controlled by the drone 100.
  • the drone 100 performs autonomous flight.
  • manual operation may be possible during basic operations such as takeoff and return, and in an emergency.
  • the controller 200 includes a display unit 201 and an input unit 202 as a hardware configuration.
  • the display unit 201 and the input unit 202 are connected to each other so that they can communicate with each other wired or wirelessly.
  • the display unit 201 may be configured as a touch panel or liquid crystal monitor that is integrated into the controller 200, or may be configured as a display device such as a liquid crystal monitor, tablet terminal, or smartphone that is connected to the controller 200 wired or wirelessly.
  • the display unit 201 as a hardware configuration may be configured as a touch panel display by integrally incorporating an element that accepts input such as touch.
  • the input unit 202 is a mechanism through which the pilot inputs operational commands such as flight direction and takeoff/landing when piloting the drone 100. As shown in FIG. 4A, the input unit 202 has a left slider 326L, a right slider 326R, a left input stick 327L, a right input stick 327R, a power button 328, and a return button 329.
  • the left slider 326L and the right slider 326R are operators that accept, for example, a 0/1 input or an input of one-dimensional stepless or stepwise information; the pilot operates them by sliding the left and right index fingers, for example, while holding the controller 200 in hand.
  • the left input stick 327L and the right input stick 327R are operators that accept an input of multi-dimensional stepless or stepwise information, and are, for example, so-called joysticks.
  • the left input stick 327L and the right input stick 327R may also accept an input of 0/1 by pressing them.
  • the power button 328 and the return button 329 are operators that accept pressing them, and are constituted by mechanical switches or the like.
  • the left input stick 327L and the right input stick 327R accept input operations that instruct the three-dimensional flight operations of the drone 100, including, for example, takeoff, landing, ascent, descent, right turn, left turn, forward movement, backward movement, left movement, and right movement.
  • Figure 4(b) is a schematic diagram showing the movement direction or rotation direction of the drone 100 corresponding to each input of the left input stick 327L and right input stick 327R shown in Figure 4(a). Note that this correspondence is an example.
  • the controller 200 includes a processor such as a CPU for executing information processing, and storage devices such as RAM and ROM, which constitute the software configuration of the main functional blocks of the display control unit 210, input control unit 220, and communication unit 240.
  • the display control unit 210 displays to the pilot the drone 100 or the status information of the drone 100 acquired from the server 300.
  • the display control unit 210 can display images relating to various information such as the shooting target field, flight permitted/prohibited areas, flight geofence, map information, current position information of the drone 100, attitude information (directional information), speed information, acceleration information, and remaining battery power.
  • the "current position information” referred to here is sufficient to include information on the horizontal position of the current position of the drone 100 (i.e., latitude and longitude), and does not need to include altitude information (absolute altitude or relative altitude).
  • the display control unit 210 has a mode display unit 211 and a shooting status display unit 212.
  • the mode display unit 211 is a functional unit that displays at least the state, i.e., the mode, to which the drone 100 belongs on the display unit 201.
  • the mode to which the drone 100 belongs is, for example, the flight mode shown in FIG. 8, but instead of or in addition to this, the aircraft state shown in FIG. 9, the aircraft action state shown in FIG. 10, the game state shown in FIG. 11, or the offensive and defensive states shown in FIG. 12 may be displayed on the display unit 201.
  • the screen G1 displayed on the display unit 201 displays, for example, a display field G21 for the flight mode to which the drone 100 belongs, as well as a status display field G22 showing the aircraft status, aircraft behavior status, match status, and offensive/defensive status.
  • the shooting status display unit 212 is a functional unit that displays, on the display unit 201, an image captured by the imaging camera 141 mounted on the drone 100.
  • the screen G1 displayed on the display unit 201 displays, for example, an image field G40 in which an image being captured by the drone 100 is displayed.
  • the screen G1 and each state will be described in detail later.
  • the input control unit 220 shown in FIG. 5 receives various inputs from a user such as a pilot.
  • the input control unit 220 of this embodiment mainly has the following functional units: an aircraft position operation unit 221, an aircraft attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight mode switching unit 225, a target position receiving unit 226, a power supply input unit 227, and a return input unit 228.
  • the aircraft position operation unit 221 includes an up/down movement input unit 221a and a left/right movement input unit 221b.
  • the aircraft attitude operation unit 222 includes a forward/backward movement input unit 222a and a yaw rotation input unit 222b.
  • the up-down movement input unit 221a is an input unit for allowing the pilot to move the drone 100 up and down, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved upward (toward the back when held in the hand), the drone 100 rises, and when the right input stick 327R is moved downward (toward the front when held in the hand), the drone 100 descends.
  • the left-right movement input unit 221b is an input unit for allowing the pilot to move the drone 100 left and right, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved to the right, the drone 100 moves to the right, and when the right input stick 327R is moved to the left, the drone 100 moves to the left.
  • the forward/backward movement input unit 222a is an input unit for allowing the pilot to move the drone 100 forward/backward, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved upward (toward the rear when held in the hand), the drone 100 moves forward, and when the left input stick 327L is moved downward (toward the front when held in the hand), the drone 100 moves backward.
  • the yaw rotation input unit 222b is an input unit for allowing the pilot to yaw rotate the drone 100, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved to the right, the drone 100 turns right, and when the left input stick 327L is moved to the left, the drone 100 turns left.
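The stick-to-motion correspondence described above (right stick for ascent/descent and left/right movement, left stick for forward/backward movement and yaw rotation, per FIG. 4(b)) can be sketched as a simple mapping. Axis ranges and function names are assumptions for illustration.

```python
# Sketch (assumed axis convention): translating stick deflections into drone
# motion commands. Each axis is a normalized deflection in [-1.0, 1.0];
# positive y is "up" on the stick, positive x is "right".

def stick_to_commands(left_x, left_y, right_x, right_y):
    """Map controller stick deflections to named command magnitudes."""
    return {
        "ascend": right_y,      # right stick up -> rise, down -> descend
        "move_right": right_x,  # right stick right/left -> lateral movement
        "forward": left_y,      # left stick up -> forward, down -> backward
        "yaw_right": left_x,    # left stick right/left -> right/left turn
    }

cmd = stick_to_commands(left_x=0.0, left_y=1.0, right_x=0.0, right_y=0.0)
print(cmd["forward"])  # 1.0: full forward
```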
  • the camera attitude operation unit 223 is an input unit for operating the camera holding unit 142 via the shooting control unit 143 and for controlling the orientation of the shooting camera 141 relative to the housing 101 of the drone 100.
  • the camera attitude operation unit 223 obtains input to the right slider 326R.
  • the camera attitude operation unit 223 accepts operation of either or both of the pitch angle and yaw angle of the shooting camera 141 relative to the housing 101.
  • the camera zoom operation unit 224 is an input unit for operating the shooting magnification of the shooting camera 141, i.e., the zoom amount, and obtains input to the left slider 326L.
  • the flight mode switching unit 225 is an input unit for switching flight modes. Flight modes selectable by the flight mode switching unit 225 include at least, for example, the outer edge flight mode M102 (see FIG. 8), the inside court flight mode M105 (see FIG. 8), and the fixed position flight mode M103 or M107 (see FIG. 8).
  • the flight mode switching unit 225 accepts switching of flight modes via, for example, a touch panel display integrated with the display unit 201.
  • the target position receiving unit 226 is a functional unit that receives input of a target shooting position to which the drone 100 should head.
  • the target position receiving unit 226 receives input of a point on the stadium F. For example, when at least a portion of an image or schematic diagram of the stadium F is displayed on the display unit 201, the target position receiving unit 226 may receive input of the target shooting position via a touch panel display that is configured integrally with the display unit 201.
  • the target position receiving unit 226 may receive a selection input of a target shooting position when a point that can be selected as the target shooting position, i.e., a shooting position, is specified in advance.
  • the flight modes of the drone 100 mainly include an advance preparation mode M100, an off-court takeoff and landing mode M101, an outer edge flight mode M102, an off-court fixed position flight mode M103, an on-court entry mode M104, an on-court flight mode M105, an off-court exit mode M106, an on-court fixed position flight mode M107, and an on-court takeoff and landing mode M108.
  • the advance preparation mode M100 is a mode in which advance settings such as geofence settings are made.
  • the advance preparation mode M100 transitions to an off-court takeoff and landing mode M101.
  • In this off-court takeoff and landing mode M101, the drone 100 takes off from point L101g (see FIG. 14). Note that in the off-court takeoff and landing mode M101, the drone 100 may take off from a point outside the court F100 other than point L101g.
  • the off-court takeoff and landing mode M101 is the mode to which the drone 100 belongs when control starts or ends.
  • the drone 100 transitions from the off-court takeoff and landing mode M101 to the outer edge flight mode M102.
  • the outer edge flight mode M102 is a mode in which the drone flies above the outer edge along part or all of the outer edge of the court F100 to photograph the playing field F, and more specifically, flies at one of the photographing positions L101 to L105 (Fig. 14) to photograph.
  • the outer edge flight mode M102 is a mode in which the drone flies above the touchline F111b.
  • the "outer edge" on which the outer edge flight mode M102 flies is a concept that includes not only directly above the touchline F111b but also slightly outside the court F100.
  • the drone 100 receives user instructions via the target position receiving unit 226 of the controller 200 and flies at one of the specified shooting positions L101 to L105.
  • the shooting direction may be manually controlled according to the user's instructions, or may be fixed at a specified angle.
  • the drone 100 may change its shooting position while keeping the shooting direction fixed, a so-called dolly shooting technique, in which the drone 100 follows and shoots a specific player.
  • the outer edge flight mode M102 can transition to the off-court takeoff and landing mode M101, the off-court fixed position flight mode M103, or the on-court entry mode M104.
  • the off-court fixed position flight mode M103 is a mode in which the drone 100 flies in a fixed position outside the area of the court F100.
  • the off-court fixed position flight mode M103 can transition to the outer edge flight mode M102.
  • the on-court entry mode M104 is a mode in which the drone 100 performs a series of processes required for entering the area of the court F100. The drone 100 transitions to the on-court flight mode M105 via the on-court entry mode M104.
  • The on-court flight mode M105 is a mode in which the drone flies above the court F100 to photograph the stadium F, and more specifically, flies at one of the photographing positions L206 to L215 (Figure 7) to photograph.
  • the drone accepts a user command to select a photographing position via the target position receiving unit 226 of the controller 200, and flies at one of the specified photographing positions L206 to L215.
  • the photographing direction may be manually controlled according to the user's instructions, or may be fixed at a predetermined angle.
  • the on-court flight mode M105 can transition to an off-court exit mode M106, an on-court fixed position flight mode M107, or an on-court takeoff and landing mode M108.
  • the off-court exit mode M106 is a mode in which the drone 100 performs a series of processes required for the drone 100 to exit the area of the court F100.
  • the drone 100 transitions from the off-court exit mode M106 to the outer edge flight mode M102. Note that the off-court exit mode M106 and the on-court entry mode M104 can transition back and forth.
  • the on-court fixed position flight mode M107 is a mode in which the drone flies in a fixed position within the area of the court F100.
  • the on-court fixed position flight mode M107 can transition to the on-court flight mode M105.
  • the on-court takeoff and landing mode M108 is a mode in which the drone takes off and lands within the area of the court F100, and is a mode to which the drone transitions mainly when a command to land on the spot is issued by manual intervention.
  • a drone that takes off in the on-court takeoff and landing mode M108 transitions to the on-court flight mode M105.
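The flight-mode transitions enumerated above form a small state machine, which can be encoded as an adjacency table. The sketch below is an illustration of the allowed transitions as described; mode names are the labels M100 to M108 from the text.

```python
# Sketch of the flight-mode transition graph described above, encoded as an
# adjacency table; transition() rejects moves the description does not allow.

TRANSITIONS = {
    "M100": {"M101"},                    # advance preparation -> off-court takeoff/landing
    "M101": {"M102"},                    # off-court takeoff/landing -> outer edge flight
    "M102": {"M101", "M103", "M104"},    # outer edge flight
    "M103": {"M102"},                    # off-court fixed position flight
    "M104": {"M105", "M106"},            # on-court entry
    "M105": {"M106", "M107", "M108"},    # on-court flight
    "M106": {"M102", "M104"},            # off-court exit
    "M107": {"M105"},                    # on-court fixed position flight
    "M108": {"M105"},                    # on-court takeoff and landing
}

def transition(current, target):
    """Return the new mode, or raise if the transition is not allowed."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current} -> {target}")
    return target

mode = "M100"
for nxt in ("M101", "M102", "M104", "M105"):  # take off, fly the outer edge, enter the court
    mode = transition(mode, nxt)
print(mode)  # M105
```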
  • the power input unit 227 is a functional unit that accepts the power on/off command for the controller 200 via the power button 328.
  • the return input unit 228 is a functional unit that accepts a command to return the drone 100 located in the stadium F ( Figure 7) to the target landing point L101g (see Figure 14) via the return button 329.
  • the input control unit 220 may be capable of receiving touch input to the display unit 201 and transmitting control commands to the drone 100 in response to the input. More specifically, for example, when the user selects appropriate information such as a map or schematic diagram displayed on the display unit 201, a route to the selected point may be automatically generated, causing the drone 100 to fly autonomously.
  • the communication unit 240 is a functional unit that transmits and receives signals between the controller 200 and an appropriate configuration included in the system 1.
  • the controller 200 has a communication function that performs wireless communication with the drone 100 using, for example, Wi-Fi in the 2.4 GHz and 5.6 to 5.8 GHz frequency bands.
  • the controller 200 also has a wireless communication function that can communicate with the server 300 via the communication network 400 using a communication standard such as LTE (Long Term Evolution).
  • the communication unit 240 transmits various input signals by a user such as a pilot to the drone 100 or the server 300.
  • the communication unit 240 also receives signals from the drone 100 or the server 300.
  • (A-1-4. Server 300) (A-1-4-1. Overview of Server 300) FIG. 6 is a functional configuration diagram of the server 300 according to the present embodiment.
  • the server 300 manages or controls the flight and photography of the drone 100.
  • the server 300 includes an input/output unit (not shown) for inputting or outputting various types of information (image output, audio output).
  • the server 300 may be a general-purpose computer such as a workstation or personal computer, or may be logically realized by cloud computing.
  • the server 300 is equipped with a calculation device such as a CPU for executing information processing, and storage devices such as RAM and ROM, and as a software configuration, it mainly configures the following functional blocks: a presetting unit 310, an event detection unit 320, a photography condition determination unit 325, a flight mode switching unit 330, an outer edge flight control unit 340, an in-court flight control unit 350, a fixed position flight control unit 360, a communication unit 370, and a memory unit 380.
  • the pre-setting unit 310 is a functional unit that performs the settings necessary for the flight of the drone 100 before the drone 100 flies over the field to be photographed.
  • the presetting unit 310 mainly includes a geofence setting unit 311 .
  • the geofence setting unit 311 is a functional unit that sets the geofence of the drone 100.
  • the geofence includes information on the planar direction and the height direction.
  • the geofence setting unit 311 sets a geofence according to the flight mode. That is, for example, the geofence setting unit 311 activates the geofence G100 (see FIG. 7) in the outer edge flight mode M102 (see FIG. 8). The geofence setting unit 311 also activates the geofence G200 (see FIG. 7) in the on-court flight mode M105 (see FIG. 8). The geofence setting unit 311 also sets a geofence different from the geofence G100 or geofence G200, a so-called third geofence, in the intermediate modes that are intermediate in the transition between the outer edge flight mode M102 and the on-court flight mode M105, that is, the off-court exit mode M106 and the on-court entry mode M104.
  • the third geofence is a geofence that covers a combined area that combines the first area defined by the geofence in the first flight mode and the second area defined by the geofence in the second flight mode.
  • the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the on-court flight mode M105 overlap. Therefore, the third geofence is a geofence that demarcates the combined area of the geofences G100 and G200.
  • the third geofence may be a geofence that covers an area that combines a first area defined by the first geofence corresponding to the first flight mode, a second area defined by the second geofence corresponding to the second flight mode, and a gap between the first area and the second area.
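The third geofence described above permits flight in the union of the first-mode area and the second-mode area, so a point is allowed if it lies inside either one. The sketch below assumes simple rectangular areas; the actual shapes of G100 and G200 are not rectangles in general.

```python
# Sketch (rectangular areas assumed): membership in a "third geofence" that
# covers the combined area of the first and second geofences.

def in_rect(p, rect):
    """True if point p = (x, y) lies inside rect = (x0, y0, x1, y1)."""
    (x, y), (x0, y0, x1, y1) = p, rect
    return x0 <= x <= x1 and y0 <= y <= y1

G100 = (-60.0, -40.0, 60.0, -30.0)  # strip along touchline F111b (hypothetical)
G200 = (-52.5, -34.0, 52.5, 34.0)   # area over the court F100 (hypothetical)

def in_third_geofence(p):
    """Combined area: inside geofence G100 or geofence G200."""
    return in_rect(p, G100) or in_rect(p, G200)

print(in_third_geofence((0.0, -38.0)))  # True: only inside G100
print(in_third_geofence((0.0, 0.0)))    # True: only inside G200
print(in_third_geofence((0.0, 50.0)))   # False: outside both areas
```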
  • the event detection unit 320 is a functional unit that detects the state of the subject to be photographed or the drone 100.
  • the event detection unit 320 detects an event based on the camera image of the photographing camera 141 or an input from the external system 700.
  • the detection criteria for each event are stored in, for example, the storage unit 380, and the event detection unit 320 detects an event by referring to the storage unit 380.
  • the event detection unit 320 may also detect an event by analysis using a neural network.
  • the detection process by the event detection unit 320 can be performed using any known appropriate image analysis technology.
  • the event detection unit 320 detects events that trigger a change in the flight mode or shooting conditions of the drone 100.
  • the event detection unit 320 mainly has an aircraft state acquisition unit 321 , an aircraft action state acquisition unit 322 , a game state acquisition unit 323 , and an offensive/defensive state acquisition unit 324 .
  • the aircraft status acquisition unit 321 is a functional unit that acquires the aircraft status of the drone 100.
  • FIG. 9 is a diagram showing the state transition of the aircraft state of the drone 100.
  • the aircraft state is broadly divided into, for example, a normal operation flight mode M200, a detection and judgment mode M210, and an action mode M220.
  • When the drone 100 starts flying, the drone 100 transitions to the normal operation flight mode M200.
  • the detection and judgment mode M210 includes an abnormality detection mode M211, a failure detection mode M212, a manual intervention mode M213, and a low battery mode M214.
  • If an abnormality is detected in the normal operation flight mode M200, the mode transitions to the abnormality detection mode M211.
  • This abnormality is a transient, in other words, reversible disturbance such as a drop in radio wave strength or strong winds. If the abnormality is resolved in abnormality detection mode M211, the mode transitions to normal operation flight mode M200.
  • If a failure is detected, the drone 100 transitions to failure detection mode M212. If a manual control command is received, the drone transitions to manual intervention mode M213, and if it is detected that the remaining battery charge is less than a predetermined value, the drone transitions to low battery mode M214. In addition, if a manual control command is received in abnormality detection mode M211, failure detection mode M212, or low battery mode M214, the drone transitions to manual intervention mode M213. The drone 100 transitions to an action mode M220 that corresponds to the detection and judgment mode M210.
  • the action mode M220 is a state in which the drone 100 performs a series of actions that are preset for each state.
  • the action mode M220 includes a landing mode M221 at an evacuation point, an emergency stop mode M222, a landing on the spot mode M223, a return mode M224, and a fixed position flight mode M225.
  • the landing at evacuation point mode M221 is set to fly the drone 100 to the evacuation point H200 and land it.
  • the landing at evacuation point mode M221 is entered when the abnormality is not resolved in the abnormality detection mode M211.
  • the emergency stop mode M222 is set to stop the propellers 122 on the spot. In the emergency stop mode M222, the drone 100 falls freely.
  • the emergency stop mode M222 can be selected in the manual intervention mode M213 when the propellers 122 are about to come into contact with a person or object.
  • the on-site landing mode M223 is set to perform a soft landing on the spot.
  • the return mode M224 is set to return to the takeoff and landing point.
  • the fixed position flight mode M225 is a state in which the drone flies at a fixed position, and can transition to the normal operation flight mode M200 based on a user operation.
  • the user operation is input, for example, by selecting a button displayed on the display unit 201.
  • In the fixed position flight mode M225, if an event that can cause a transition from the normal operation flight mode M200 to the detection and judgment mode M210, i.e., an abnormality, a malfunction, manual intervention, or a low battery, is detected, the drone 100 transitions from the fixed position flight mode M225 to each state of the detection and judgment mode M210 via the normal operation flight mode M200.
  • the drone 100 in the fixed position flight mode M225 can transition to the return mode M224 based on a user operation.
  • the drone 100 in the abnormality detection mode M211 and the failure detection mode M212 transitions to a landing mode M221 at an evacuation point.
  • the drone 100 in the manual intervention mode M213 transitions to one of the following states depending on the input command: landing mode M221 at an evacuation point, emergency stop mode M222, landing on the spot mode M223, return mode M224, and fixed position flight mode M225.
  • the drone 100 in the low battery mode M214 transitions to the return mode M224.
  • the drone 100 in the normal operation flight mode M200 can also transition to the return mode M224 based on a user operation.
  • the user operation is input, for example, by selecting a button displayed on the display unit 201.
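The detection-to-action dispatch described above can be summarized as a mapping: abnormality and failure detection lead to landing at an evacuation point, low battery leads to return, and manual intervention selects among the action modes according to the input command. The sketch below uses illustrative key names combining the mode labels from the text.

```python
# Sketch of the detection-to-action dispatch described above. Key names are
# illustrative labels built from the mode numbers in the text.

ACTION_FOR_DETECTION = {
    "M211_abnormality": "M221_land_at_evacuation_point",
    "M212_failure": "M221_land_at_evacuation_point",
    "M214_low_battery": "M224_return",
}

MANUAL_CHOICES = {
    "M221_land_at_evacuation_point", "M222_emergency_stop",
    "M223_land_on_the_spot", "M224_return", "M225_fixed_position_flight",
}

def resolve_action(detection, manual_choice=None):
    """Map a detection and judgment mode to the action mode that follows it."""
    if detection == "M213_manual_intervention":
        if manual_choice not in MANUAL_CHOICES:
            raise ValueError("manual intervention requires a valid command")
        return manual_choice  # the pilot's input command decides the action
    return ACTION_FOR_DETECTION[detection]

print(resolve_action("M212_failure"))  # M221_land_at_evacuation_point
print(resolve_action("M213_manual_intervention", "M224_return"))  # M224_return
```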
  • the aircraft behavior state acquisition unit 322 is a functional unit that acquires the aircraft behavior state of the drone 100.
  • Each mode of the aircraft behavior state is a sub-mode of the aircraft state that is performed to realize a transition of the aircraft state.
  • FIG. 10 is a diagram showing state transitions of the aircraft's behavioral states.
  • the aircraft's behavioral states are broadly divided into a takeoff mode M300, an evacuation mode M310, a normal mode M320, and a landing mode M340, for example.
  • Takeoff mode M300 is a mode in which drone 100 takes off.
  • the state transition of the aircraft's behavior state starts from takeoff mode M300.
  • the aircraft's behavior state transitions from takeoff mode M300 to evacuation mode M310 or normal mode M320.
  • Evacuation mode M310 mainly includes evacuation point arrival stationary mode M311 and evacuation moving mode M312.
  • Normal mode M320 also includes point arrival stationary mode M321 and moving mode M322.
  • Evacuation mode M310 and normal mode M320 can transition to each other via temporary suspension mode M330. This is just one example.
  • the evacuation point arrival stationary mode M311 is a mode in which the drone moves to the evacuation point H200 and remains stationary there, i.e., hovers.
  • the drone 100 in the evacuation point arrival stationary mode M311 transitions to the evacuation moving mode M312.
  • the aircraft behavior state transitions from the evacuation point arrival stationary mode M311 or the evacuation moving mode M312 to the temporary suspension mode M330.
  • Point arrival stationary mode M321 is a mode in which the drone moves to a specified destination and remains stationary on the spot, i.e., hovers.
  • when moving to another location under normal use conditions, the drone 100 in point arrival stationary mode M321 transitions to moving mode M322.
  • the aircraft behavior state transitions from point arrival stationary mode M321 or moving mode M322 to temporary suspension mode M330.
  • the drone 100 in the evacuation point arrival stationary mode M311, the point arrival stationary mode M321, the moving mode M322, and the pause mode M330 can transition to the landing mode M340.
  • the aircraft's operating state ends processing in the landing mode M340.
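As an illustration only, the behavior-state transitions described above (FIG. 10) can be sketched as a transition table. The state names and the exact transition set below are assumptions inferred from this description, not the actual implementation:

```python
# Sketch of the aircraft behavior-state machine (cf. FIG. 10).
# The allowed-transition set is reconstructed from the description above
# and is an assumption, not the system's real API.
ALLOWED = {
    "TAKEOFF_M300": {"EVACUATION_MOVING_M312", "MOVING_M322"},
    "EVACUATION_MOVING_M312": {"EVACUATION_STATIONARY_M311", "PAUSE_M330"},
    "EVACUATION_STATIONARY_M311": {"EVACUATION_MOVING_M312", "PAUSE_M330",
                                   "LANDING_M340"},
    "MOVING_M322": {"POINT_STATIONARY_M321", "PAUSE_M330", "LANDING_M340"},
    "POINT_STATIONARY_M321": {"MOVING_M322", "PAUSE_M330", "LANDING_M340"},
    "PAUSE_M330": {"EVACUATION_MOVING_M312", "MOVING_M322", "LANDING_M340"},
    "LANDING_M340": set(),  # terminal: processing ends in landing mode M340
}

def transition(state, target):
    """Return the new state if the transition is permitted, else raise."""
    if target not in ALLOWED[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target
```

For example, `transition("TAKEOFF_M300", "MOVING_M322")` returns the new state, while any transition out of landing mode M340 raises an error, reflecting that processing ends there.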
  • the game status acquisition unit 323 shown in FIG. 6 is a functional unit that acquires the game status of the competition held at the stadium F.
  • the game status acquisition unit 323 detects the game status by performing image processing on the captured image.
  • the game status acquisition unit 323 may also acquire the game status based on decision-related information input by the umpire to the external input device 600 or the umpire support system, which is an example of the external system 700.
  • the game status acquisition unit 323 may acquire the game status based on information input from the external input device 600 held by a team member, for example, the manager or coach.
  • FIG. 11 is a diagram showing an example of state transitions in a match state.
  • the match state includes a pre-match state M400, a normal play state M410, and an end-of-match state M460.
  • the state transition starts from the pre-match state M400, and transitions from the pre-match state M400 to the normal play state M410.
  • the normal play state M410 is a state in which the game is progressing. When the match ends, the state transitions from the normal play state M410 to the end-of-match state M460. Note that a transition from the normal play state M410 to the end-of-match state M460 may occur not only at the end of the match, but also during a break in the match, such as halftime.
  • the game state also includes a play suspended without foul play state M420 and a play suspended with foul play state M440.
  • a transition to the play suspended without foul play state M420 occurs, for example, when the ball crosses the goal line F110a, F110b or the touch line F111a, F111b and goes out of the court.
  • a transition to a throw-in state M421, a goal kick state M422, or a corner kick state M423 occurs in accordance with events that occur according to the rules of the game, such as the type of line the ball crossed or the affiliation of the player who kicked the ball out of the court.
  • the throw-in state M421, the goal kick state M422, or the corner kick state M423 transitions to the normal play state M410.
  • when a foul occurs or is recognized by the referee, a transition to the foul state M431 occurs.
  • when an offside occurs or is recognized by the referee, a transition to the offside state M432 occurs.
  • a transition occurs from the foul state M431 or the offside state M432 to the play suspended with foul play state M440.
  • a transition to the free kick state M441 or the penalty kick state M442 occurs depending on the location where the foul occurred and the event that occurred.
  • a so-called indirect free kick may be performed instead of a free kick.
  • the free kick state M441 may be subdivided into a free kick state for the attacking side and a free kick state for the defending side.
  • in the free kick state M441 and the penalty kick state M442, when the event in each state ends, the match is resumed and the match state transitions to the normal play state M410.
  • when the match ends, the normal play state M410 transitions to the end-of-match state M460, and the state transition for the match state ends.
  • the normal play state M410 may also transition to a penalty shootout state M443. Although not shown in the figure, the penalty shootout state M443 may transition to an end-of-match state M460, thereby terminating the state transition.
  • Some of the game states shown in FIG. 11 may trigger a change in flight mode, while other game states may not.
  • the flight mode may be changed based on a transition to the shaded states in the figure, i.e., the pre-game state M400, goal kick state M422, corner kick state M423, free kick state M441, penalty kick state M442, player substitution state M450, and end-of-game state M460.
  • the flight mode may be changed to one that corresponds to the offensive or defensive state.
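The distinction between trigger and non-trigger game states can be sketched as a simple membership check. The trigger-state set is taken from the description above; the mapping from trigger state to concrete flight mode is a hypothetical placeholder:

```python
# Sketch: only some game states trigger a flight-mode change (cf. FIG. 11).
TRIGGER_STATES = {
    "PRE_GAME_M400", "GOAL_KICK_M422", "CORNER_KICK_M423",
    "FREE_KICK_M441", "PENALTY_KICK_M442",
    "PLAYER_SUBSTITUTION_M450", "END_OF_GAME_M460",
}

def maybe_change_flight_mode(new_game_state, current_flight_mode, mode_for_state):
    """Return the flight mode to use after transitioning to new_game_state.

    mode_for_state maps trigger states to flight modes (hypothetical mapping);
    non-trigger states leave the current flight mode unchanged.
    """
    if new_game_state in TRIGGER_STATES:
        return mode_for_state.get(new_game_state, current_flight_mode)
    return current_flight_mode
```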
  • the offensive and defensive state acquisition unit 324 is a functional unit that acquires the offensive and defensive states of the teams in the match held at the stadium F.
  • the offensive and defensive state acquisition unit 324 detects the offensive and defensive states by performing image processing on the captured images.
  • the offensive and defensive state acquisition unit 324 may also acquire the offensive and defensive states based on judgment-related information input by the umpire to the external input device 600 or the umpire support system, which is an example of the external system 700.
  • the offensive and defensive state acquisition unit 324 may acquire the offensive and defensive states based on information input from the external input device 600 held by a team member, for example, the manager or coach.
  • FIG. 12 is a diagram showing an example of a state transition between offensive and defensive states.
  • the figure shows an example of an offensive and defensive state in soccer.
  • the offensive and defensive state transitions to an offensive state M510 or a defensive state M520.
  • the offensive state M510 and the defensive state M520 transition to each other via an offensive/defensive change state M530 or an offensive/defensive uncertainty state M540.
  • the offensive state M510 is a state in which one of the teams (hereinafter also referred to as "Team A") designated in advance is on the offensive.
  • An offensive state is, for example, a state in which Team A is in possession of the ball and is advancing toward the other team (hereinafter also referred to as “Team B”), but is not limited to this and may be a predetermined state determined by any determination criterion stored in advance in the memory unit 380.
  • the attack state M510 includes an A team offensive (own side) state M511, an A team offensive (enemy side) state M512, and an A team quick attack state M513.
  • a transition is possible between the A team offensive (own side) state M511 and the A team offensive (enemy side) state M512, and between the A team offensive (own side) state M511 and the A team quick attack state M513.
  • a transition is also possible from the A team quick attack state M513 to the A team offensive (enemy side) state M512.
  • the defensive state M520 includes a team A defensive (opponent's half) state M521, a team A defensive (own half) state M522, and a team B fast attack state M523.
  • the team A defensive (opponent's half) state M521 and the team A defensive (own half) state M522 can transition to each other, as can the team A defensive (opponent's half) state M521 and the team B fast attack state M523. A transition can also be made from the team B fast attack state M523 to the team A defensive (own half) state M522.
  • the offense/defense switching state M530 and the offense/defense uncertain state M540 can be transitioned to from any of the following: Team A offensive (own side) state M511, Team A offensive (opponent's side) state M512, Team A quick attack state M513, Team A defensive (opponent's side) state M521, Team A defensive (own side) state M522, and Team B quick attack state M523.
  • the offensive and defensive state acquisition unit 324 detects a transition to a fast attack state M513, M523 from an offensive/defensive change state M530 or an offensive/defensive uncertain state M540.
  • the offensive and defensive state acquisition unit 324 analyzes, for example, the acceleration of the ball or the players, the fluctuation in the ball's movement direction or the player's orientation, the number of players in a specified area, the movement direction of the players, the number of players moving in a certain direction, etc., from images captured by the image capture camera 141.
  • the offensive and defensive state acquisition unit 324 detects a fast attack state M513, M523 based on the results of this analysis.
  • the offensive and defensive state acquisition unit 324 also determines whether it is an A team fast attack state M513 or a B team fast attack state M523, depending on the movement direction of the players or the ball.
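The kind of analysis described above can be illustrated with a toy heuristic over tracked velocities. The thresholds, the axis convention, and the function itself are assumptions made for this sketch; they are not values or interfaces from the actual system:

```python
# Illustrative heuristic for detecting a fast attack state (M513/M523)
# from ball and player velocities along the field's long axis (m/s).
# Convention assumed here: positive x = team A's attacking direction.
def detect_fast_attack(ball_velocity_x, player_velocities_x,
                       speed_threshold=5.0, min_players=3):
    """Return 'A_FAST_ATTACK_M513', 'B_FAST_ATTACK_M523', or None."""
    if abs(ball_velocity_x) < speed_threshold:
        return None  # ball not moving fast enough for a fast attack
    direction = 1 if ball_velocity_x > 0 else -1
    # count players sprinting in the same direction as the ball
    supporting = sum(1 for v in player_velocities_x
                     if direction * v >= speed_threshold)
    if supporting < min_players:
        return None
    return "A_FAST_ATTACK_M513" if direction > 0 else "B_FAST_ATTACK_M523"
```

The direction of ball and player movement decides which team's fast attack state is reported, mirroring the determination described above.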
  • the offensive and defensive states are not limited to those described above, and any state that triggers a change in the shooting conditions may be specified. For example, a state that transitions upon detection of a long pass may be specified. In addition, the state may be specified appropriately depending on the content of the sport or event to be shot.
  • the event detection unit 320 may determine an event based on input information from the external system 700, instead of or in addition to the above-mentioned acquisition units 321 to 324.
  • the event detection unit 320 may determine an external disturbance such as a strong wind as an event based on input information from a weather information system, which is an example of the external system 700.
  • the event detection unit 320 may also determine an event based on input information from a court facility system, which is another example of the external system 700, or facility information entered by a person involved with the court facility.
  • the shooting condition determination unit 325 is a functional unit that determines the shooting conditions set in the shooting camera 141 of the drone 100.
  • the shooting condition determination unit 325 determines the shooting conditions according to the event detected by the event detection unit 320.
  • the shooting conditions include at least one of the target shooting position and the target shooting direction of the drone 100.
  • the target shooting direction includes, for example, information on either or both of the pitch angle with respect to the horizontal and the yaw angle with respect to a predetermined reference direction.
  • the target shooting direction may include information on the target zoom amount of the shooting camera 141. In the following description, the shooting direction will be described as including the pitch angle, the yaw angle, and the zoom amount.
  • the shooting conditions may also include information on the shooting range of the shooting camera 141. Note that the technical scope of the present invention is not limited to a configuration in which both are set; it is sufficient that at least one of the target shooting position and the target shooting direction is automatically set depending on the event.
  • the target shooting direction is achieved by controlling at least one of the nose direction of the drone 100 and the shooting direction of the shooting camera 141.
  • the nose direction of the drone 100 is controlled by the flight control unit 123 of the drone 100.
  • the shooting direction of the shooting camera 141 is controlled, for example, by the shooting control unit 143 driving the camera holding unit 142.
  • "control of the nose direction" and "control of the shooting direction" are concepts that include control not only in the left-right direction (the so-called "pan direction") but also in the up-down direction (the so-called "tilt direction").
  • the photographing condition determination unit 325 determines the photographing conditions according to the type of event detected. In the normal play state M410, the photographing condition determination unit 325 allows manual control via the controller 200. When an event is detected by the event detection unit 320, the photographing condition determination unit 325 refers to the event-photographing condition table T1 (see FIG. 13) stored in the memory unit 380 and determines the photographing conditions according to the event.
  • the event-photography condition table T1 is a table in which events detected as game states and the photography conditions selected for the events are stored in association with each other. More specifically, in the event-photography condition table T1, events are associated with the photography range, the photography position where the drone 100 is located, the photography direction of the photography camera 141, and the zoom amount of the photography camera 141.
  • for example, in the penalty kick state M442, the photography range is the penalty area F130a or F130b where the ball is located, the photography position is the photography position L206 or the photography position L211, and the photography direction is the direction of the goal F120a or F120b where the ball is located.
  • the zoom amount is, for example, predetermined in stages, such as IN, Middle, and OUT, in descending order of zoom amount, and is IN in the penalty kick state M442 or the penalty shootout state M443.
  • the shooting position is one of the shooting positions L101, L102, and L104 in the outer edge flight mode M102. This configuration reduces the risk of the ball colliding with the drone 100.
  • the entire court F100 can be shot by shooting from the shooting positions L101, L102, or L104 along the outer edge.
  • the shooting direction is set so that the shooting range is around the ball or the referee's position.
  • in the corner kick state M423, the area in front of the goal can be photographed up close by shooting from the shooting positions L207, L209, L212, or L215.
  • the ideal shooting direction required for events occurring in a match, such as free kicks, fast breaks, and corner kicks, varies even for the same shooting position.
  • it is difficult to quickly and accurately achieve shooting from the ideal shooting direction by manual operation, and there is a risk that important scenes will be missed as a result of operational errors or delays.
  • if manual operation were attempted, multiple camera operators would have to be deployed.
  • the shooting direction required for each event is determined to a certain extent, the above-mentioned configuration, which automatically controls the drone 100 to a shooting position and shooting direction in accordance with the event, makes it possible to take appropriate photos according to the match situation. It also reduces the number of camera operators, contributing to labor savings.
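Conceptually, the event-photography condition table T1 is a lookup from a detected event to a set of shooting conditions, and it may accept user corrections. The entries and field names below are hypothetical placeholders, not the contents of FIG. 13:

```python
# Sketch of an event-to-shooting-condition lookup analogous to table T1.
# All entries are illustrative assumptions.
EVENT_SHOOTING_TABLE = {
    "PENALTY_KICK_M442": {"range": "penalty_area", "position": "L206",
                          "direction": "toward_goal", "zoom": "IN"},
    "CORNER_KICK_M423": {"range": "goal_front", "position": "L207",
                         "direction": "toward_goal", "zoom": "Middle"},
    "PRE_GAME_M400": {"range": "whole_court", "position": "L101",
                      "direction": "ball_or_referee", "zoom": "OUT"},
}

def conditions_for_event(event):
    """Return the stored shooting conditions for the event, or None."""
    return EVENT_SHOOTING_TABLE.get(event)

def update_table(event, **corrections):
    """Apply a user's correction input to the table (cf. the editable table)."""
    EVENT_SHOOTING_TABLE.setdefault(event, {}).update(corrections)
```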
  • the shooting condition determination unit 325 may select the shooting position that is closest to the ball among the stored shooting positions.
  • the event-photography condition table T1 may store different photography conditions set for the multiple drones 100.
  • for example, the first drone 100 may photograph the ball location with a large zoom amount, while the second drone 100 photographs the ball location with a small zoom amount.
  • the first drone 100 and the second drone 100 may also photograph while yaw rotating in opposite directions.
  • the first drone 100 may photograph from the side, and the second drone 100 may photograph from directly above. While it is even more difficult to manually control multiple drones 100 to the appropriate photography conditions, the above-mentioned automatically controlled configuration makes it possible to quickly photograph from multiple angles using multiple drones 100.
  • events for which overhead shooting is performed may be associated with two opposite shooting directions; that is, for example, one event may be associated with both a shooting direction from team A's side of the court toward team B's side and a shooting direction from team B's side toward team A's side.
  • the offensive/defensive state acquisition unit 324 detects which team is in possession of the ball, and the shooting condition determination unit 325 determines the shooting direction depending on the team in possession of the ball. Specifically, the shooting condition determination unit 325 determines that the shooting direction should be the offensive direction of the team in possession of the ball. With this configuration, it is possible to continuously take overhead shots of the moving ball.
  • the event-photography condition table T1 may be configured to accept correction input by the user, and the changes may be stored in the storage unit 380. With this configuration, the user's photography knowledge is reflected in the event-photography condition table T1, making it possible to realize more optimal automatic photography. Furthermore, multiple types of event-photography condition tables T1 may be stored in the storage unit 380, and the user may select the table to be applied. For example, this is because the content to be photographed differs depending on whether the purpose of photography is to watch a sports game or to instruct athletes, etc. With this configuration, automatic photography suited to the purpose can be easily realized. Note that the event-photography condition table T1 shown in FIG. 13 is an example of a table set for coaching purposes, for example, but it is merely one example, and the specific photography conditions stored in the table are arbitrary.
  • both the shooting positions L101 to L215 and the shooting direction are automatically set according to the detected event, but instead, only the shooting positions L101 to L215 may be automatically set.
  • in that case, the shooting direction is determined by input via the controller 200.
  • the shooting condition determination unit 325 may be configured to determine the shooting direction according to the detected event and the selected shooting position L101 to L215.
  • the shooting condition determination unit 325 may determine the shooting conditions based on the input to the controller 200 when the event detection unit 320 has not detected an event, and may determine the shooting conditions based on the event when the event detection unit 320 has detected an event.
  • the shooting condition determination unit 325 may determine the shooting conditions based on the operation received via the controller 200 or the external system 700. That is, the shooting conditions input from the controller 200 are applied in priority over the shooting conditions associated with the event.
  • the user only needs to focus on operations that cannot be handled by automatic control, so the operational burden is reduced compared to a configuration in which everything is manually controlled, and operational errors can be reduced.
  • with this configuration, it is possible to achieve both convenience and freedom, that is, to ensure freedom of shooting according to the user's requests while maintaining the convenience of automatic shooting.
  • the photographing condition determination unit 325 may determine the photographing conditions based on the event, and if the event detection unit 320 does not detect an event, the photographing condition determination unit 325 may automatically track and photograph the ball.
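The priority order described above (manual controller input first, then event-based conditions, then ball tracking as a fallback) can be sketched as follows. The function and its argument names are illustrative, not the actual interface of the shooting condition determination unit 325:

```python
# Sketch of the priority resolution for shooting conditions.
def resolve_shooting_conditions(manual_input, detected_event, event_table,
                                ball_tracking_conditions):
    """Pick shooting conditions by the priority: manual > event > tracking."""
    if manual_input is not None:
        return manual_input  # user operation takes top priority
    if detected_event is not None and detected_event in event_table:
        return event_table[detected_event]  # event-based automatic conditions
    return ball_tracking_conditions  # no event detected: track the ball
```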
  • the shooting condition determination unit 325 determines different shooting conditions for each of the multiple drones 100.
  • the multiple drones 100 may take pictures from different shooting positions and different shooting directions. For example, one drone may take a picture of a player taking a shot, while the other may take a picture of the goalkeeper of the opposing team.
  • the shooting condition determination unit 325 may also set shooting conditions for multiple drones 100 flying simultaneously such that they each shoot the same shooting range from different target shooting positions. With this configuration, important scenes can be shot from multiple angles.
  • the shooting condition determination unit 325 may also set shooting conditions for multiple drones 100 flying simultaneously such that they each shoot an area including the same shooting range with different zoom amounts. With this configuration, it becomes possible to shoot areas that are particularly noteworthy in the stadium F under multiple shooting conditions, making it possible to more reliably shoot important scenes.
  • the shooting condition determination unit 325 may analyze the captured image, predict the shooting range to be captured based on the analysis results, and determine the shooting conditions. For example, the shooting condition determination unit 325 may predict the movement distance of the ball after a predetermined time by analyzing the movement direction and speed or acceleration of the ball from the captured image, and determine the shooting conditions such that the position of the ball after the predetermined time will be in the shooting range. Note that the speed of the ball may refer to the speed at the start of the prediction, i.e., the initial speed.
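The prediction described above can be written as per-axis kinematics, p(t) = p0 + v0·t + ½·a·t². The constant-acceleration model is an assumption of this sketch; the actual analysis performed by the system may differ:

```python
# Kinematic sketch of ball-position prediction from initial position,
# initial (start-of-prediction) velocity, and acceleration, per axis.
def predict_ball_position(p0, v0, a, t):
    """p0, v0, a: (x, y) tuples; t: seconds ahead. Returns predicted (x, y)."""
    return tuple(p + v * t + 0.5 * acc * t * t
                 for p, v, acc in zip(p0, v0, a))
```

The predicted position can then be used to choose shooting conditions so that the ball's location after the given time falls within the shooting range.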
  • the photographing condition determination unit 325 may determine the photographing conditions according to the trajectory of the ball predicted by the event detection unit 320 as a result of the detection of an event.
  • the photographing condition determination unit 325 may predict the trajectory of the ball, for example, when a fast attack state M513, M523, a long pass, or the like is detected, and determine the photographing conditions according to the predicted trajectory.
  • the photographing condition determination unit 325 may change the photographing direction to the traveling direction of the ball and may determine to zoom out the photographing magnification. By zooming out, the traveling ball can be photographed more reliably.
  • the ball trajectory prediction is not limited to a mode in which a predetermined trajectory is predicted and then detected as an event by the event detection unit 320. That is, for example, the ball trajectory prediction may be performed by the shooting condition determination unit 325 or another functional unit separately from the event detection unit 320, and the shooting conditions may be determined based only on the result of the trajectory prediction.
  • when changing the shooting direction in response to a ball trajectory prediction, it is preferable to control the shooting direction of the shooting camera 141 via the shooting control unit 143 instead of controlling the nose direction.
  • this is because changing the shooting direction based on a ball trajectory prediction requires a high response speed. That is, for example, when changing the shooting direction based on a ball trajectory prediction, the shooting direction of the shooting camera 141 is changed by the shooting control unit 143, and when changing the shooting direction not based on a trajectory prediction, the shooting direction may be changed by controlling the nose direction of the drone 100.
  • the shooting condition determination unit 325 may determine to change the shooting direction to the direction of the ball's movement and to zoom out the shooting magnification.
  • the flight mode switching unit 330 is a functional unit that switches the flight mode depending on the detection result by the event detection unit 320.
  • the flight mode switching unit 330 mainly has a mode switching input acquisition unit 331, a flight permitted area switching unit 332, a geofence switching unit 333, and a flight path generation unit 334.
  • the mode switching input acquisition unit 331 is a functional unit that acquires input information regarding switching of flight modes.
  • the flight modes are, for example, the outer edge flight mode M102, the fixed position flight mode M103 or M107, and the on-court flight mode M105 (all see FIG. 8).
  • Whether the off-court fixed position flight mode M103 or the on-court fixed position flight mode M107 is used for flight in a fixed position is determined according to the position of the drone 100 at the time the flight mode selection input is received. In other words, when the drone 100 is in the area outside the court F200, the off-court fixed position flight mode M103 is used, and when the drone 100 is inside the court F100, the on-court fixed position flight mode M107 is used.
  • the permitted flight area switching unit 332 is a functional unit that switches the permitted flight area in response to switching of the flight mode.
  • the geofence switching unit 333 is a functional unit that switches geofences in response to switching of flight modes. For example, when the flight mode is the outer edge flight mode M102, the geofence switching unit 333 sets a geofence G100. When the flight mode is the inner court flight mode M105, the geofence switching unit 333 sets a geofence G200. In addition, in the intermediate modes that are intermediate in the transition between the outer edge flight mode M102 and the inner court flight mode M105, i.e., the inner court entry mode M104 and the outer court exit mode M106, a third geofence different from the geofences G100 and G200 in the outer edge flight mode M102 and the inner court flight mode M105 is set.
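The geofence switching described above can be illustrated with a per-mode fence table and a containment check. An axis-aligned rectangle is used here purely for simplicity, and all coordinates are hypothetical; real geofences may be arbitrary polygons or volumes:

```python
# Sketch of per-flight-mode geofence selection (cf. geofence switching
# unit 333). Fence values are illustrative (xmin, ymin, xmax, ymax).
GEOFENCES = {
    "OUTER_EDGE_M102": (-10.0, -10.0, 115.0, 78.0),  # geofence G100
    "ON_COURT_M105": (0.0, 0.0, 105.0, 68.0),        # geofence G200
    "TRANSITION": (-10.0, -10.0, 115.0, 78.0),       # third fence for M104/M106
}

def active_geofence(flight_mode):
    # the intermediate modes (court entry M104 / court exit M106) share
    # a third geofence distinct from G100 and G200
    if flight_mode in ("COURT_ENTRY_M104", "COURT_EXIT_M106"):
        return GEOFENCES["TRANSITION"]
    return GEOFENCES[flight_mode]

def inside_fence(fence, x, y):
    xmin, ymin, xmax, ymax = fence
    return xmin <= x <= xmax and ymin <= y <= ymax
```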
  • the flight path generating unit 334 is a functional unit that generates a flight path of the drone 100 during movement involving switching of flight modes.
  • the flight path generating unit 334 determines, for example, the shooting position at which the mode transitions to the inside-court entry mode M104 or the outside-court exit mode M106.
  • the flight path generating unit 334 also determines the shooting position at which the mode transitions from the inside-court entry mode M104 to the inside-court flight mode M105, or the shooting position at which the mode transitions from the outside-court exit mode M106 to the outer edge flight mode M102.
  • the flight path generating unit 334 generates a specific flight path in the inside-court entry mode M104 or the outside-court exit mode M106. This flight path is generated, in principle, within the area of the third geofence.
  • the outer edge flight control unit 340 is a functional unit that controls the flight of the drone 100 in the outer edge flight mode.
  • the outer edge flight control unit 340 has an imaging condition command unit 341 and a flight path generation unit 342.
  • the shooting condition command unit 341 is a functional unit that transmits commands regarding shooting conditions to the drone 100.
  • the shooting conditions are, for example, a target shooting position or shooting direction.
  • This shooting condition command unit 341 acquires a target shooting position located within the range of the flight area in the outer edge flight mode M102 from the shooting condition determination unit 325.
  • the shooting condition command unit 341 may also acquire a target shooting position input by the user, for example, received via the target position receiving unit 226 of the controller 200.
  • the flight path generating unit 342 is a functional unit that generates a flight path along which the drone 100 moves in the outer perimeter flight mode M102. In other words, the flight path generating unit 342 generates a flight path in a flyable area in the outer perimeter flight mode M102.
  • the flight path generating unit 342 generates a flight path from the current position to the target position.
  • the flight path generating unit 342 may generate a flight path in the flight range in the outer perimeter flight mode M102.
  • when the ball rolls out of the court, for example, the flight path generation unit 342 may move the drone 100 to the outer court area F200 outside the touchline F111b.
  • the drone 100 may then shoot directly below itself or from that point toward the court F100. With this configuration, it is possible to follow and shoot the ball even if the ball rolls into the outer court area F200.
  • the geofence G100 of the outer edge flight mode M102 may be set in advance to extend beyond the touchline F111b to the outside of the court F100. With this configuration, the drone 100 can be reliably maintained within the geofence G100 even when the drone 100 follows the ball and flies slightly outside the touchline F111b as described above.
  • when the flight path generating unit 342 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle on the inside of the court F100. Alternatively, when it detects an obstacle, it may hover for a predetermined time and then move along the originally generated flight path. This is because, while the safety of the drone 100 is not ensured in the area outside the court F200 in the stadium F, the safety of the drone 100 inside the court F100 is highly likely to be ensured.
  • the obstacle may be detected, for example, by the obstacle detecting unit 130 of the drone 100, or by information from an external system 700 or the like.
  • when an obstacle is detected, the outer edge flight control unit 340 may cause the drone 100 to hover for a predetermined time and then switch to manual control. Furthermore, when an obstacle is detected, the outer edge flight control unit 340 may cause the drone 100 to hover for a predetermined time and then display a message prompting the user to re-input the target position via the display control unit 210.
  • the in-court flight control unit 350 is a functional unit that controls the flight of the drone 100 in the in-court flight mode M105.
  • the in-court flight control unit 350 has an image capture condition command unit 351 and a flight path generation unit 352.
  • the image capture condition command unit 351 is a functional unit that transmits commands to the drone 100 regarding image capture conditions within the range of the flight area in the in-court flight mode M105.
  • the image capture conditions are, for example, a target image capture position or image capture direction.
  • the flight path generating unit 352 generates a flight path along which the drone 100 moves in the on-court flight mode M105. That is, the flight path generating unit 352 generates a flight path in a flyable area in the on-court flight mode M105. More specifically, the flight path generating unit 352 generates a flight path by connecting multiple preset shooting positions in the on-court flight mode M105. Like the flight path generating unit 342 of the outer edge flight control unit 340, the flight path generating unit 352 may generate a flight path in a flight range in the on-court flight mode M105 when the current location and the target shooting position belong to flyable areas of different flight modes.
  • the flight path generating unit 352 changes the connected shooting positions depending on the event detection status. That is, when an event is detected, the flight path generating unit 352 changes the connection relationship of the shooting positions on the flight path and generates a flight path to the target shooting position.
  • when the flight path generating unit 352 detects an obstacle on the flight path or near the drone 100, it regenerates a flight path that bypasses the obstacle.
  • the obstacle is detected by, for example, the obstacle detection unit 130.
  • the flight path generating unit 352 may regenerate the flight path by changing the connection between multiple shooting positions that have been set in advance, or may change the flight path to a higher altitude while maintaining the flight path on a plane.
  • the flight path generating unit 352 may hover for a predetermined time when it detects an obstacle, and then move along the flight path that was initially generated.
  • when an obstacle is detected, the in-court flight control unit 350 may cause the drone 100 to hover for a predetermined time and then switch to manual control.
  • when an obstacle is detected, the in-court flight control unit 350 may cause the drone 100 to hover for a predetermined time and then display a message prompting the user to re-input the target position via the display control unit 210.
  • the obstacle may be, for example, a bird, a fixed facility, or a player.
  • the obstacles also include balls.
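The obstacle response described above (hover for a predetermined time, then either fly a detour, hand control back to the pilot, or prompt for a new target) can be sketched as follows. This is an illustrative sketch only: `DroneStub`, `on_obstacle`, and the hover duration are hypothetical stand-ins, not the interfaces of the disclosed embodiment.

```python
import time

HOVER_SECONDS = 0.1  # hypothetical "predetermined time" (shortened for the demo)

class DroneStub:
    """Minimal stand-in for the drone 100's control interface."""
    def __init__(self):
        self.log = []
    def hover(self):
        self.log.append("hover")
    def follow(self, path):
        self.log.append(("follow", tuple(path)))
    def switch_to_manual(self):
        self.log.append("manual")
    def display(self, msg):
        self.log.append(("display", msg))

def on_obstacle(drone, detour):
    """One possible policy from the description: hover for a predetermined
    time, then fly a detour if one was generated; otherwise return to
    manual control and prompt the user to re-input the target position."""
    drone.hover()
    time.sleep(HOVER_SECONDS)
    if detour is not None:
        drone.follow(detour)
    else:
        drone.switch_to_manual()
        drone.display("Please re-enter the target position")

d = DroneStub()
on_obstacle(d, detour=["L105", "L206", "L215"])
print(d.log)
```

The same entry point covers both branches of the description: passing `detour=None` models the case where no bypass route could be generated.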
  • flight control in the outer edge flight mode M102 is performed by the outer edge flight control unit 340.
  • flight control in the inner court flight mode M105 is performed by the inner court flight control unit 350.
  • the shooting positions defined in each of the outer edge flight mode M102 and the inner court flight mode M105 are presented as options, and a flight path to the selected target position is generated.
  • the technical scope of the present invention is not limited to this, and the pilot may use the controller 200 to control the shooting position and orientation of the drone 100 so that it flies freely at any position within the areas enclosed by the geofences G100 and G200 set for each flight mode.
  • the division into the flight path generation units 334, 342, and 352 is only an example; for instance, a single flight path generation unit may generate the flight paths without such subdivision.
  • FIG. 14 is a schematic diagram showing the routes that the drone 100 can follow, defined by the shooting positions L101-L105, L206-L215, and the evacuation point H200.
  • Point L101g on the ground at the shooting position L101 is the takeoff and landing point for the drone 100.
  • the position transition of the drone 100 begins with the step of taking off from point L101g and arriving at the shooting position L101.
  • the drone 100 also descends at the shooting position L101, and lands at point L101g to end shooting.
  • the drone 100 may only be able to transition to adjacent shooting positions.
  • the point to which the drone 100 at shooting position L105 can transition while maintaining the outer edge flight mode is shooting position L104.
  • the points to which the drone 100 at shooting position L105 can transition after switching to the inside court flight mode M105 are shooting positions L206 and L207.
  • the flight path generation unit 334, 342, or 352 (see FIG. 6) generates a flight path for the drone 100 by referring to the possible transition paths.
  • the drone 100 transitions to the selected shooting position via the available shooting positions. For example, when the drone 100 is at shooting position L105 and shooting position L215 is selected, the flight path generating unit 334, 342, or 352 (see FIG. 6) generates a flight path that transitions through shooting positions L105, L207, L208, L213, L212, and L215 in that order, and the drone 100 flies along this flight path.
  • the drone 100 may fly on a flight path that connects the current position to the target shooting position in a straight line. Furthermore, the drone 100 may be required to transition to an adjacent shooting position when the transition involves a flight mode switch, while being allowed to transition directly to a non-adjacent shooting position when no mode switch is involved. That is, for example, when the drone moves from shooting position L105 to shooting position L215, it may transition from shooting position L105 to shooting position L207 with a mode switch, and then move linearly from shooting position L207 to shooting position L215 within the area enclosed by the geofence G200.
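Generating a route through adjacent shooting positions, as in the L105 → L207 → L208 → L213 → L212 → L215 example, amounts to a shortest-path search over the preset positions. The sketch below uses breadth-first search; the adjacency table is purely illustrative (the real transition graph is defined by FIG. 14 and the system configuration), chosen so that it reproduces the example route.

```python
from collections import deque

# Illustrative adjacency between preset shooting positions (cf. FIG. 14).
TRANSITIONS = {
    "L105": ["L104", "L206", "L207"],
    "L207": ["L105", "L206", "L208"],
    "L208": ["L207", "L213"],
    "L213": ["L208", "L212"],
    "L212": ["L213", "L215"],
    "L215": ["L212"],
}

def flight_path(start, goal, graph=TRANSITIONS):
    """Return a chain of adjacent shooting positions from start to goal,
    found by breadth-first search; None if no route exists."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(flight_path("L105", "L215"))
# → ['L105', 'L207', 'L208', 'L213', 'L212', 'L215']
```

Regenerating a route around an obstacle, as the description mentions, can then be done by re-running the same search on a graph with the blocked positions removed.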
  • the outer edge flight control unit 340 and the inner court flight control unit 350 fly the drone 100 autonomously in each flight area according to the flight mode.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may perform dolly shooting within each flight area, that is, the drone 100 may automatically follow and shoot a specific object such as a ball or a specified player.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may automatically control the flight height of the drone 100.
  • the manner of autonomous flight may differ depending on the flight mode. For example, dolly shooting may be performed when the drone is controlled by the outer edge flight control unit 340, while the inner court flight control unit 350 may perform automatic follow-up shooting of the shooting direction only, with the shooting position fixed, or automatic follow-up shooting of both the position and the shooting direction.
  • the outer edge flight control unit 340 and the inner court flight control unit 350 may also generate a flight path within the court (within the stadium F) for moving the drone 100 to a target position specified by the user in each flight area.
  • the fixed position flight control unit 360 is a functional unit that controls the flight of the drone 100 in the off-court fixed position flight mode M103 and the on-court fixed position flight mode M107. In the fixed position flight mode, the fixed position flight control unit 360 hovers at a predetermined position and controls the nose direction or the direction of the shooting camera 141 to follow a specific player or the ball and perform automatic shooting. Note that the above-mentioned "control of direction” is a concept that includes control not only in the left-right direction (so-called “pan direction”) but also in the up-down direction (so-called "tilt direction").
  • the fixed position flight control unit 360 includes an image capture condition command unit 361.
  • the image capture condition command unit 361 is a functional unit that transmits commands for the target position and the target image capture direction in the fixed position flight mode M103 or M107.
  • the image capture direction may be information determined by the image capture condition determination unit 325, or may be information input by the user via the controller 200.
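In the fixed position flight mode, pointing the nose or the shooting camera 141 at a tracked player or ball reduces to computing a pan (left-right) and tilt (up-down) angle from the hovering position toward the target. The function below is a minimal geometric sketch, not the disclosed control law; coordinate conventions and names are assumptions.

```python
import math

def pan_tilt_to(drone_pos, target_pos):
    """Pan (horizontal bearing) and tilt (vertical angle) in degrees from a
    hovering drone toward a tracked target; positions are (x, y, z), z up.
    A negative tilt means the camera points downward."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Drone hovering 20 m up, target on the ground 30 m away on each axis.
pan, tilt = pan_tilt_to((0.0, 0.0, 20.0), (30.0, 30.0, 0.0))
print(round(pan, 1), round(tilt, 1))
```

As the description notes, the resulting direction can be realized by the nose (yaw) alone, by the camera gimbal alone, or by a combination of the two.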
  • the communication unit 370 has a modem or the like (not shown) and is capable of communicating with the drone 100, the controller 200, and the like via the communication network 400.
  • the communication unit 370 may, for example, monitor the state of the drone 100 and its surroundings and notify the controller 200.
  • the storage unit 380 is a functional unit that stores information related to flight control of the drone 100, and is, for example, a database.
  • the storage unit 380 stores, for example, the coordinates of the multiple shooting positions L101 to L105 and L206 to L215 in the stadium F. These coordinates may be two-dimensional coordinates on a plane or three-dimensional coordinates including information in the height direction.
  • the storage unit 380 also stores an event-shooting condition table T1 shown in FIG. 13. As described above, the event-shooting condition table T1 is recorded in a rewritable manner. Furthermore, multiple event-shooting condition tables T1 may be stored.
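The event-shooting condition table T1 maps a detected event type to target shooting conditions, and is rewritable. A plain dictionary captures the idea; the event names, positions, and zoom values below are hypothetical examples, not the actual contents of FIG. 13.

```python
# Hypothetical rows of the event-shooting condition table T1 (cf. FIG. 13):
# event type -> target shooting position, shooting direction, zoom amount.
EVENT_TABLE = {
    "free_kick":   {"position": "L206", "direction": "goal_120a", "zoom": 2.0},
    "corner_kick": {"position": "L215", "direction": "corner",    "zoom": 1.5},
}

def shooting_conditions(event_type, table=EVENT_TABLE):
    """Look up the shooting conditions for a detected event;
    None means no automatic switch (the drone stays under manual control)."""
    return table.get(event_type)

# The table is rewritable: new event rows can be added or replaced at runtime.
EVENT_TABLE["penalty_kick"] = {"position": "L210", "direction": "penalty_spot", "zoom": 2.5}

print(shooting_conditions("free_kick"))
```

Multiple such tables can coexist, as the description allows, by keeping one dictionary per table and selecting among them.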
  • Flowcharts: FIG. 15 is a flowchart showing the overall flow of aerial photography control in this embodiment.
  • FIG. 16 shows the subroutine of the flight restriction process S1002 in FIG. 15.
  • FIG. 17 shows the subroutine of the photography condition switching process S1010 in FIG. 15.
  • the control described in the flowchart shown in FIG. 15 is executed in a regular loop. As shown in FIG. 15, if it is detected that the drone 100 is approaching the vicinity of the geofences G100, G200 while flying (YES in step S1001), the process proceeds to the flight restriction processing in step S1002. The subroutine of the flight restriction processing S1002 is explained in FIG. 16.
  • if it is not detected in step S1001 that the drone is approaching the vicinity of the geofences G100, G200 (NO in step S1001), the presence or absence of an obstacle in the path or near the drone 100 is detected (step S1003). If an obstacle is detected in step S1003 (YES in step S1003), the drone 100 is caused to hover or a detour route is generated, and the flight route of the drone 100 is changed to the detour route (step S1004).
  • if no obstacle is detected in step S1003 (NO in step S1003), the presence or absence of an action determination of the aircraft state is detected (step S1005). If an action determination of the aircraft state is detected in step S1005 (YES in step S1005), the process proceeds to step S1006, where an event type determination process is executed (step S1006).
  • if no action determination is detected in step S1005 (NO in step S1005), it is determined whether or not there is an input from the controller 200 by the user (step S1007). If an input from the controller 200 is detected in step S1007 (YES in step S1007), a command based on the input is executed (step S1008).
  • if no input from the controller 200 is detected in step S1007 (NO in step S1007), the presence or absence of an event is determined (step S1009). If an event is detected (YES in step S1009), the process proceeds to the shooting condition switching process in step S1010 (step S1010). If no event is detected in step S1009 (NO in step S1009), the process returns to step S1001 and steps S1001 to S1009 are repeated.
  • the overall processing of aerial photography control is performed in the following order: geofence restriction, obstacle detection, control based on aircraft status, control based on user input, and control based on in-game events such as the game status or offensive and defensive status.
  • each control is executed before control based on in-game events.
  • This order is based on the priority of performing safe control processing. With this configuration, the safety of flying the drone 100 can be more reliably guaranteed.
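One pass of the FIG. 15 loop, with its safety-first priority ordering, can be sketched as a chain of guarded checks. The dictionary keys and returned action names below are hypothetical labels for the flowchart branches, not API of the disclosed system.

```python
def control_step(state):
    """One pass of the loop in FIG. 15, ordered by safety priority:
    geofence restriction > obstacle avoidance > aircraft-state action
    > user input > in-game event. `state` keys are illustrative."""
    if state.get("near_geofence"):
        return "flight_restriction"          # S1002
    if state.get("obstacle"):
        return "hover_or_detour"             # S1004
    if state.get("aircraft_action"):
        return "event_type_determination"    # S1006
    if state.get("user_input"):
        return "execute_user_command"        # S1008
    if state.get("event"):
        return "switch_shooting_conditions"  # S1010
    return "continue"

# An obstacle outranks an in-game event, reflecting the safety ordering.
print(control_step({"obstacle": True, "event": True}))
```

Because each check returns before the next is evaluated, safety-related handling always preempts event-driven shooting control, matching the stated priority.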
  • in step S1101, the flight control unit 123 of the drone 100 issues an operation command to restrict the drone 100 from advancing outside the geofence.
  • that is, in step S1101, a restriction is set on the flight target position so that the drone 100 does not advance outside the geofence even when the drone 100 is operated manually.
  • if the drone 100 does not advance outside the geofence (NO in step S1102), the process ends.
  • if the drone 100 advances outside the geofence (YES in step S1102), the process proceeds to step S1103. Possible causes of this situation include, for example, wind strong enough to sweep the aircraft away, or a malfunction in the drone 100 that prevents it from flying in the intended direction.
  • in step S1103, the flight control unit 123 issues an operation command to return inside the geofence. More specifically, a flight target position command to return inside the geofence, that is, an operation command setting the flight target position to a specified point inside the geofence, is given to the drone 100.
  • in step S1104, by referring to information measured by the measurement unit 110 of the drone 100, such as position, direction, altitude, or speed, it is determined whether the drone 100 is approaching the inside of the geofence.
  • step S1104 is executed a predetermined time after step S1103. Note that in step S1104, it is sufficient for the drone 100 to be closer to the inside of the geofence than it was at the time of step S1103; it is not necessary to determine whether the drone 100 is located inside the geofence.
  • if it is determined in step S1104 that the drone 100 is not approaching the geofence (NO in step S1104), the operation command is judged to be ineffective, and the flight control unit 123 forces the drone 100 to land (step S1105).
  • in step S1106, information measured by the measurement unit 110 of the drone 100, such as the position and altitude, is referenced to determine whether the drone 100 is located inside the geofence. If the drone 100 is located inside the geofence (YES in S1106), the process ends. If not (NO in S1106), the process returns to step S1104, and operation based on the return command continues until the drone 100 is back inside the geofence.
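The decision core of the FIG. 16 subroutine (stay restricted, keep returning, or force a landing) can be condensed into one function. The function name, its boolean inputs (stand-ins for the measurement unit 110's position data), and the returned labels are all illustrative.

```python
def flight_restriction(outside_geofence, is_approaching_inside):
    """Sketch of the S1101-S1106 decision: if the drone has left the
    geofence, command a return; if it is not getting closer after a
    predetermined time, force a landing."""
    if not outside_geofence:
        # S1101-S1102: the target-position restriction held; nothing more to do.
        return "restricted"
    # S1103: a return command was issued; S1104: after a predetermined
    # time, check whether the drone is closer to the inside than before.
    if is_approaching_inside:
        return "returning"      # keep going until back inside (S1106 loop)
    return "forced_landing"     # S1105: the return command had no effect

print(flight_restriction(outside_geofence=True, is_approaching_inside=False))
```

A supervisory loop would call this repeatedly, re-checking the position each cycle until the drone is confirmed back inside the geofence.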
  • FIG. 17 is a diagram showing an example of a switching process flow for switching shooting conditions.
  • the shooting condition determination unit 325 refers to the event-shooting condition table T1 to determine the target values for the shooting position and shooting direction, and transmits a control command to the drone 100 (step S1301).
  • when the target shooting position and the target shooting direction are reached (step S1303), the process proceeds to step S1305.
  • in step S1305, the mode transitions to manual control mode, and a message is displayed on the operation screen G3 (see FIG. 20) indicating that manual operation is permitted.
  • FIGS. 18 and 19 are examples of screens G1 and G2 displayed on the display unit 201 of the controller 200.
  • the screen G1 shown in FIG. 18 displays a field map G10 that shows a schematic bird's-eye view of the stadium F and the shooting positions L101 to L215, an icon G11 that shows the position information of the drone 100, the shooting range G12 captured by the shooting camera 141, a display field G21 of the flight mode to which the drone 100 belongs, a status display field G22 that shows the status detected by the event detection unit 320, such as the aircraft status, aircraft behavior status, match status, and offensive and defensive status, a landing button G30 for landing the drone 100, and a video field G40 in which images captured by the drone 100 are displayed.
  • the display field G21 displays the main control modes, either automatic control mode or manual control mode.
  • the drone is flying in the outer edge flight mode at the shooting position L101.
  • the position and shooting direction of the drone 100 may be controlled manually, or automatic tracking control of the ball or a specific player may be performed.
  • during automatic tracking control, information about the ball or the specific player being tracked may be displayed on the screen G1.
  • the icon G11 representing the drone 100 displays an arrow indicating the direction of travel of the drone 100.
  • the direction of the nose of the drone 100 is not limited to the direction of travel of the drone 100, and may be pointing in any direction.
  • the direction of the nose of the drone 100 does not have to be constant while moving, and for example, the drone 100 may move while photographing a player or the ball by rotating in a yaw motion.
  • FIG. 19 shows an example of the display on screen G2 when multiple drones 100 are photographing one stadium F.
  • FIG. 19 particularly shows an example of the display when a free kick state M441 is detected.
  • icons G11a and G11b of the two drones are displayed on the field map G10.
  • the shooting ranges G13a and G13b photographed by each of the multiple drones 100, and video columns G40a and G40b showing the captured images, are displayed in association with the icons G11a and G11b of the corresponding drones 100.
  • the first drone 100 corresponding to the icon G11a is taking localized shots near the goal 120a, while the second drone 100 corresponding to the icon G11b is taking bird's-eye shots from shooting position L101 on the outer edge.
  • the shooting angles of the first drone and the second drone are different from each other. It is preferable that the shooting conditions of the first drone and the second drone are such that they complement each other's positions where they cannot shoot. In this way, with a configuration in which multiple drones 100 shoot under different shooting conditions, the stadium F can be photographed from multiple angles.
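Assigning complementary shooting conditions to multiple drones covering the same event, one close-up and the others bird's-eye as in FIG. 19, can be sketched as below. The assignment policy, field names, and values are hypothetical; the actual conditions come from the shooting condition determination unit 325.

```python
def assign_conditions(drones, event):
    """Illustrative assignment giving each drone a different angle on the
    same event: the first takes a close-up, the rest take wider bird's-eye
    shots so their blind spots complement each other."""
    plans = []
    for i, drone in enumerate(drones):
        plans.append({
            "drone": drone,
            "range": event["area"],                      # same shooting range
            "shot": "close_up" if i == 0 else "birds_eye",
            "zoom": 2.0 if i == 0 else 1.0,              # different zoom amounts
        })
    return plans

plans = assign_conditions(["drone_a", "drone_b"], {"area": "goal_120a"})
print([p["shot"] for p in plans])
```

Every plan shares the same shooting range but differs in shot type and zoom, which is one way to realize the "different shooting conditions, multiple angles" configuration in the description.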
  • FIG. 20 is an example of screen G3 displayed on the display unit 201 when an event has not been detected by the event detection unit 320. Because an event has not been detected, automatic control of the drone 100 is not performed, and the display field G21 indicates that the mode is "manual control mode.”
  • FIG. 21 is an example of screen G4 when the event detection unit 320 detects an event in the state of screen G3.
  • the drone 100 switches to automatic control in response to the detection of the event, and the display field G21 indicates that it is in "automatic control mode.”
  • the status display field G22 indicates that a team A quick attack state M513 has been detected as the offensive and defensive state.
  • an arrow G15 indicating the predicted trajectory of the ball and the ball G16 after movement based on the trajectory are displayed on the field map G10. Also, an arrow G17 indicating the change in the shooting direction of the shooting camera 141 is displayed.
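A predicted ball trajectory like the one behind arrow G15 can be approximated, in the simplest case, by constant-velocity extrapolation of the ball's measured position. This is a minimal stand-in for whatever prediction the event detection unit 320 actually performs; positions and velocities are in arbitrary field coordinates.

```python
def predict_ball(pos, vel, dt=1.0, steps=3):
    """Constant-velocity extrapolation of the ball's 2D position:
    returns the predicted positions at t = dt, 2*dt, ..., steps*dt."""
    x, y = pos
    vx, vy = vel
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

# Ball at (10, 5) moving 4 units/s along x: predicted points one second apart.
print(predict_ball((10.0, 5.0), (4.0, 0.0)))
```

The predicted points would then feed the shooting condition determination, e.g. aiming the shooting camera 141 ahead of the ball along arrow G17.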
  • the present invention is not limited to the above embodiment, and various configurations can be adopted based on the contents of this specification.
  • the series of processes described in relation to the above embodiment may be implemented using software, hardware, or a combination of software and hardware.
  • a computer program for implementing each function of the server 300 according to this embodiment may be created and implemented in a PC or the like.
  • a computer-readable recording medium on which such a computer program is stored may also be provided. Examples of the recording medium include a magnetic disk, an optical disk, a magneto-optical disk, and a flash memory.
  • the above computer program may also be distributed, for example, via the communication network 400, without using a recording medium.
  • 1 Aerial photography system
  • 100 Drone (mobile body)
  • 141 Shooting camera
  • 200 Controller
  • 220 Input control unit
  • 300 Server
  • 320 Event detection unit
  • 330 Flight mode switching unit
  • 331 Mode switching determination unit
  • 380 Storage unit
  • F Stadium
  • F100 Court
  • F200 Area outside the court
  • M102 Outer edge flight mode
  • M105 Inner court flight mode

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

[Problem] To ensure safety in aerial imaging. [Solution] An aerial imaging system 1 comprises a mobile object 100 that flies over a target area F, a camera 141 mounted on the mobile object to capture an image of the target area, an event detection unit 320 that detects an event on the basis of the image captured by the camera or an input from an external system, and an imaging condition determination unit 325 that determines an imaging condition, including at least one of the target imaging position and target imaging direction of the mobile object, according to the detected event.

Description

Aerial photography system, aerial photography method, and aerial photography program

The present invention relates to an aerial photography system, an aerial photography method, and an aerial photography program.

Patent Document 1 discloses a camera viewpoint display system that detects the aircraft's position and nose direction, as well as the pan and tilt angles of a camera device mounted on the aircraft, calculates the camera viewpoint from each of these pieces of data, and displays the viewpoint on a map on a monitor screen. With this system, an operator controls the aircraft's position and attitude, as well as the camera's shooting direction, while grasping the aircraft's position and heading from a ground station.

Patent Document 2 discloses a technology that automatically controls the position and shooting direction so that a specific object is tracked and photographed by multiple UAVs.

JP 2006-281830 A
JP 2020-115642 A

In order to effectively capture sports and other events, it is necessary to capture images that are suited to the details of the subject, such as the game situation. More specifically, since the range to be captured changes depending on the game situation, it is desirable to be able to appropriately control the shooting position and direction according to the situation.

In this regard, the system described in Patent Document 1 was complicated because it required simultaneous control of the camera and the flying object. Furthermore, the system described in Patent Document 2 only made the shooting position and direction follow the object, which limited its usefulness.

The present invention was made in consideration of the above problems, and aims to provide an aerial photography system that reduces the labor required for photography and enables appropriate photography according to the conditions of the subject.

In order to achieve the above object, an aerial photography system according to one aspect of the present invention comprises a moving body that flies over a target area, a camera mounted on the moving body that photographs the target area, an event detection unit that detects an event based on an image captured by the camera or an input from an external system, and a photography condition determination unit that determines photography conditions including at least one of a target photography position and a target photography direction of the moving body according to the detected event.

The shooting condition determination unit may determine the shooting conditions according to the type of the detected event.

The target shooting direction may be achieved by controlling at least one of the nose direction of the moving body and the angle of the camera relative to the moving body.

The shooting conditions may include a target zoom amount for the camera.

When the shooting condition determination unit receives input of the shooting conditions from a controller of the moving body or from the external system, the shooting condition determination unit may determine the shooting conditions based on the operation received via the controller even if the event detection unit has detected the event.

The system may further include a controller that accepts input of the shooting conditions by a user, and the shooting condition determination unit may determine the shooting conditions based on the input via the controller if the event detection unit has not detected the event, and may determine the shooting conditions based on the event if the event detection unit detects the event.

The aerial photography system may include a plurality of the moving bodies and photograph a single target area by flying the plurality of moving bodies simultaneously over the target area, and the photography condition determination unit may determine different photography conditions for each of the plurality of moving bodies.

The shooting condition determination unit may set, for multiple moving bodies flying simultaneously, shooting conditions for shooting the same shooting range from different target shooting positions, or shooting conditions for shooting an area including the same shooting range with different zoom amounts.

The shooting condition determination unit may determine the shooting conditions according to a prediction of the ball's trajectory made by the event detection unit as the detection result of the event.

When an event indicating that a foul has occurred in a competition held in the target area is detected, the shooting condition determination unit may determine the shooting conditions such that the shooting range covers the ball used in the competition or the area around the referee of the competition.

The aerial photography system may include a plurality of the moving bodies and photograph a single target area by flying the plurality of moving bodies simultaneously over the target area, and when an event indicating that a foul has occurred in the competition is detected, the photography condition determination unit may determine the photography conditions such that the plurality of moving bodies photograph the ball or the area around the referee of the competition at mutually different target photography positions, target photography directions, or zoom amounts.

The system may include a flight path generation unit that generates a flight path for the moving body, and the flight path generation unit may automatically generate the flight path to the target shooting position determined based on the event detected from the captured image.

The flight path generation unit may generate the flight path within the court configured within the target area, and may generate the flight path to the target shooting position by connecting a plurality of shooting positions set in advance, changing the shooting positions to be connected depending on the detection status of the event.

In order to achieve the above object, an aerial photography method according to another aspect of the present invention includes an event detection step for detecting an event based on an image captured by a camera photographing a target area or an input from an external system, and a photography condition determination step for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera according to the detected event.

In order to achieve the above-mentioned object, an aerial photography program according to yet another aspect of the present invention causes a computer to execute an event detection command for detecting an event based on an image acquired by a camera photographing a target area or an input from an external system, and a photography condition determination command for determining photography conditions including at least one of a target photography position and a target photography direction of a moving body equipped with the camera in accordance with the detected event.
The computer program may be provided by being stored on various data-readable recording media, or may be provided so as to be downloadable via a network such as the Internet.

The present invention reduces the labor required for photography and enables appropriate photography according to the subject's circumstances.

FIG. 1 is an overall configuration diagram of an aerial photography system according to an embodiment of the present invention.
FIG. 2 is a simplified external perspective view of the drone of the embodiment.
FIG. 3 is a functional configuration diagram of the drone of the embodiment.
FIG. 4(a) is a simplified front view of the exterior of the control device of the embodiment, and FIG. 4(b) is a schematic diagram showing the directions in which the drone moves or turns in response to inputs to the control device.
FIG. 5 is a functional configuration diagram of the control device of the embodiment.
FIG. 6 is a functional configuration diagram of the server of the embodiment.
FIG. 7 is a schematic diagram showing an example of shooting positions of the drone set in advance in the field to be photographed over which the drone flies.
FIG. 8 is a schematic state transition diagram showing the transitions between the flight modes of the drone.
FIG. 9 is a schematic state transition diagram showing the state transitions of the drone according to its aircraft state.
FIG. 10 is a schematic state transition diagram showing the state transitions of the drone according to its aircraft behavior state.
FIG. 11 is a schematic state transition diagram showing the state transitions of the match state in a stadium as an example of the field to be photographed.
FIG. 12 is a schematic state transition diagram showing the state transitions of the offensive and defensive states in the stadium.
FIG. 13 is a table showing an example of the correspondence between the match state in the stadium and the shooting range, camera position, shooting direction, and zoom amount of the drone.
FIG. 14 is a schematic diagram showing the shooting positions and flight paths to which the drone can transition from each shooting position.
FIG. 15 is a flowchart of the control executed while the drone is in flight.
FIG. 16 is a flowchart of the flight restriction control of the drone (details of S1002 in FIG. 15).
FIG. 17 is a flowchart of the flight mode switching control of the drone (details of S1010 in FIG. 15).
FIG. 18 is a diagram showing a first example of a screen displayed on a terminal of the aerial photography system.
FIG. 19 is a diagram showing a second example of a screen displayed on the terminal of the aerial photography system.
FIG. 20 is a diagram showing a third example of a screen displayed on the terminal of the aerial photography system.
FIG. 21 is a diagram showing a fourth example of a screen displayed on the terminal of the aerial photography system.

Below, a preferred embodiment of the present invention will be described in detail with reference to the attached drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals to avoid repetitive explanation. Furthermore, the embodiments shown below are merely examples, and other known elements or alternative means may be adopted depending on the application, purpose, scale, etc.

<A.一実施形態>
[A-1.構成]
(A-1-1.全体構成)
 図1は、本発明の一実施形態に係る空中撮影システム1(以下「システム1」ともいう。)の全体構成図である。システム1は、競技場F(図7)(対象エリアの例である。)で行われている競技、又は催物会場で行われている催物等をドローン100(移動体の例である。)で空中撮影するものである。競技場Fは、対象エリアの一例である。ドローン100は、1個のシステム1に複数含まれていてもよい。この場合、システム1は、1個の競技場Fに複数のドローンを同時に飛行させることで当該競技場Fを撮影することができる。
A. One embodiment
[A-1. composition]
(A-1-1. Overall configuration)
FIG. 1 is an overall configuration diagram of an aerial photography system 1 (hereinafter also referred to as "system 1") according to one embodiment of the present invention. System 1 uses a drone 100 (an example of a moving body) to take aerial photographs of a competition taking place at a stadium F (FIG. 7) (an example of a target area) or an event taking place at an event venue. Stadium F is an example of a target area. A single system 1 may include multiple drones 100. In this case, system 1 can photograph a single stadium F by flying multiple drones simultaneously over the stadium F.

 以降の説明においては、必要に応じてサッカーを撮影するシステム1を例に説明するが、本システム1はサッカー以外の競技や催物にも適用可能である。 In the following description, the system 1 will be explained, where appropriate, using the filming of soccer as an example; however, the system 1 is also applicable to competitions and events other than soccer.

 図1に示すように、システム1は、ドローン100に加えて、主として、操縦者がドローン100を操作するための操縦器200と、ドローン100の飛行及び撮影を管理するサーバ300と、外部入力装置600と、外部システム700と、を有する。 As shown in FIG. 1, in addition to the drone 100, the system 1 mainly includes a controller 200 that allows the pilot to operate the drone 100, a server 300 that manages the flight and photography of the drone 100, an external input device 600, and an external system 700.

 ドローン100と操縦器200は、無線通信(基地局800を介するものを含み得る。)を介して互いに接続される。操縦器200とサーバ300は、インターネット回線等の通信ネットワーク400を介して互いに接続される。ドローン100は、自己位置の特定等のため、人工衛星500から衛星信号を取得する。 The drone 100 and the controller 200 are connected to each other via wireless communication (which may include communication via a base station 800). The controller 200 and the server 300 are connected to each other via a communication network 400 such as an internet line. The drone 100 acquires satellite signals from an artificial satellite 500 to determine its own position, etc.

 外部入力装置600は、操縦器200とは別に本システム1との間で情報を送受信できる装置であり、例えばスマートホン又はタブレット端末等のモバイル端末で構成される。外部入力装置600は、例えば、競技場Fで行われている競技の監督、コーチ、ベンチの選手、審判、又はコート設備関係者等により操作可能である。外部入力装置600は、例えば、緊急の撮影中断指令を受け付ける機能を有し、当該撮影中断指令に基づいてドローン100は緊急避難を行う。また、外部入力装置600は、ドローン100の飛行モードの切替入力を受け付けてもよい。さらに、外部入力装置600は表示装置を備え、操縦器200の表示部201と同様の情報が表示されてもよい。特に、外部入力装置600は、競技で発生するイベント情報を取得してもよい。当該イベント情報は、外部入力装置600のユーザにより、ドローン100の飛行モードを切り替える入力を行う際に参照される。 The external input device 600 is a device capable of transmitting and receiving information to and from the system 1 separately from the controller 200, and is composed of a mobile terminal such as a smartphone or tablet terminal. The external input device 600 can be operated, for example, by the manager, a coach, bench players, a referee, or court equipment personnel of the competition taking place at the stadium F. The external input device 600 has, for example, a function for receiving an emergency command to suspend filming, and the drone 100 performs an emergency evacuation based on that command. The external input device 600 may also receive an input to switch the flight mode of the drone 100. Furthermore, the external input device 600 may be equipped with a display device that displays information similar to that displayed on the display unit 201 of the controller 200. In particular, the external input device 600 may acquire event information that occurs during the competition. The event information is referred to when the user of the external input device 600 makes an input to switch the flight mode of the drone 100.

 外部システム700は、システム1とは別途に構成される任意のシステムであってよく、例えば、競技場Fで行われる競技に関して配備されるシステムとして、コート設備システム、試合運営システム、審判支援システム、といったシステムが適用可能である他、競技とは独立して配備されている気象観測システム又は地震観測システムといったシステムが適用可能である。複数の外部システム700がシステム1に接続されていてもよい。システム1は、種々の外部システム700から、緊急の撮影中断指令やドローン100の飛行モードの切替指令を受け付けてもよい。また、種々の外部システム700は、競技で発生するイベント情報を取得してもよい。 The external system 700 may be any system configured separately from the system 1. For example, systems such as a court facility system, a match management system, and a referee support system may be applied as systems deployed in relation to the competition held at the stadium F, and systems such as a weather observation system or an earthquake observation system deployed independently of the competition may also be applied. Multiple external systems 700 may be connected to the system 1. The system 1 may receive an emergency command to stop filming or a command to switch the flight mode of the drone 100 from the various external systems 700. In addition, the various external systems 700 may acquire event information that occurs during the competition.

 外部システム700の1例としてのコート設備システムは、例えばシステム1から撮影画像の輝度を取得し、競技場Fの照明の照度調整又は明滅を制御してもよい。また、コート設備システムは、システム1から照明照度の要求を受信して照度調整又は明滅を制御してもよい。 The court facilities system, which is an example of the external system 700, may obtain the brightness of the captured image from the system 1, for example, and control the illuminance adjustment or blinking of the lighting in the stadium F. The court facilities system may also receive a request for lighting illuminance from the system 1 and control the illuminance adjustment or blinking.

 システム1の構成は、図1に示すものに限らず、例えばインターネット回線等の通信ネットワーク400を介して、ドローン100と操縦器200とサーバ300と基地局800とがそれぞれ相互に通信可能に接続されていてもよい。この場合、ドローン100は操縦器200を介さずにLTE等の通信方法によって直接通信ネットワーク400と無線通信を行ってよい。そのため、ドローン100と操縦器200及び基地局800は、直接無線通信を行う必要がなく、遠隔地においてそれぞれ通信ネットワーク400に接続できればよい。そのため、ドローン100と操縦器200が遠隔地に存在する場合(例えば、操縦者が遠隔操作を行う場合等)に適したシステム構成である。 The configuration of system 1 is not limited to that shown in FIG. 1, and the drone 100, the controller 200, the server 300, and the base station 800 may each be connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line. In this case, the drone 100 may perform wireless communication directly with the communication network 400 using a communication method such as LTE without going through the controller 200. Therefore, the drone 100, the controller 200, and the base station 800 do not need to perform direct wireless communication, and it is sufficient if they can each be connected to the communication network 400 in a remote location. Therefore, this system configuration is suitable for cases where the drone 100 and the controller 200 are in a remote location (for example, when a pilot operates them remotely).

 また、システム1は、インターネット回線等の通信ネットワーク400を介して、ドローン100と操縦器200と基地局800とサーバ300とがそれぞれ相互に通信可能に接続され、且つドローン100及び基地局800は人工衛星500を介した衛星通信により通信ネットワーク400と通信接続されてもよい。 In addition, in the system 1, the drone 100, the controller 200, the base station 800, and the server 300 are each connected to each other so that they can communicate with each other via a communication network 400 such as an Internet line, and the drone 100 and the base station 800 may be communicatively connected to the communication network 400 by satellite communication via an artificial satellite 500.

 さらに、システム1は、1台のドローン100に対して複数のサーバ300が複数の通信ネットワーク400を介して接続され、すなわちシステムが冗長化されていてもよい。この場合、サーバ300、又は通信ネットワーク400に異常が生じた場合であっても、冗長化された他のサーバ300や通信ネットワーク400によりシステム1の動作、ひいてはドローン100による撮影を継続することができるため、システム1の信頼性を向上させることができる。なお、上記の2形態においても、ドローン100と操縦器200が遠隔にあっても操縦可能であるため、遠隔操作に適した構成ではあるが、これに限られず、操縦者がドローン100を見ながら手動制御する有視界飛行にも適用可能である。 Furthermore, in the system 1, multiple servers 300 may be connected to one drone 100 via multiple communication networks 400, i.e., the system may be made redundant. In this case, even if an abnormality occurs in the server 300 or communication network 400, the operation of the system 1, and therefore shooting by the drone 100, can be continued by the other redundant servers 300 and communication networks 400, thereby improving the reliability of the system 1. Note that in both of the above forms, the drone 100 and the controller 200 can be controlled even when they are remotely located, making them suitable for remote operation, but this is not limited to this, and they can also be applied to visual flight in which the pilot manually controls the drone 100 while watching it.
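The failover behavior of such a redundant configuration can be pictured with a minimal sketch. This is purely illustrative and not part of the embodiment; the function name and the health-probe callback are hypothetical:

```python
def pick_server(servers, is_healthy):
    """Return the first reachable server from an ordered preference list.

    `servers` is a list of server identifiers (e.g. redundant servers 300);
    `is_healthy` is a probe callback returning True when that server and its
    network path are usable. Returns None when every redundant path is down.
    """
    for server in servers:
        if is_healthy(server):
            return server
    return None
```

With a selection rule like this, operation of the system, and hence shooting by the drone, can continue as long as at least one server/network pair remains reachable.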

 上記実施形態において説明した装置は、単独の装置として実現されてもよく、一部又は全部が通信ネットワーク400で接続された複数の装置(例えばドローン100、操縦器200、クラウドサーバ300)等により実現されてもよい。例えば、サーバ300の各機能部及び記憶部は、互いに通信ネットワーク400で接続された異なるサーバ300、ドローン100、操縦器200に実装されることにより実現されてもよい。 The device described in the above embodiment may be realized as a single device, or may be realized by multiple devices (e.g., the drone 100, the controller 200, and the cloud server 300) that are partially or entirely connected via the communication network 400. For example, each functional unit and storage unit of the server 300 may be realized by being implemented in different servers 300, drones 100, and controllers 200 that are connected to each other via the communication network 400.

(A-1-2.ドローン100)
(A-1-2-1.ドローン100の概要)
 図2は、本実施形態のドローン100を簡略的に示す外観斜視図である。図3は、本実施形態のドローン100の機能構成図である。上記の通り、ドローン100は、競技場F(図7)で行われている競技、催物会場で行われている催物等を空中撮影する。
(A-1-2. Drone 100)
(A-1-2-1. Overview of the drone 100)
Fig. 2 is a simplified external perspective view of the drone 100 of this embodiment. Fig. 3 is a functional configuration diagram of the drone 100 of this embodiment. As described above, the drone 100 takes aerial photographs of competitions held in the stadium F (Fig. 7) and events held in the event venue.

 本明細書において、「ドローン」とは、動力手段(電力、原動機等)、操縦方式(無線であるか有線であるか、及び、完全自律飛行型であるか部分手動操縦型であるか等)を問わず、また、有人か無人かを問わず、自律的に姿勢制御を行う機能を有する飛行体全般を指すこととする。また、ドローンは、無人航空機(Unmanned Aerial Vehicle:UAV)、飛行体、マルチコプター(Multi Copter)、RPAS(Remote Piloted Aircraft Systems)、又はUAS(Unmanned Aircraft Systems)等と称呼されることがある。 In this specification, "drone" refers to any flying object that has the ability to autonomously control its attitude, regardless of the power source (electricity, prime mover, etc.), control method (wireless or wired, and fully autonomous or partially manual, etc.), and whether manned or unmanned. Drones are also sometimes referred to as Unmanned Aerial Vehicles (UAVs), flying objects, multicopters, RPAS (Remote Piloted Aircraft Systems), or UAS (Unmanned Aircraft Systems), etc.

 図2に示すように、ドローン100の外観は主として、筐体101と、複数のプロペラ122と、により構成される。筐体101は例えば略直方体であるが、形状は任意である。筐体101の左右側面には、側方に伸び出る棒状の連結部102が連結されている。連結部102の他端には、それぞれプロペラ122と、各プロペラ122を回転させるモータ121が連結される。モータ121は、例えば電動モータである。なお、本実施形態においては、連結部102、プロペラ122およびモータ121は4個ずつ備えられているが、個数はこれに限られない。プロペラ122は単独のプロペラで構成されていてもよいし、同軸配置された複数のプロペラで構成されていてもよい。各プロペラの羽根(ブレード)の枚数及び形状は特に限定されない。 As shown in FIG. 2, the exterior of the drone 100 is mainly composed of a housing 101 and multiple propellers 122. The housing 101 is, for example, a roughly rectangular parallelepiped, but may have any shape. Rod-shaped connecting parts 102 extending laterally are connected to the left and right sides of the housing 101. The other ends of the connecting parts 102 are respectively connected to propellers 122 and motors 121 that rotate the propellers 122. The motors 121 are, for example, electric motors. Note that in this embodiment, there are four connecting parts 102, propellers 122, and motors 121, but the number is not limited to this. The propellers 122 may be composed of a single propeller, or may be composed of multiple propellers arranged coaxially. The number and shape of the blades of each propeller are not particularly limited.

 また、プロペラ122の外側には、障害物に対するプロペラの干渉を防ぐためのプロペラガード(図示せず)を設けてもよい。 In addition, a propeller guard (not shown) may be provided on the outside of the propeller 122 to prevent the propeller from interfering with obstacles.

 筐体101には、例えば撮影用カメラ141が、筐体101下方にカメラ保持部142により保持されている。また、筐体101の前方面には、障害物検知カメラ131が配設されている。障害物検知カメラ131は、本実施形態においては対をなす2個のカメラにより構成される、いわゆるデュアルカメラである。障害物検知カメラ131は、ドローン100の前方を撮像するように配設されている。なお、障害物検知カメラ131は、前方面だけではなく筐体101のすべての面、例えば略直方体の筐体101においては6面に設けられていてもよい。 In the housing 101, for example, a photographing camera 141 is held by a camera holder 142 below the housing 101. In addition, an obstacle detection camera 131 is disposed on the front surface of the housing 101. In this embodiment, the obstacle detection camera 131 is a so-called dual camera consisting of two cameras that form a pair. The obstacle detection camera 131 is disposed so as to capture an image in front of the drone 100. Note that the obstacle detection camera 131 may be disposed not only on the front surface but also on all surfaces of the housing 101, for example, on six surfaces in the case of a housing 101 that is a substantially rectangular parallelepiped.

 ドローン100は、ドローン100の周囲にいる人々に対して、ドローン100の存在について注意喚起を行う警報装置250を備える。警報装置250は、例えば警告灯251及びスピーカ252を有する。警告灯251は、プロペラ122又はモータ121毎に設けられ、例えば複数のモータ121の各側面に配設される。警告灯251は正面の他、あらゆる方向から視認できるようモータ121の円筒状の側面に沿って配設されてよい。スピーカ252は、警告音を出力するものであり、ドローン100の筐体101に設けられる。スピーカ252は、例えば筐体101下面に設けられ、警告音をドローン100の下方に向かって伝達させる。 The drone 100 is equipped with an alarm device 250 that alerts people around the drone 100 to the presence of the drone 100. The alarm device 250 has, for example, a warning light 251 and a speaker 252. The warning light 251 is provided for each propeller 122 or motor 121, and is disposed, for example, on each side of multiple motors 121. The warning light 251 may be disposed along the cylindrical side of the motor 121 so that it can be seen from all directions in addition to the front. The speaker 252 outputs an alarm sound and is provided in the housing 101 of the drone 100. The speaker 252 is provided, for example, on the underside of the housing 101, and transmits the alarm sound downwards of the drone 100.

(A-1-2-2.ドローン100の機能ブロック)
 図3に示すように、ドローン100は、情報処理を実行するためのCPU(Central Processing Unit)等の演算装置、RAM(Random Access Memory)及びROM(Read Only Memory)等の記憶装置を備え、これにより、主として、測定部110、飛行機能部120、障害物検知部130、撮影部140および通信部150の各機能ブロックを有する。
(A-1-2-2. Functional blocks of drone 100)
As shown in FIG. 3, the drone 100 is equipped with an arithmetic device such as a CPU (Central Processing Unit) for executing information processing, and storage devices such as a RAM (Random Access Memory) and a ROM (Read Only Memory), and thereby has the following functional blocks: a measurement unit 110, a flight function unit 120, an obstacle detection unit 130, an imaging unit 140, and a communication unit 150.

 測定部110は、ドローン100又はその周辺に関する情報を測定する機能部である。測定部110は、例えば位置測定部111、方位測定部112、高度測定部113および速度測定部114等を有する。測定部110はこれらに加えて、温度、気圧、風速、加速度等の情報を取得する種々のセンサ等を含んでもよい。 The measurement unit 110 is a functional unit that measures information related to the drone 100 or its surroundings. The measurement unit 110 has, for example, a position measurement unit 111, a direction measurement unit 112, an altitude measurement unit 113, and a speed measurement unit 114. In addition to these, the measurement unit 110 may also include various sensors that acquire information such as temperature, air pressure, wind speed, and acceleration.

 位置測定部111は、人工衛星500からの信号を受信し、それに基づいて機体の位置(絶対位置)を測定する。位置測定部111は、特に限定されないが、例えば、GNSS(Global Navigation Satellite System)、GPS(Global Positioning System)等を用いて、現時点での自己位置を測定する。自己位置の測定方法として、例えば、RTK-GNSS(Real Time Kinematic - Global Navigation Satellite System)を用いることもできる。位置情報は、少なくとも平面視での2次元での座標情報(例えば緯度、経度)を含み、好ましくは高度情報を含む3次元での座標情報を含む。 The position measurement unit 111 receives signals from the artificial satellites 500 and measures the position (absolute position) of the aircraft based on the signals. The position measurement unit 111 measures its current position using, for example, GNSS (Global Navigation Satellite System), GPS (Global Positioning System), etc., but is not limited to this. As a method for measuring the position, for example, RTK-GNSS (Real Time Kinematic - Global Navigation Satellite System) can also be used. The position information includes at least two-dimensional coordinate information in a planar view (e.g., latitude, longitude), and preferably includes three-dimensional coordinate information including altitude information.

 また、RTK等の相対測位に用いる固定局の基準点の情報を提供する基地局800がドローン100及び操縦器200と無線通信可能に接続されることで、ドローン100の位置をより高い精度で計測することが可能となる。ここで、VRS(Virtual Reference Station)による仮想基準点方式を用いたRTK計測を行う場合には、基地局800を省略すること、又は、基地局800又はドローン100の位置座標推定の精度をさらに向上することができる。 In addition, the base station 800, which provides information on the reference points of fixed stations used for relative positioning such as RTK, is connected to the drone 100 and the controller 200 so as to be able to communicate wirelessly with them, making it possible to measure the position of the drone 100 with greater accuracy. Here, when performing RTK measurement using a virtual reference point method using a VRS (Virtual Reference Station), the base station 800 can be omitted, or the accuracy of the position coordinate estimation of the base station 800 or the drone 100 can be further improved.
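The conversion from a GNSS fix to field-relative coordinates, on which position-based flight control implicitly relies, might look roughly like the following. This is a generic flat-earth approximation, not code from the embodiment; the function name and the choice of origin are hypothetical:

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius

def to_local_xy(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Convert a GNSS latitude/longitude fix to metres east/north of an origin.

    Uses a flat-earth (equirectangular) approximation, which is adequate over
    the few hundred metres spanned by a stadium, where curvature is negligible.
    """
    d_lat = math.radians(lat_deg - origin_lat_deg)
    d_lon = math.radians(lon_deg - origin_lon_deg)
    x_east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat_deg))
    y_north = EARTH_RADIUS_M * d_lat
    return x_east, y_north
```

Placing the origin at a field landmark (for example, a corner of the court) makes the shooting positions and geofence boundaries easy to express in metres.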

 方位測定部112は、機体の向き(機首方向、ヘディング方向)を測定する。方位測定部112は、例えば地磁気の測定によりドローン100の機体の機首方位(ヘディング方向)を測定する地磁気センサ、コンパス等で構成される。 The orientation measurement unit 112 measures the orientation of the aircraft (nose direction, heading direction). The orientation measurement unit 112 is composed of a geomagnetic sensor that measures the nose direction (heading direction) of the drone 100 aircraft by measuring geomagnetism, a compass, etc.

 高度測定部113は、ドローン100下方(鉛直下向き)の地面に対する距離としての対地高度(以下「飛行高度」ともいう。)を測定する。 The altitude measurement unit 113 measures the altitude above the ground (hereinafter also referred to as "flight altitude") as the distance from the ground below the drone 100 (vertically downward).

 速度測定部114は、ドローン100の飛行速度を検出する。速度測定部114は、例えばジャイロセンサ等公知のセンサにより速度を測定してよい。 The speed measurement unit 114 detects the flight speed of the drone 100. The speed measurement unit 114 may measure the speed using a known sensor such as a gyro sensor.

(A-1-2-3.飛行機能部120)
 飛行機能部120は、ドローン100を飛行させる機構および機能部であり、ドローン100を浮上させて、所望の方向に移動するための推力を機体に発生させる。図2及び図3に示すように、飛行機能部120は、複数のモータ121と、複数のプロペラ122と、飛行制御部123と、を有する。
(A-1-2-3. Flight function unit 120)
The flight function unit 120 is a mechanism and function unit that causes the drone 100 to fly, and generates thrust in the airframe for lifting the drone 100 and moving it in a desired direction. As shown in Figures 2 and 3, the flight function unit 120 has a plurality of motors 121, a plurality of propellers 122, and a flight control unit 123.

 飛行制御部123は、複数のモータ121を独立して制御することにより各プロペラ122を回転させ、ドローン100に浮上、前進、旋回、着陸等の各動作を行わせ、離陸から飛行中、着陸までのドローン100の姿勢角制御及び飛行動作を制御する。 The flight control unit 123 independently controls the multiple motors 121 to rotate each propeller 122, causing the drone 100 to perform various operations such as taking off, moving forward, turning, and landing, and controls the attitude angle control and flight operations of the drone 100 from takeoff, during flight, and until landing.

 飛行制御部123は、フライトコントローラとも呼ばれる処理ユニットを有する。処理ユニットは、プログラマブルプロセッサ(例えば、中央処理ユニット(CPU)、MPU又はDSP)等の1つ以上のプロセッサを有することができる。処理ユニットは、メモリ(記憶部)にアクセス可能である。メモリは、1つ以上のステップを行うために処理ユニットが実行可能であるロジック、コード、及び/又はプログラム命令を記憶している。メモリは、例えば、SDカードやRAM等の分離可能な媒体又は外部の記憶装置を含んでいてもよい。測定部110により取得される各種データ、又は撮影用カメラ141で撮影した動画もしくは静止画のデータは、当該メモリに直接に伝達され且つ記憶されてもよい。なお、各データは外部メモリに記録することもできる。 The flight control unit 123 has a processing unit, also called a flight controller. The processing unit can have one or more processors, such as a programmable processor (e.g., a central processing unit (CPU), MPU, or DSP). The processing unit has access to a memory (storage unit). The memory stores logic, code, and/or program instructions that the processing unit can execute to perform one or more steps. The memory may include, for example, a separable medium such as an SD card or RAM, or an external storage device. Various data acquired by the measurement unit 110, or video or still image data captured by the imaging camera 141, may be directly transmitted to and stored in the memory. Each data may also be recorded in an external memory.

 処理ユニットは、ドローン100の機体の状態を制御するように構成された制御モジュールを含んでいる。例えば、制御モジュールは、6自由度(並進運動x、y及びz、並びに回転運動θx、θy及びθz)を有するドローン100の空間的配置、姿勢角、角速度、角加速度、角躍度及び/又は加速度を調整するためにドローン100の飛行機能部120(推力発生部)を制御する。 The processing unit includes a control module configured to control the state of the drone 100. For example, the control module controls the flight function unit 120 (thrust generating unit) of the drone 100 to adjust the spatial arrangement, attitude angle, angular velocity, angular acceleration, angular jerk, and/or acceleration of the drone 100, which has six degrees of freedom (translational motion x, y, and z, and rotational motion θx, θy, and θz).

 飛行制御部123は、操縦器200からの操縦信号に基づいて、又は予め設定された自律飛行プログラムに基づいて、ドローン100の飛行を制御することができる。また飛行制御部123は、撮影対象フィールド、飛行許可/禁止エリア、これに対応する飛行ジオフェンスの情報、2次元又は3次元の地図データを含む地図情報、ドローン100の現在の位置情報、姿勢情報(機首方位情報)、速度情報、及び加速度情報等の各種情報及びこれらの任意の組み合わせに基づいてモータ121を制御することにより、ドローン100の飛行を制御することができる。 The flight control unit 123 can control the flight of the drone 100 based on control signals from the pilot 200 or based on a preset autonomous flight program. The flight control unit 123 can also control the flight of the drone 100 by controlling the motor 121 based on various information such as the field to be photographed, flight permitted/prohibited areas, information on the corresponding flight geofences, map information including two-dimensional or three-dimensional map data, the current position information of the drone 100, attitude information (heading information), speed information, and acceleration information, and any combination of these.
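The selection between a pilot command and the autonomous program, combined with the geofence limits mentioned above, could be sketched as follows. All names, and the axis-aligned box representation of the fence, are hypothetical simplifications rather than the embodiment's actual control law:

```python
def clamp(value, lower, upper):
    """Limit a scalar to the closed interval [lower, upper]."""
    return max(lower, min(upper, value))

def next_setpoint(mode, manual_target, planned_target, fence):
    """Choose the next position setpoint and keep it inside the active geofence.

    `mode` selects between the pilot's command ("manual") and the autonomous
    flight program ("auto"); `fence` holds min/max bounds for each axis.
    Only the source-selection and limiting step is sketched, not a controller.
    """
    x, y, z = manual_target if mode == "manual" else planned_target
    return (
        clamp(x, fence["x_min"], fence["x_max"]),
        clamp(y, fence["y_min"], fence["y_max"]),
        clamp(z, fence["z_min"], fence["z_max"]),
    )
```

Clamping the setpoint before it reaches the motor control loop is one simple way to guarantee the airframe is never commanded beyond the flight-permitted area.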

●撮影対象フィールドと撮影位置の例
 本明細書において、「撮影対象フィールド」又は「対象エリア」は、撮影対象となる2次元の場所(例えば、競技場F)を意味する。
Examples of Shooting Target Field and Shooting Positions In this specification, the term "shooting target field" or "target area" refers to a two-dimensional location (for example, the stadium F) that is the subject of shooting.

 図7は、ドローンが飛行する撮影対象フィールドの例である競技場Fの1例を示す模式図であり、同図は競技場Fを上から見た図である。競技場Fは、例えば直線状の外縁により区画される略矩形のコートF100と、コートF100の外縁を覆う所定の領域であるコート外領域F200により構成される。コートF100の外縁は、互いに向かい合うゴールラインF110a、F110bと、互いに向かい合うタッチラインF111a、F111bと、が略直角にそれぞれ接続されることにより構成される。ゴールラインF110a、F110bとタッチラインF111a、F111bの接続点は、コーナーF112a、F113a、F112b、F113bとなっている。 FIG. 7 is a schematic diagram showing an example of the stadium F, an example of a field to be photographed over which the drone flies, viewed from above. The stadium F is composed of a substantially rectangular court F100 defined by, for example, straight outer edges, and an outer court area F200, which is a predetermined region surrounding the outer edge of the court F100. The outer edge of the court F100 is formed by mutually opposing goal lines F110a, F110b and mutually opposing touchlines F111a, F111b, each connected at substantially right angles. The connection points of the goal lines F110a, F110b and the touchlines F111a, F111b form the corners F112a, F113a, F112b, F113b.

 1対のゴールラインF110a、F110bの略中央にはそれぞれゴールF120a、F120bが設けられている。コートF100の内部であってゴールF120a、F120bに連続する所定領域にはそれぞれペナルティエリアF130a、F130bが規定され、当該ペナルティエリアの外縁にはペナルティラインF140a、F140bが描画されている。 Goals F120a, F120b are provided approximately in the center of the pair of goal lines F110a, F110b. Penalty areas F130a, F130b are defined in specific areas inside the court F100 adjacent to the goals F120a, F120b, and penalty lines F140a, F140b are drawn on the outer edges of the penalty areas.

 コートF100の中央には、1対のタッチラインF111a、F111bの中点間を接続することでコートF100を略等分するハーフウェーラインF150が描画されている。ハーフウェーラインF150は、ゴールラインF110a、F110bと略平行である。 A halfway line F150 is drawn in the center of the court F100, connecting the midpoints of a pair of touchlines F111a, F111b and dividing the court F100 into approximately equal parts. The halfway line F150 is approximately parallel to the goal lines F110a, F110b.

 なお、ゴールラインF110a、F110b、タッチラインF111a、F111b、ペナルティラインF140a、F140bおよびハーフウェーラインF150は、競技者が競技を行うためにルール上必要な線であるため、いずれの線も視認できる態様で描画されることが一般的であるが、本発明の技術的範囲はこれに限られない。また、本説明においてはサッカーの競技場を例に説明するが、本発明にかかるシステムにより撮影される競技はサッカーに限られず、テニス等任意のあらゆる競技を含む。さらに、撮影対象はスポーツに限られず、その他の催物(コンサート、式典等)にも適用することが可能である。 Note that the goal lines F110a, F110b, touchlines F111a, F111b, penalty lines F140a, F140b, and halfway line F150 are required by the rules for players to play the game, and therefore all of these lines are generally drawn in a manner that allows them to be seen, but the technical scope of the present invention is not limited to this. Also, in this explanation, a soccer stadium is used as an example, but the sports that are photographed by the system of the present invention are not limited to soccer, and include any type of sports, such as tennis. Furthermore, the subject of the photography is not limited to sports, and the system can also be applied to other events (concerts, ceremonies, etc.).

 競技場Fには、あらかじめ定められた複数の撮影位置L101~L105、L206~L215が定義されている。撮影位置L101~L105、L206~L215は、平面上の2次元座標であってもよいし、当該位置における高さも合わせて定義された3次元座標の情報でもよい。ドローン100の飛行高さは、操縦器200からの入力に基づいて手動で制御可能になっていてもよい。 In the stadium F, multiple predetermined shooting positions L101-L105, L206-L215 are defined. The shooting positions L101-L105, L206-L215 may be two-dimensional coordinates on a plane, or may be three-dimensional coordinate information that also defines the height at the corresponding positions. The flight height of the drone 100 may be manually controllable based on input from the controller 200.
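One possible way to represent these predefined shooting positions, allowing either a 2-D coordinate (with manually commanded flight height) or a full 3-D coordinate, is sketched below. The class and field names are hypothetical illustrations, not part of the embodiment:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ShootingPosition:
    """A predefined shooting position such as L101-L105 or L206-L215.

    `altitude_m` is optional: when it is None, only the planar 2-D coordinate
    is fixed, and the flight height can be commanded manually from the
    controller, as described above.
    """
    name: str
    x_m: float  # along the touchline axis, in a field-fixed frame
    y_m: float  # along the goal-line axis, in a field-fixed frame
    altitude_m: Optional[float] = None

    def is_3d(self) -> bool:
        return self.altitude_m is not None
```

For example, `ShootingPosition("L101", 52.5, 0.0)` would be a 2-D position whose height is left to the pilot, while adding an `altitude_m` value pins the position in all three axes (the numeric coordinates here are invented for illustration).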

 撮影位置L101~L105は、例えばタッチラインF111b上に、タッチラインF111bに沿って略等間隔に定義されている。例えば、撮影位置L101は、ハーフウェーラインF150とタッチラインF111bの交点およびコートF100からやや外側を含む範囲に位置する地点である。撮影位置L103、L105は、タッチラインF111b両側のコーナーF112a又はF112b付近の地点である。撮影位置L102、L104は、撮影位置L103、L105と撮影位置L101の間の地点である。なお、上述の位置は例示であり、これに限られず適宜の位置であってよい。 Photographing positions L101 to L105 are defined, for example, on the touchline F111b, at approximately equal intervals along the touchline F111b. For example, photographing position L101 is a point located in a range including the intersection of the halfway line F150 and the touchline F111b and slightly outside the court F100. Photographing positions L103 and L105 are points near the corners F112a or F112b on both sides of the touchline F111b. Photographing positions L102 and L104 are points between photographing positions L103 and L105 and photographing position L101. Note that the above positions are merely examples, and are not limited to these and may be any appropriate positions.

 撮影位置L206~L215は、コートF100内部に定義される地点である。例えば、撮影位置L206、L211は、ペナルティラインF140a、F140bのうちゴールラインF110a、F110bと平行な線上の中央付近の地点であり、いわゆるゴール前撮影位置である。撮影位置L207、L212は、撮影位置L206、L211よりもタッチラインF111a又はF111b寄りかつハーフウェーラインF150寄りの撮影位置である。より具体的には例えば、撮影位置L207、L212は、撮影位置L101とゴールF120a、F120bとを結んだ仮想線分上の地点であり、例えば当該仮想線分の略中央の地点である。撮影位置L209、L215は、撮影位置L207、L212と線対称の地点である。撮影位置L208は、撮影位置L207とハーフウェーラインF150との間の地点、撮影位置L210は、撮影位置L209とハーフウェーラインF150との間の地点、撮影位置L213は、撮影位置L212とハーフウェーラインF150との間の地点、撮影位置L214は、撮影位置L215とハーフウェーラインF150との間の地点である。 Shooting positions L206 to L215 are points defined inside the court F100. For example, shooting positions L206 and L211 are points near the center of the portions of the penalty lines F140a and F140b that run parallel to the goal lines F110a and F110b, i.e., the so-called goal-front shooting positions. Shooting positions L207 and L212 are positions closer to touchline F111a or F111b, and closer to the halfway line F150, than shooting positions L206 and L211. More specifically, for example, shooting positions L207 and L212 are points on an imaginary line segment connecting shooting position L101 with goal F120a or F120b, for example a point approximately at the center of that imaginary line segment. Shooting positions L209 and L215 are points line-symmetric to shooting positions L207 and L212. Shooting position L208 is a point between shooting position L207 and the halfway line F150, shooting position L210 is a point between shooting position L209 and the halfway line F150, shooting position L213 is a point between shooting position L212 and the halfway line F150, and shooting position L214 is a point between shooting position L215 and the halfway line F150.

 コート外領域F200には、ドローン100又はシステム1の異常や故障を検知した場合に、ドローン100を退避させる退避地点H200が設定されている。ここにいう異常とは、ドローン100の空中移動の安定性に関する異常である。当該異常は、例えば、ドローン100の動作制御(挙動制御、撮影制御等)に伴う演算負荷が負荷閾値を上回る場合を含む。或いは、当該異常は、環境に関する一過性の異常、例えば強風等の影響によりドローン100の挙動制御値(例えば速度)の測定値が許容値を超えている場合を含んでもよい。 In the outside court area F200, an evacuation point H200 is set to which the drone 100 is to be evacuated if an abnormality or malfunction of the drone 100 or the system 1 is detected. The abnormality referred to here is an abnormality related to the stability of the aerial movement of the drone 100. The abnormality includes, for example, a case where the calculation load associated with the operation control (behavior control, shooting control, etc.) of the drone 100 exceeds a load threshold. Alternatively, the abnormality may include a transient abnormality related to the environment, such as a case where the measured value of the behavior control value (e.g. speed) of the drone 100 exceeds an allowable value due to the influence of a strong wind or the like.

 退避地点H200は、撮影位置L101~L105、L206~L215とは異なる地点に設定され、本実施形態においてはタッチラインF111aの外側に、タッチラインF111aに沿って設定されている。退避地点H200は複数あってよく、本実施例においては、3個である。退避地点H220は、ハーフウェーラインF150の延長線上付近に設定されている。退避地点H210、H230は、撮影位置L206、L211よりもゴールF120a、F120b寄りに設定されている。退避地点H210、H230は、例えば後述するジオフェンスG200に区画される領域内の端部に設定される。退避地点H200では、例えばドローン100の機体の交代やドローン100に搭載されているバッテリの交換が行われる。 The evacuation point H200 is set at a point different from the shooting positions L101 to L105 and L206 to L215, and in this embodiment, it is set outside the touchline F111a and along the touchline F111a. There may be multiple evacuation points H200, and in this embodiment, there are three. The evacuation point H220 is set near the extension of the halfway line F150. The evacuation points H210 and H230 are set closer to the goals F120a and F120b than the shooting positions L206 and L211. The evacuation points H210 and H230 are set at the ends of an area partitioned by a geofence G200, which will be described later, for example. At the evacuation point H200, for example, the drone 100 is replaced or the battery installed in the drone 100 is replaced.
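The anomaly test and the choice of an evacuation point could be sketched as follows. The thresholds and the nearest-point rule are illustrative assumptions; the embodiment does not specify how a particular evacuation point is chosen among H210, H220, and H230:

```python
import math

def needs_evacuation(cpu_load, load_limit, speed_error, speed_tolerance):
    """Anomaly check: computational overload, or a behavior control value
    (e.g. measured speed vs. its commanded value, as in strong wind)
    exceeding its allowed tolerance."""
    return cpu_load > load_limit or abs(speed_error) > speed_tolerance

def nearest_evacuation_point(drone_xy, evacuation_points):
    """Pick the closest evacuation point (e.g. H210, H220, H230)."""
    return min(evacuation_points, key=lambda p: math.dist(drone_xy, p))
```

Flying to the nearest evacuation point minimizes the time an unstable airframe spends over the court, after which the airframe or its battery can be swapped.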

 競技場Fにおいては、少なくとも各撮影位置L101~L105、L206~L215を包含する複数のジオフェンスG100、G200が設定されている。ジオフェンスは、領域を区画する仮想的な境界線を示すものであり、特に、本実施形態におけるジオフェンスは、ドローン100が飛行又は移動が許可される飛行許可エリアと飛行禁止エリアの境界線のフェンスを示す。ジオフェンスは平面および高さを含む3次元的に広がる領域を区画する境界線である。ドローン100等の移動体がジオフェンスに接触した場合には、飛行許可エリアの外側に機体が飛び出さないように飛行又は移動が制限される。 In the stadium F, multiple geofences G100, G200 are set, encompassing at least each of the shooting positions L101-L105, L206-L215. A geofence indicates a virtual boundary line that divides an area, and in particular, the geofence in this embodiment indicates a fence that is the boundary line between a flight-permitted area, where the drone 100 is permitted to fly or move, and a no-fly area. A geofence is a boundary line that divides an area that extends three-dimensionally, including in plane and height. If a moving object such as the drone 100 comes into contact with a geofence, flight or movement is restricted to prevent the aircraft from flying outside the flight-permitted area.

 ジオフェンスの高さ方向の境界線は、上限および下限を含んでいてよい。本実施形態では、飛行可否に適用されるジオフェンスG100、G200は、ドローン100の飛行中に、システム1の制御に応じて切り替えられる。同図に描画されるジオフェンスG100、G200の数は2個であるが、個数は任意であり、具体的には3個以上であってもよい。 The boundary line of the geofence in the height direction may include an upper limit and a lower limit. In this embodiment, the geofences G100, G200 that are applied to whether or not flight is permitted are switched according to the control of the system 1 while the drone 100 is flying. The number of geofences G100, G200 depicted in the figure is two, but the number is arbitrary, and specifically may be three or more.

 ジオフェンスG100は、撮影位置L101~L105を包含する領域であり、タッチラインF111b上およびその近傍の領域を包含する領域を区画している。言い換えれば、ジオフェンスG100は、コートF100の外縁付近に規定され、一部はコート外領域F200に伸び出ている。このジオフェンスG100は、後述する外縁飛行モードM102において主に適用されるジオフェンスである。ジオフェンスG200は、撮影位置L206~L215を包含する領域であり、少なくともコートF100の内部に設定されている。このジオフェンスG200は、後述するコート内飛行モードM105において主に適用されるジオフェンスである。 The geofence G100 is an area encompassing the shooting positions L101 to L105, and defines a region including the touchline F111b and its vicinity. In other words, the geofence G100 is defined near the outer edge of the court F100, with a portion extending into the outer court area F200. This geofence G100 is the geofence primarily applied in the outer-edge flight mode M102, described later. The geofence G200 is an area encompassing the shooting positions L206 to L215, and is set at least inside the court F100. This geofence G200 is the geofence primarily applied in the on-court flight mode M105, described later.

The areas delimited by the geofences G100 and G200 at least partially contact or overlap each other, and they also overlap in the height direction. The heights of the geofences G100 and G200 may differ from each other. Specifically, the lower altitude limit of the geofence G200, set inside the stadium F, is set higher than the lower altitude limit of the geofence G100, set on the outer edge of the stadium F. Because subjects such as players are highly likely to be present inside the stadium F, raising the lower altitude limit of the geofence and flying the drone 100 at a sufficient height prevents it from interfering with the subjects' movements or colliding with a subject or the ball.
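The containment logic implied by these geofences can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: each geofence is modeled as an axis-aligned box in a local field coordinate system, with G200 given a higher altitude floor than G100 as described above, and all coordinates are hypothetical.

```python
# Illustrative 3D geofence check. A geofence is modeled as a horizontal
# rectangle (x/y in meters, local field coordinates) plus lower and upper
# altitude limits. All numbers below are hypothetical examples.

from dataclasses import dataclass


@dataclass
class Geofence:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    alt_min: float  # lower altitude limit [m]
    alt_max: float  # upper altitude limit [m]

    def contains(self, x: float, y: float, alt: float) -> bool:
        """True if the position lies inside the flight-permitted volume."""
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.alt_min <= alt <= self.alt_max)


# G100: along the touchline, permitted down to a low altitude.
g100 = Geofence(-5, 5, 0, 100, alt_min=2, alt_max=30)
# G200: over the court interior, with a raised altitude floor so the
# drone stays well above the players.
g200 = Geofence(0, 60, 0, 100, alt_min=10, alt_max=30)
```

A flight controller would then restrict movement whenever the active geofence's `contains` check fails for the commanded position.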

(A-1-2-4. Obstacle detection unit 130)
Returning to the description of FIG. 3, the obstacle detection unit 130 is a functional unit that detects obstacles around the drone 100. Obstacles may include, for example, people (such as players), objects, animals such as birds, fixed equipment, and the ball. The obstacle detection unit 130 measures the position, velocity vector, and the like of obstacles located, for example, below the drone 100, based on acquired images.

The obstacle detection unit 130 includes, for example, an obstacle detection camera 131, a ToF (Time of Flight) sensor 132, and a laser sensor 133. The ToF sensor 132 measures the time from when a laser pulse is emitted by the sensor until it returns to the light-receiving element inside the sensor, and converts that time into a distance to determine the distance to an object. The laser sensor 133 irradiates a target with light such as near-infrared, visible, or ultraviolet light, for example by the LiDAR (Light Detection and Ranging) method, and measures the distance by capturing the reflected light with an optical sensor.
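The time-to-distance conversion performed by the ToF sensor 132 follows the standard round-trip relation d = c·Δt/2: the measured round-trip time of the pulse is halved and multiplied by the speed of light. A minimal sketch:

```python
# ToF conversion as described above: distance = (speed of light) x
# (round-trip time) / 2. The function name is illustrative.

C = 299_792_458.0  # speed of light in a vacuum [m/s]


def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting object [m] from the pulse round-trip time."""
    return C * round_trip_seconds / 2.0
```

For example, a pulse returning after roughly 66.7 ns corresponds to an object about 10 m away.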

FIG. 2 shows the obstacle detection camera 131 arranged facing forward in this embodiment, but the type, position, and number of the camera 131, the ToF sensor 132, and the laser sensor 133 are arbitrary: a ToF sensor 132 or laser sensor 133 may be arranged in place of the camera 131, and ToF sensors 132 or laser sensors 133 may be provided on all six faces of the housing 101, that is, the front, rear, top, bottom, and both side faces.

(A-1-2-5. Photographing unit 140)
The photographing unit 140 is a functional unit that photographs video of a competition in the stadium F (FIG. 7), an event at an event venue, or the like, and includes a photographing camera 141, a camera holding unit 142, and a photographing control unit 143. As shown in FIG. 2, the photographing camera 141 (imaging device) is arranged on the lower part of the main body of the drone 100 and outputs image data of peripheral images captured around the drone 100. The photographing camera 141 is a video camera (color camera) that shoots moving images. The moving images may include audio data acquired by a microphone (not shown). In addition to or instead of this, the photographing camera 141 may also capture still images.

The orientation of the photographing camera 141 (its attitude relative to the housing 101 of the drone 100) can be adjusted by a camera actuator (not shown) built into the camera holding unit 142. The photographing camera 141 may have automatic control of parameters such as exposure, contrast, or ISO. The camera holding unit 142 may have a so-called gimbal control mechanism that suppresses transmission of shaking or vibration of the aircraft to the photographing camera 141. The photographing control unit 143 controls the photographing camera 141 and the camera holding unit 142 to adjust the orientation of the photographing camera 141, the photographing magnification (zoom amount), the camera's shooting conditions, and the like. Image data acquired by the photographing camera 141 can be transmitted to the storage unit of the drone 100 itself, the controller 200, the server 300, and the like.

(A-1-2-6. Communication unit 150)
The communication unit 150 is capable of radio communication via the communication network 400 and includes, for example, a radio communication module. The communication unit 150 can communicate with the controller 200 and the like via the communication network 400 (including the wireless base station 800).

(A-1-3. Controller 200)
(A-1-3-1. Overview of the controller 200)
FIG. 4 is a simplified front external view of the controller 200 of this embodiment. FIG. 5 is a functional configuration diagram of the controller 200 of this embodiment. The controller 200 is a portable information terminal that controls the drone 100 according to the operator's input and displays information received from the drone 100 (for example, position, altitude, remaining battery level, and camera video). In this embodiment, the flight state (altitude, attitude, etc.) of the drone 100 may be remotely controlled by the controller 200 or may be controlled autonomously by the drone 100. For example, when the operator transmits a flight command to the drone 100 via the controller 200, the drone 100 performs autonomous flight. Manual operation may also be possible during basic operations such as takeoff and return, and in an emergency.

As shown in FIG. 4, the controller 200 includes a display unit 201 and an input unit 202 as its hardware configuration. The display unit 201 and the input unit 202 are connected so that they can communicate with each other by wire or wirelessly. The display unit 201 may be a touch panel, a liquid crystal monitor, or the like integrated into the controller 200, or may be a display device such as a liquid crystal monitor, tablet terminal, or smartphone connected to the controller 200 by wire or wirelessly. The display unit 201 as hardware may integrally incorporate an element that accepts input such as touch, forming a touch panel display.

The input unit 202 is a mechanism through which the operator inputs operation commands such as flight direction and takeoff/landing when piloting the drone 100.
As shown in FIG. 4(a), the input unit 202 has a left slider 326L, a right slider 326R, a left input stick 327L, a right input stick 327R, a power button 328, and a return button 329. The left slider 326L and the right slider 326R are operators that accept, for example, a 0/1 input or a one-dimensional stepless or stepwise input; the operator slides them with the left and right index fingers, for example, while holding the controller 200 in the hands. The left input stick 327L and the right input stick 327R are operators, such as so-called joysticks, that accept multi-dimensional stepless or stepwise input. The left input stick 327L and the right input stick 327R may also accept a 0/1 input when pressed. The power button 328 and the return button 329 are operators that accept presses and are constituted by mechanical switches or the like.

The left input stick 327L and the right input stick 327R accept input operations that command three-dimensional flight actions of the drone 100, including, for example, takeoff, landing, ascent, descent, right turn, left turn, forward movement, backward movement, leftward movement, and rightward movement. FIG. 4(b) is a schematic diagram showing the movement or turning directions of the drone 100 corresponding to the respective inputs of the left input stick 327L and the right input stick 327R shown in FIG. 4(a). Note that this correspondence is an example.

As shown in FIG. 5, the controller 200 includes an arithmetic device such as a CPU for executing information processing and storage devices such as RAM and ROM, and thereby constitutes, as its software configuration, the functional blocks of mainly a display control unit 210, an input control unit 220, and a communication unit 240.

(A-1-3-2. Display control unit 210)
The display control unit 210 displays to the operator status information of the drone 100 and the like acquired from the drone 100 or the server 300. The display control unit 210 can display images relating to various information such as the field to be photographed, flight-permitted/prohibited areas, flight geofences, map information, the current position information of the drone 100, attitude information (direction information), speed information, acceleration information, and remaining battery level. The "current position information" here need only include information on the horizontal position of the drone 100's current position (that is, latitude and longitude), and need not include altitude information (absolute or relative altitude).

The display control unit 210 has a mode display unit 211 and a shooting state display unit 212.

The mode display unit 211 is a functional unit that causes the display unit 201 to display at least the state, that is, the mode, to which the drone 100 belongs. The mode to which the drone 100 belongs is, for example, the flight mode shown in FIG. 8; instead of or in addition to this, the aircraft state shown in FIG. 9, the aircraft action state shown in FIG. 10, the match state shown in FIG. 11, or the offense/defense state shown in FIG. 12 may be displayed on the display unit 201.

As shown in FIG. 18, the screen G1 displayed on the display unit 201 includes, for example, a display field G21 for the flight mode to which the drone 100 belongs, and a state display field G22 showing the aircraft state, aircraft action state, match state, and offense/defense state.

The shooting state display unit 212 shown in FIG. 5 is a functional unit that causes the display unit 201 to display video captured by the photographing camera 141 mounted on the drone 100. As shown in FIG. 18, the screen G1 displayed on the display unit 201 includes, for example, a video field G40 in which the image being captured by the drone 100 is displayed.
The screen G1 and each state will be described in detail later.

(A-1-3-3. Input control unit 220)
The input control unit 220 shown in FIG. 5 accepts various inputs from a user such as the operator.
The input control unit 220 of this embodiment mainly has the following functional units: a flight body position operation unit 221, a flight body attitude operation unit 222, a camera attitude operation unit 223, a camera zoom operation unit 224, a flight mode switching unit 225, a target position receiving unit 226, a power input unit 227, and a return input unit 228.

The flight body position operation unit 221 includes an up/down movement input unit 221a and a left/right movement input unit 221b. The flight body attitude operation unit 222 includes a forward/backward movement input unit 222a and a yaw turn input unit 222b.

The up/down movement input unit 221a is an input unit by which the operator moves the drone 100 up and down, and acquires input to the right input stick 327R. That is, when the right input stick 327R is moved upward (away from the operator when the controller is held in the hands), the drone 100 ascends, and when the right input stick 327R is moved downward (toward the operator), the drone 100 descends. The left/right movement input unit 221b is an input unit by which the operator moves the drone 100 left and right, and likewise acquires input to the right input stick 327R: when the right input stick 327R is moved to the right, the drone 100 moves right, and when it is moved to the left, the drone 100 moves left.

The forward/backward movement input unit 222a is an input unit by which the operator moves the drone 100 forward and backward, and acquires input to the left input stick 327L. That is, when the left input stick 327L is moved upward (away from the operator when the controller is held in the hands), the drone 100 moves forward, and when it is moved downward (toward the operator), the drone 100 moves backward. The yaw turn input unit 222b is an input unit by which the operator makes the drone 100 turn in yaw, and likewise acquires input to the left input stick 327L: when the left input stick 327L is moved to the right, the drone 100 turns right, and when it is moved to the left, the drone 100 turns left.
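The stick assignment described in the two paragraphs above can be summarized as a simple mapping. The following is an illustrative sketch assuming a [-1, 1] deflection convention per stick axis; the function and key names are hypothetical and do not represent the controller's actual interface.

```python
# Sketch of the stick-to-command assignment described above:
# right stick vertical axis -> climb/descend, right stick horizontal
# axis -> left/right movement, left stick vertical axis -> forward/
# backward, left stick horizontal axis -> yaw turn.

def stick_commands(left_x: float, left_y: float,
                   right_x: float, right_y: float) -> dict:
    """Translate stick deflections in [-1, 1] into named motion commands."""
    return {
        "climb":   right_y,   # up -> ascend, down -> descend
        "lateral": right_x,   # right -> move right, left -> move left
        "forward": left_y,    # up -> forward, down -> backward
        "yaw":     left_x,    # right -> turn right, left -> turn left
    }
```

A flight controller would scale these normalized commands into velocity or attitude setpoints.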

The camera attitude operation unit 223 is an input unit for operating the camera holding unit 142 via the photographing control unit 143 to control the orientation of the photographing camera 141 relative to the housing 101 of the drone 100. The camera attitude operation unit 223 acquires input to the right slider 326R and accepts operation of either or both of the pitch angle and yaw angle of the photographing camera 141 relative to the housing 101.

The camera zoom operation unit 224 is an input unit for operating the photographing magnification, that is, the zoom amount, of the photographing camera 141, and acquires input to the left slider 326L.

The flight mode switching unit 225 is an input unit for switching flight modes. The flight modes selectable by the flight mode switching unit 225 include at least, for example, the outer edge flight mode M102 (see FIG. 8), the on-court flight mode M105 (see FIG. 8), and the fixed position flight modes M103 and M107 (see FIG. 8). The flight mode switching unit 225 accepts switching of the flight mode via, for example, a touch panel display integrated with the display unit 201.

The target position receiving unit 226 is a functional unit that receives input of a target shooting position to which the drone 100 should head. The target position receiving unit 226 receives input of a point on the stadium F. For example, while at least part of a video image or schematic diagram of the stadium F is displayed on the display unit 201, the target position receiving unit 226 may receive input of the target shooting position via a touch panel display integrated with the display unit 201. When the points selectable as target shooting positions, that is, the shooting positions, are defined in advance, the target position receiving unit 226 may receive a selection input of the target shooting position.
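When the shooting positions are defined in advance, one plausible way to realize the selection input is to resolve a touched point on the displayed field to the nearest predefined position. The following is a minimal sketch under that assumption; the coordinates assigned to L101 to L105 are entirely hypothetical placeholders.

```python
# Sketch: snap a touched field coordinate to the nearest predefined
# shooting position. Coordinates (x, y) are in field meters and are
# illustrative only.

import math

SHOOTING_POSITIONS = {
    "L101": (0, 0),
    "L102": (0, 25),
    "L103": (0, 50),
    "L104": (0, 75),
    "L105": (0, 100),
}


def nearest_shooting_position(touch_x: float, touch_y: float) -> str:
    """Return the name of the predefined position closest to the touch."""
    return min(SHOOTING_POSITIONS,
               key=lambda name: math.dist(SHOOTING_POSITIONS[name],
                                          (touch_x, touch_y)))
```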

● Flight modes
Here, the types of flight modes set for the drone 100 and an example of their state transitions will be described.
As shown in FIG. 8, the flight modes of the drone 100 mainly include a pre-preparation mode M100, an off-court takeoff and landing mode M101, an outer edge flight mode M102, an off-court fixed position flight mode M103, an on-court entry mode M104, an on-court flight mode M105, an off-court exit mode M106, an on-court fixed position flight mode M107, and an on-court takeoff and landing mode M108.

The pre-preparation mode M100 is a mode in which advance settings such as geofence settings are made. The pre-preparation mode M100 transitions to the off-court takeoff and landing mode M101. In the off-court takeoff and landing mode M101, the drone 100 takes off from the point L101g (see FIG. 14). Note that in the off-court takeoff and landing mode M101, the drone 100 may instead take off from a point outside the court F100 other than the point L101g.

The off-court takeoff and landing mode M101 is the mode to which the drone 100 belongs when control starts or ends. The drone 100 transitions from the off-court takeoff and landing mode M101 to the outer edge flight mode M102.

The outer edge flight mode M102 is a mode of flying above part or all of the outer edge of the court F100 to photograph the stadium F; specifically, the drone flies and shoots at one of the shooting positions L101 to L105 (FIG. 14). In the main embodiment of this description, the outer edge flight mode M102 is a mode of flying above the touchline F111b. However, the "outer edge" flown in the outer edge flight mode M102 is a concept that includes not only the space directly above the touchline F111b but also the area slightly outside the court F100.

In the outer edge flight mode M102, the drone receives the user's instruction via the target position receiving unit 226 of the controller 200 and flies at one of the designated shooting positions L101 to L105. The shooting direction may be manually operable according to the user's instructions or may be fixed at a predetermined angle. In the outer edge flight mode M102, the drone 100 may also follow and shoot a specific player by so-called dolly shooting, in which the shooting position of the drone 100 is changed while the shooting direction is kept fixed.

The outer edge flight mode M102 can transition to the off-court takeoff and landing mode M101, the off-court fixed position flight mode M103, or the on-court entry mode M104.

The off-court fixed position flight mode M103 is a mode of flying at a fixed position outside the area of the court F100. The off-court fixed position flight mode M103 can transition back to the outer edge flight mode M102. The on-court entry mode M104 is a mode in which the drone 100 performs the series of processes required to enter the area of the court F100. The drone 100 transitions to the on-court flight mode M105 via the on-court entry mode M104.

The on-court flight mode M105 is a mode of flying above the court F100 to photograph the stadium F; specifically, the drone flies and shoots at one of the shooting positions L206 to L215 (FIG. 7). In the on-court flight mode M105, as in the outer edge flight mode M102, the drone receives the user's selection command for a shooting position via the target position receiving unit 226 of the controller 200 and flies at one of the designated shooting positions L206 to L215. The shooting direction may be manually operable according to the user's instructions or may be fixed at a predetermined angle.

The on-court flight mode M105 can transition to the off-court exit mode M106, the on-court fixed position flight mode M107, or the on-court takeoff and landing mode M108.

The off-court exit mode M106 is a mode in which the drone 100 performs the series of processes required to exit the area of the court F100. The drone 100 transitions to the outer edge flight mode M102 via the off-court exit mode M106. Note that the off-court exit mode M106 and the on-court entry mode M104 can transition to each other.

The on-court fixed position flight mode M107 is a mode of flying at a fixed position within the area of the court F100. The on-court fixed position flight mode M107 can transition back to the on-court flight mode M105. The on-court takeoff and landing mode M108 is a mode for taking off and landing within the area of the court F100, to which the drone transitions mainly when a command to land on the spot is issued by manual intervention. A drone that has taken off in the on-court takeoff and landing mode M108 transitions to the on-court flight mode M105.
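The transitions described in this subsection form a small state machine. The following sketch collects only the transitions named in the text into a table; the mode identifiers are kept, while the function and table names are illustrative.

```python
# Flight-mode transition table built from the transitions named in the
# text (FIG. 8). Any transition not listed here is rejected.

ALLOWED_TRANSITIONS = {
    "M100": {"M101"},                  # pre-preparation -> off-court takeoff/landing
    "M101": {"M102"},                  # off-court takeoff/landing -> outer edge
    "M102": {"M101", "M103", "M104"},  # outer edge -> landing / fixed pos / entry
    "M103": {"M102"},                  # off-court fixed position -> outer edge
    "M104": {"M105", "M106"},          # entry -> on-court flight (or back to exit)
    "M105": {"M106", "M107", "M108"},  # on-court -> exit / fixed pos / landing
    "M106": {"M102", "M104"},          # exit -> outer edge (or back to entry)
    "M107": {"M105"},                  # on-court fixed position -> on-court
    "M108": {"M105"},                  # on-court takeoff -> on-court
}


def switch_mode(current: str, target: str) -> str:
    """Return the new mode, rejecting transitions not in the table."""
    if target not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {target}")
    return target
```

Guarding mode changes through such a table ensures, for example, that the drone can only reach the on-court flight mode M105 through the on-court entry mode M104.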

Returning to FIG. 5, the power input unit 227 is a functional unit that accepts powering the controller 200 on and off via the power button 328.

The return input unit 228 is a functional unit that accepts, via the return button 329, a command to return the drone 100 in the stadium F (FIG. 7) to the target landing point L101g (see FIG. 14).

Instead of or in addition to the configuration described above, the input control unit 220 may accept touch input to the display unit 201 and transmit control commands to the drone 100 according to that input. More specifically, for example, when the user performs a selection operation on appropriate information such as a map or schematic diagram displayed on the display unit 201, a route to the selected point may be generated automatically, and the drone 100 may fly to it autonomously.

(A-1-3-4. Communication unit 240)
The communication unit 240 is a functional unit that transmits and receives signals between the controller 200 and the appropriate components included in the system 1. The controller 200 has a communication function for wireless communication with the drone 100 using Wi-Fi and the 2.4 GHz and 5.6 to 5.8 GHz frequency bands. The controller 200 also has a wireless communication function that can communicate with the server 300 via the communication network 400 using a communication standard such as LTE (Long Term Evolution). The communication unit 240 transmits, for example, various input signals from a user such as the operator to the drone 100, the server 300, or the like, and receives signals from the drone 100, the server 300, or the like.

(A-1-4. Server 300)
(A-1-4-1. Overview of the server 300)
FIG. 6 is a functional configuration diagram of the server 300 of this embodiment. The server 300 manages or controls the flight and photographing of the drone 100. The server 300 includes an input/output unit (not shown) for inputting or outputting various information (image output, audio output).

The server 300 may be a general-purpose computer such as a workstation or personal computer, or may be logically realized by cloud computing.

The server 300 includes an arithmetic device such as a CPU for executing information processing and storage devices such as RAM and ROM, and thereby constitutes, as its software configuration, the functional blocks of mainly a presetting unit 310, an event detection unit 320, a shooting condition determination unit 325, a flight mode switching unit 330, an outer edge flight control unit 340, an on-court flight control unit 350, a fixed position flight control unit 360, a communication unit 370, and a storage unit 380.

(A-1-4-2. Presetting unit 310)
The presetting unit 310 is a functional unit that performs the settings necessary for the flight of the drone 100 before the drone 100 flies over the field to be photographed.
The presetting unit 310 mainly includes a geofence setting unit 311.

The geofence setting unit 311 is a functional unit that sets the geofences of the drone 100. A geofence includes information in both the planar direction and the height direction.

The geofence setting unit 311 sets a geofence according to the flight mode. For example, the geofence setting unit 311 activates the geofence G100 (see FIG. 7) in the outer edge flight mode M102 (see FIG. 8) and activates the geofence G200 (see FIG. 7) in the on-court flight mode M105 (see FIG. 8). In the intervening modes that mediate the transition between the outer edge flight mode M102 and the on-court flight mode M105, that is, the off-court exit mode M106 and the on-court entry mode M104, the geofence setting unit 311 sets a geofence different from the geofences G100 and G200, a so-called third geofence.

When the geofence of the first flight mode and the geofence of the second flight mode (the flight mode at the destination) at least partially overlap, the third geofence is a geofence covering a merged area that unites the first area defined by the geofence of the first flight mode and the second area defined by the geofence of the second flight mode. In this embodiment, the geofence G100 of the outer edge flight mode M102 and the geofence G200 of the on-court flight mode M105 overlap, so the third geofence delimits the area uniting the geofences G100 and G200.

When the geofences of the first and second flight modes do not overlap, the third geofence may instead cover the combined area consisting of the first area defined by the first geofence corresponding to the first flight mode, the second area defined by the second geofence corresponding to the second flight mode, and the gap between the two areas. With this configuration, even when transitioning between flight modes whose geofences do not overlap, the drone never leaves the geofence during the transition, ensuring safety.
The geofence setting unit 311 stores the information of the set geofence in the storage unit 380.
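The third-geofence construction described above can be sketched as follows. This is a minimal illustration under the assumption of axis-aligned box geofences; the `Geofence` class, its field names, and the example coordinates are all hypothetical, since the patent does not specify a fence representation. Taking the bounding box of the two fences yields a region covering both areas and any gap between them, which handles the overlapping and non-overlapping cases alike.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Geofence:
    """Axis-aligned box: planar extent plus a height range."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def union_box(self, other: "Geofence") -> "Geofence":
        """Smallest box covering both fences (and any gap between them),
        usable as the 'third geofence' while transitioning modes."""
        return Geofence(
            min(self.x_min, other.x_min), max(self.x_max, other.x_max),
            min(self.y_min, other.y_min), max(self.y_max, other.y_max),
            min(self.z_min, other.z_min), max(self.z_max, other.z_max),
        )

    def contains(self, x: float, y: float, z: float) -> bool:
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.z_min <= z <= self.z_max)


# Illustrative stand-ins for G100 (outer edge) and G200 (on court).
g100 = Geofence(-5, 105, -5, 70, 0, 30)
g200 = Geofence(0, 100, 0, 64, 0, 20)
g3 = g100.union_box(g200)       # third geofence for the transition
assert g3.contains(50, 32, 25)  # legal anywhere either fence allowed
```

In the overlapping case of this embodiment the bounding box coincides with the merged region of G100 and G200 when, as here, one fence encloses the other's planar extent; for disjoint fences it additionally covers the gap, as the non-overlapping variant requires.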

(A-1-4-3. Event detection unit 320)
The event detection unit 320 is a functional unit that detects the state of the subject being filmed or of the drone 100. It detects events based on images from the shooting camera 141 or on input from the external system 700. The detection criteria for each event are stored in, for example, the storage unit 380, which the event detection unit 320 consults when detecting events. The event detection unit 320 may also detect events through analysis using a neural network, and any known, appropriate image-analysis technique can be applied to its detection processing.

The event detection unit 320 detects, in particular, events that trigger a change in the flight mode or in the shooting conditions of the drone 100.
The event detection unit 320 mainly comprises an aircraft state acquisition unit 321, an aircraft behavior state acquisition unit 322, a game state acquisition unit 323, and an offense/defense state acquisition unit 324.

The aircraft state acquisition unit 321 is a functional unit that acquires the aircraft state of the drone 100.
FIG. 9 is a diagram showing the state transitions of the aircraft state of the drone 100. The aircraft state is broadly divided into, for example, a normal operation flight mode M200, a detection/judgment mode M210, and an action mode M220. When the drone 100 starts flying, it transitions to the normal operation flight mode M200.

When any detection or reception occurs in the normal operation flight mode M200, the aircraft state transitions to the detection/judgment mode M210. The detection/judgment mode M210 includes an abnormality detection mode M211, a failure detection mode M212, a manual intervention mode M213, and a low-battery mode M214.

Specifically, when an abnormality is detected in the normal operation flight mode M200, the state transitions to the abnormality detection mode M211. Such an abnormality is a transient, in other words reversible, disturbance such as a drop in radio signal strength or a strong wind. When the abnormality is resolved in the abnormality detection mode M211, the state transitions back to the normal operation flight mode M200.

When a failure of the aircraft or the system is detected in the normal operation flight mode M200, the state transitions to the failure detection mode M212. When a manual piloting command is received, the state transitions to the manual intervention mode M213, and when the remaining battery charge is detected to be below a predetermined value, the state transitions to the low-battery mode M214. Furthermore, when a manual piloting command is received in the abnormality detection mode M211, the failure detection mode M212, or the low-battery mode M214, the state transitions to the manual intervention mode M213. The drone 100 then transitions to the action mode M220 corresponding to the detection/judgment mode M210.

The action mode M220 is a state in which the drone 100 performs a series of actions preset for each state. The action mode M220 includes a landing-at-evacuation-point mode M221, an emergency stop mode M222, a land-on-the-spot mode M223, a return mode M224, and a fixed-position flight mode M225.

The landing-at-evacuation-point mode M221 is set with an action of flying the drone 100 to the evacuation point H200 and landing it there. The state transitions to this mode when the abnormality is not resolved in the abnormality detection mode M211.

The emergency stop mode M222 is set with an action of stopping the propellers 122 on the spot; in this mode the drone 100 falls freely. The emergency stop mode M222 can be selected in the manual intervention mode M213 when a propeller 122 is about to come into contact with a person or an object.

The land-on-the-spot mode M223 is set with an action of making a soft landing on the spot, and the return mode M224 with an action of returning to the takeoff/landing point.

The fixed-position flight mode M225 is a state of flying at a fixed position, from which the drone can transition to the normal operation flight mode M200 based on a user operation, input for example by selecting a button displayed on the display unit 201. If, while in the fixed-position flight mode M225, the drone detects an event that would cause a transition from the normal operation flight mode M200 to the detection/judgment mode M210, that is, an abnormality, a failure, manual intervention, or a low battery, it transitions from the fixed-position flight mode M225 through the normal operation flight mode M200 to the corresponding state of the detection/judgment mode M210. The drone 100 in the fixed-position flight mode M225 can also transition to the return mode M224 based on a user operation.

The drone 100 in the abnormality detection mode M211 or the failure detection mode M212 transitions to the landing-at-evacuation-point mode M221. The drone 100 in the manual intervention mode M213 transitions, depending on the input command, to one of the landing-at-evacuation-point mode M221, the emergency stop mode M222, the land-on-the-spot mode M223, the return mode M224, and the fixed-position flight mode M225. The drone 100 in the low-battery mode M214 transitions to the return mode M224.
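The aircraft-state transitions of FIG. 9 can be captured as a lookup table keyed by the current mode and the detected event; the sketch below is illustrative only, and the mode and event identifiers are shorthand invented here (the patent names the modes but prescribes no data structure).

```python
# Subset of the FIG. 9 aircraft-state transitions as a lookup table.
# Mode/event identifiers are shorthand invented for illustration.
TRANSITIONS = {
    ("normal_flight",        "abnormality"):    "abnormality_detected",  # M200 -> M211
    ("abnormality_detected", "resolved"):       "normal_flight",         # M211 -> M200
    ("abnormality_detected", "unresolved"):     "land_at_evacuation",    # M211 -> M221
    ("normal_flight",        "failure"):        "failure_detected",      # M200 -> M212
    ("failure_detected",     "action"):         "land_at_evacuation",    # M212 -> M221
    ("normal_flight",        "manual_command"): "manual_intervention",   # M200 -> M213
    ("normal_flight",        "low_battery"):    "battery_low",           # M200 -> M214
    ("battery_low",          "action"):         "return_home",           # M214 -> M224
}


def step(mode: str, event: str) -> str:
    """Return the next mode for (mode, event); unchanged if undefined."""
    return TRANSITIONS.get((mode, event), mode)


assert step("normal_flight", "low_battery") == "battery_low"
assert step("battery_low", "action") == "return_home"
```

A table-driven machine of this kind keeps the detection/judgment modes and their corresponding action modes in one place, which mirrors how each detection mode in the figure maps to a fixed action mode.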

Note that the drone 100 in the normal operation flight mode M200 can also transition to the return mode M224 based on a user operation, input for example by selecting a button displayed on the display unit 201.

The aircraft behavior state acquisition unit 322 is a functional unit that acquires the behavior state of the drone 100's airframe. Each mode of the behavior state is a sub-mode of the aircraft state, executed to realize the aircraft-state transitions.
FIG. 10 is a diagram showing the state transitions of the airframe's behavior state. The behavior state is broadly divided into, for example, a takeoff mode M300, an evacuation mode M310, a normal mode M320, and a landing mode M340.

The takeoff mode M300 is the mode in which the drone 100 takes off, and the behavior-state transitions start from it. Once the drone 100 has taken off, the behavior state transitions from the takeoff mode M300 to the evacuation mode M310 or the normal mode M320. The evacuation mode M310 mainly includes an evacuation-point-arrival stationary mode M311 and an evacuation-in-progress mode M312, while the normal mode M320 includes a point-arrival stationary mode M321 and a moving mode M322. The evacuation mode M310 and the normal mode M320 can transition to each other via a pause mode M330. Note that this arrangement is only one example.

When transitioning from the takeoff mode M300 to the evacuation mode M310, the drone 100 enters the evacuation-point-arrival stationary mode M311, in which it moves to the evacuation point H200 and remains stationary there, that is, hovers. When moving from the evacuation point H200 to another point for the purpose of evacuation, the drone 100 transitions from the evacuation-point-arrival stationary mode M311 to the evacuation-in-progress mode M312. When the drone 100 reaches the given destination, or when there is no movement command from manual intervention, the behavior state transitions from the evacuation-point-arrival stationary mode M311 or the evacuation-in-progress mode M312 to the pause mode M330.

Likewise, when transitioning from the takeoff mode M300 to the normal mode M320, the drone 100 enters the point-arrival stationary mode M321, in which it moves to a given destination and hovers there. When moving to another point during normal use, the drone 100 transitions from the point-arrival stationary mode M321 to the moving mode M322. When the drone 100 reaches the given destination, or when there is no movement command from manual intervention, the behavior state transitions from the point-arrival stationary mode M321 or the moving mode M322 to the pause mode M330.

The drone 100 in the evacuation-point-arrival stationary mode M311, the point-arrival stationary mode M321, the moving mode M322, or the pause mode M330 can transition to the landing mode M340, in which the behavior-state processing ends.

The game state acquisition unit 323 shown in FIG. 6 is a functional unit that acquires the state of the match being played at the stadium F. The game state acquisition unit 323 detects the game state by image-processing the captured images. It may also acquire the game state based on decision-related information that a referee enters into the external input device 600 or into a refereeing support system, which is one example of the external system 700. Furthermore, it may acquire the game state based on information entered from an external input device 600 carried by a team official, for example a manager or a coach.

FIG. 11 is a diagram showing one example of game-state transitions, here for a soccer match. The game states include a pre-match state M400, a normal play state M410, and a post-match state M460. The transitions start from the pre-match state M400 and proceed to the normal play state M410, the state in which the game is in progress. When the match ends, the state transitions from the normal play state M410 to the post-match state M460. Note that this transition may occur not only at the end of the match but also during in-match breaks such as halftime.

The game states also include a play-suspended-without-foul state M420 and a play-suspended-with-foul state M440. When play is suspended without a foul in the normal play state M410, the state transitions to the play-suspended-without-foul state M420, for example when the ball crosses the goal line F110a, F110b or the touch line F111a, F111b and goes out of the court. From the play-suspended-without-foul state M420, the state transitions to a throw-in state M421, a goal kick state M422, or a corner kick state M423, according to the event dictated by the rules of the game, such as which line the ball crossed or which team's player sent it out of the court. When the event in each of these states ends, the state returns to the normal play state M410.

When a foul occurs in the normal play state M410, or more precisely when the referee rules that a foul has occurred, the state transitions to a foul state M431; when an offside occurs or is ruled by the referee, the state transitions to an offside state M432. From the foul state M431 or the offside state M432, the state transitions to the play-suspended-with-foul state M440, and from there to a free kick state M441 or a penalty kick state M442, depending on where the foul occurred and what kind of event it was. In the free kick state M441, a so-called indirect free kick may be taken instead of a direct free kick, and the free kick state M441 may be subdivided into a free kick state for the attacking side and one for the defending side. When the event in the free kick state M441 or the penalty kick state M442 ends, the match resumes and the game state returns to the normal play state M410.

When the match ends, the normal play state M410 transitions to the post-match state M460, and the game-state transitions terminate.

The normal play state M410 can also transition to a penalty shootout state M443. Although not illustrated, the penalty shootout state M443 may transition to the post-match state M460, terminating the state transitions.

Among the game states shown in FIG. 11, some may trigger a flight-mode switch while others do not. For example, the flight mode may be switched based on transitions to the shaded states in the figure, namely the pre-match state M400, the goal kick state M422, the corner kick state M423, the free kick state M441, the penalty kick state M442, the player substitution state M450, and the post-match state M460. When the game transitions from another state back to the normal play state M410, the drone may switch to the flight mode corresponding to the current offense/defense state.
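One simple way to realize this distinction between trigger and non-trigger states is a membership test against the set of shaded states in FIG. 11; the sketch below is illustrative, with the string identifiers chosen as shorthand for the numbered states (the patent defines no such identifiers).

```python
# Game states (shaded in FIG. 11) that trigger a flight-mode switch.
# String identifiers are illustrative shorthand for the M4xx states.
MODE_SWITCH_TRIGGERS = {
    "pre_match",      # M400
    "goal_kick",      # M422
    "corner_kick",    # M423
    "free_kick",      # M441
    "penalty_kick",   # M442
    "substitution",   # M450
    "post_match",     # M460
}


def should_switch_flight_mode(new_game_state: str) -> bool:
    """True if a transition into this game state changes the flight mode."""
    return new_game_state in MODE_SWITCH_TRIGGERS


assert should_switch_flight_mode("corner_kick")
assert not should_switch_flight_mode("normal_play")
```

A return to `"normal_play"` deliberately falls outside this set, since in that case the flight mode would instead follow the offense/defense state.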

The offense/defense state acquisition unit 324 is a functional unit that acquires the offense/defense state of the teams in the match played at the stadium F. It detects the offense/defense state by image-processing the captured images. It may also acquire the offense/defense state based on decision-related information that a referee enters into the external input device 600 or into a refereeing support system, which is one example of the external system 700, or based on information entered from an external input device 600 carried by a team official, for example a manager or a coach.

FIG. 12 is a diagram showing one example of offense/defense state transitions, again for soccer. When the match starts, the offense/defense state transitions to an attacking state M510 or a defending state M520. The attacking state M510 and the defending state M520 transition to each other via an offense/defense change state M530 or an offense/defense-uncertain state M540.

The attacking state M510 is a state in which one team designated in advance (hereinafter also "team A") is on the offensive. An offensive state is, for example, a state in which team A is in possession of the ball and advancing toward the other team (hereinafter also "team B"), but it is not limited to this and may be any predetermined state judged by arbitrary criteria stored in advance in the storage unit 380.

The attacking state M510 includes a team-A-attacking (own half) state M511, a team-A-attacking (opponent's half) state M512, and a team-A fast-attack state M513. Transitions are possible between the team-A-attacking (own half) state M511 and the team-A-attacking (opponent's half) state M512, and between the team-A-attacking (own half) state M511 and the team-A fast-attack state M513; a transition from the team-A fast-attack state M513 to the team-A-attacking (opponent's half) state M512 is also possible.

The defending state M520 includes a team-A-defending (opponent's half) state M521, a team-A-defending (own half) state M522, and a team-B fast-attack state M523. Transitions are possible between the team-A-defending (opponent's half) state M521 and the team-A-defending (own half) state M522, and between the team-A-defending (opponent's half) state M521 and the team-B fast-attack state M523; a transition from the team-B fast-attack state M523 to the team-A-defending (own half) state M522 is also possible.

The offense/defense change state M530 and the offense/defense-uncertain state M540 can be reached from any of the team-A-attacking (own half) state M511, the team-A-attacking (opponent's half) state M512, the team-A fast-attack state M513, the team-A-defending (opponent's half) state M521, the team-A-defending (own half) state M522, and the team-B fast-attack state M523.

In the offense/defense change state M530 or the offense/defense-uncertain state M540, the offense/defense state acquisition unit 324 detects transitions to the fast-attack states M513 and M523. From the images captured by the shooting camera 141, for example, it analyzes the acceleration of the ball or of the players, changes in the ball's direction of travel or in the players' orientations, the number of players present in a given area, the players' directions of movement, the number of players moving in a common direction, and so on, and detects a fast-attack state M513 or M523 based on the results of that analysis. It also determines, from the direction in which the players or the ball are moving, whether the state is the team-A fast-attack state M513 or the team-B fast-attack state M523.
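As a rough illustration of the kind of analysis described above, a fast attack could be flagged from per-frame tracking data by combining a ball-acceleration cue with a same-direction-movement cue. Everything in the sketch below, including the function name, the input format, and the thresholds, is hypothetical; the patent leaves the concrete analysis to known image-analysis (or neural-network) techniques.

```python
# Hypothetical heuristic for detecting a fast-attack state from
# tracking data extracted from the camera images. Thresholds invented.
def detect_fast_attack(ball_speeds, player_dirs, speed_gain=8.0):
    """ball_speeds: recent ball speeds (m/s), oldest first.
    player_dirs: signed x-direction per tracked player
    (+1 toward the opponent's goal, -1 away from it)."""
    # Cue 1: the ball has accelerated sharply over the window.
    accelerating = ball_speeds[-1] - ball_speeds[0] > speed_gain
    # Cue 2: most tracked players surge toward the same goal.
    surging = sum(1 for d in player_dirs if d > 0) >= 0.7 * len(player_dirs)
    return accelerating and surging


assert detect_fast_attack([2.0, 4.0, 12.0], [1, 1, 1, -1, 1])
assert not detect_fast_attack([3.0, 3.5, 4.0], [1, -1, 1, -1])
```

The sign of the dominant movement direction could likewise distinguish the team-A fast-attack state M513 from the team-B fast-attack state M523, as the paragraph above describes.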

The offense/defense states are not limited to those described above; any state that triggers a change in the shooting conditions may be defined, for example a state entered upon detection of a long pass. The states may be defined as appropriate according to the sport or event being filmed.

Instead of, or in addition to, the acquisition units 321 to 324 described above, the event detection unit 320 may determine events based on input from the external system 700. For example, it may determine a disturbance such as a strong wind to be an event based on input from a weather information system, one example of the external system 700, or it may determine events based on input from a court facility system, another example of the external system 700, or on facility information entered by facility staff.

(A-1-4-4. Shooting condition determination unit 325)
The shooting condition determination unit 325 is a functional unit that determines the shooting conditions to be set for the shooting camera 141 of the drone 100. It determines the shooting conditions according to the event detected by the event detection unit 320. The shooting conditions include at least one of a target shooting position and a target shooting direction for the drone 100. The target shooting direction includes, for example, either or both of a pitch angle relative to the horizontal and a yaw angle relative to a predetermined reference direction, and may further include a target zoom amount for the shooting camera 141. In the following description, the shooting direction is taken to include the pitch angle, the yaw angle, and the zoom amount.

The shooting conditions may also include information on the range captured by the shooting camera 141. Note that the technical scope of the present invention only requires that at least one of the target shooting position and the target shooting direction be set automatically according to the event; it is not limited to embodiments in which both are set.

The target shooting direction is achieved by controlling at least one of the nose heading of the drone 100 and the shooting direction of the shooting camera 141. The nose heading of the drone 100 is controlled by the flight control unit 123 of the drone 100, while the shooting direction of the shooting camera 141 is controlled, for example, by the shooting control unit 143 driving the camera holding unit 142. Note that "control of the nose heading" and "control of the shooting direction" here encompass control not only in the horizontal direction (the so-called pan direction) but also in the vertical direction (the so-called tilt direction).

The shooting condition determination unit 325 determines the shooting conditions according to the type of event detected. In the normal play state M410, it permits manual piloting via the controller 200. When the event detection unit 320 detects an event, the shooting condition determination unit 325 refers to the event/shooting-condition table T1 (see FIG. 13) stored in the storage unit 380 and determines the shooting conditions corresponding to that event.

As shown in FIG. 13, the event/shooting-condition table T1 stores events detected as game states in association with the shooting conditions to be selected for those events. More specifically, the table T1 associates each event with a shooting range, a shooting position at which the drone 100 is stationed, a shooting direction for the shooting camera 141, and a zoom amount for the shooting camera 141. For example, in the penalty kick state M442 or the penalty shootout state M443, the shooting range is the penalty area F130a or F130b containing the ball, the shooting position is shooting position L206 or L211, and the shooting direction is toward the goal F120a or F120b at the ball's end. The zoom amount is predefined in steps, for example IN, Middle, and OUT in decreasing order of magnification, and is IN in the penalty kick state M442 and the penalty shootout state M443.

In the goal kick state M422 and in the defending side's free kick state M441, the shooting position is one of the positions L101, L102, and L104 of the outer-edge flight mode M102, which reduces the risk of the ball colliding with the drone 100. In the attacking side's free kick state M441, the penalty kick state M442, and the penalty shootout state M443, shooting from one of the positions L206 to L211 inside the court F100 captures the goal scene from a more watchable vantage point. In the foul state M431, shooting from position L101, L102, or L104 along the outer edge allows the entire court F100 to be captured, with the shooting direction set so that the area around the ball or the referee falls within the shooting range; this reliably captures the scene immediately after the foul, for example the referee's ruling and the players' movements. In the corner kick state M423, shooting from position L207, L209, L212, or L215 captures the area in front of the goal at close range.
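The table-driven selection described above can be sketched as a dictionary lookup; the entries, court coordinates, and identifiers below are illustrative stand-ins for the actual table T1 (the patent shows the table in FIG. 13 but does not define a data format). When an event lists several candidate positions, the sketch picks the one closest to the ball.

```python
from math import dist

# Illustrative (x, y) court coordinates for a few shooting positions.
SHOOTING_POSITIONS = {"L101": (0, -5), "L102": (50, -5), "L104": (100, -5),
                      "L206": (85, 20), "L211": (85, 48)}

# Sketch of table T1: event -> (candidate positions, direction, zoom).
EVENT_TABLE = {
    "goal_kick":    (["L101", "L102", "L104"], "ball_area", "OUT"),
    "foul":         (["L101", "L102", "L104"], "ball_area", "Middle"),
    "penalty_kick": (["L206", "L211"],         "goal",      "IN"),
}


def shooting_condition(event: str, ball_xy: tuple):
    """Look up the event's entry; among several candidate positions,
    choose the one nearest to the ball."""
    candidates, direction, zoom = EVENT_TABLE[event]
    pos = min(candidates, key=lambda p: dist(SHOOTING_POSITIONS[p], ball_xy))
    return pos, direction, zoom


assert shooting_condition("penalty_kick", (90, 55)) == ("L211", "goal", "IN")
```

Keeping the event-to-condition mapping in data rather than code also makes it straightforward to store different condition sets for different drones, as the table T1 is described as allowing.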

 In sports, the ideal photography direction demanded by events occurring in a match, such as free kicks, fast breaks, and corner kicks, differs even at the same photography position. When shooting sports, where the situation changes rapidly, it is difficult to achieve shooting from the ideal direction quickly and accurately by manual piloting, and operational errors or delays may result in important scenes being missed. Achieving this by manual piloting would also require deploying multiple camera operators. On the other hand, because the photography direction required for each event is determined to a certain extent, the configuration described above, which automatically controls the drone 100 to a photography position and direction matching the event, makes it possible to perform appropriate shooting according to the match situation. It also reduces the number of camera operators, contributing to labor savings.

 In the case of an event for which multiple photography positions are stored, the photography condition determination unit 325 may select, from among the stored photography positions, the one closest to the ball.
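The nearest-position selection just described is a minimum over Euclidean distances. A minimal sketch, assuming 2D court coordinates and hypothetical position coordinates:

```python
import math

# Hypothetical 2D coordinates (meters) for candidate photography positions.
POSITIONS = {"L206": (10.0, 5.0), "L207": (40.0, 5.0), "L211": (10.0, 60.0)}

def closest_position(ball_xy, candidates):
    """Pick the stored photography position nearest to the ball."""
    return min(candidates, key=lambda name: math.dist(POSITIONS[name], ball_xy))
```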

 The event-photography condition table T1 may store mutually different photography conditions to be set for multiple drones 100. For example, in the foul state M431, a first drone 100 may photograph the ball location with a large zoom amount, while a second drone 100 photographs it with a small zoom amount. The first drone 100 and the second drone 100 may also photograph while yaw-rotating in opposite directions. Furthermore, the first drone 100 may photograph from the side while the second drone 100 photographs from directly above. Manually piloting multiple drones 100 into appropriate photography conditions is even more difficult, whereas the automatically controlled configuration described above enables quick, multifaceted shooting by multiple drones 100.

 Shooting with a zoom amount of "OUT" is so-called overhead shooting. In the event-photography condition table T1, events designated for overhead shooting may be associated with two opposite photography directions. That is, for example, a single event is associated with both a direction from team A's side of the court toward team B's side and a direction from team B's side toward team A's side. The offense/defense state acquisition unit 324 detects which team is in possession of the ball, and the photography condition determination unit 325 determines the photography direction according to the team in possession. Specifically, the photography condition determination unit 325 sets the photography direction to the attacking direction of the team in possession. This configuration makes it possible to continuously capture the advancing ball from overhead.
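The possession-dependent choice between the two associated directions reduces to a single conditional. A sketch under hypothetical team and direction labels:

```python
def overhead_direction(possessing_team: str) -> str:
    """For overhead (zoom OUT) events, select the photography direction that
    matches the attacking direction of the team in possession of the ball.
    Labels "A"/"B" and the direction strings are illustrative only."""
    return "A_to_B" if possessing_team == "A" else "B_to_A"
```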

 The event-photography condition table T1 may also be configured to accept correction input from the user, with the changes stored in the storage unit 380. With this configuration, the user's shooting expertise is reflected in the event-photography condition table T1, enabling more optimal automatic shooting. Multiple event-photography condition tables T1 may also be stored in the storage unit 380, with the user able to select which table to apply. This is because, for example, the content to be shot differs between shooting intended for sports spectators and shooting intended for coaching players. With such a configuration, automatic shooting suited to the purpose can be realized easily. Note that the event-photography condition table T1 shown in FIG. 13 is an example of a table set for coaching purposes, but it is merely one example, and the specific photography conditions stored in the table are arbitrary.

 In the embodiment described above, both the photography positions L101 to L215 and the photography direction are set automatically according to the detected event; alternatively, only the photography positions L101 to L215 may be set automatically. In that case, the photography direction is determined by input via the controller 200. Alternatively, when the user selects a photography position L101 to L215 via the controller 200, the photography condition determination unit 325 may determine the photography direction according to the detected event and the selected photography position L101 to L215.

 The photography condition determination unit 325 may determine the photography conditions based on input to the controller 200 when the event detection unit 320 has not detected an event, and based on the event when the event detection unit 320 has detected one. With this configuration, even when there is user input, shooting is reliably performed under the designated conditions whenever an event is detected, so appropriate photography conditions are maintained for important scenes. When no event is detected, the user's freedom of shooting is preserved, reconciling the convenience of automatic control with the flexibility of manual piloting.

 Conversely, when an operation is received from the controller 200 or the external system 700, the photography condition determination unit 325 may determine the photography conditions based on that operation even while the event detection unit 320 is detecting an event. That is, photography conditions input from the controller 200 are applied in priority over those associated with the event. With this configuration, the user only needs to focus on operations that automatic control cannot cover, so the operational burden is lighter than with fully manual piloting, and operational errors are reduced. In other words, this configuration achieves both convenience and freedom: it maintains the convenience of automatic shooting while securing the freedom to shoot according to the user's requests as appropriate.
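The priority scheme in this variant (manual input first, then the detected event, then a default such as ball tracking) can be sketched as a short fallback chain. The condition dictionaries and the `follow_ball` default are hypothetical; the disclosure only specifies the ordering.

```python
def decide_conditions(manual_input, event_conditions):
    """Resolve photography conditions by priority.

    manual_input:     conditions entered via the controller, or None
    event_conditions: conditions associated with a detected event, or None
    """
    # Conditions input from the controller override event-associated ones.
    if manual_input is not None:
        return manual_input
    # With no user input, a detected event drives the conditions.
    if event_conditions is not None:
        return event_conditions
    # No input and no event: fall back to automatic ball tracking.
    return {"mode": "follow_ball"}
```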

 The photography condition determination unit 325 may determine the photography conditions based on the event when the event detection unit 320 detects one, and automatically track and photograph the ball when it does not.

 When the system 1 includes multiple drones 100, the photography condition determination unit 325 determines different photography conditions for each of them. The multiple drones 100 may shoot from mutually different photography positions and in mutually different directions. For example, one may photograph the player taking a shot while another photographs the opposing team's goalkeeper.

 The photography condition determination unit 325 may also set, for multiple drones 100 flying simultaneously, photography conditions under which the same photography range is shot from mutually different target photography positions. With such a configuration, important scenes can be captured from multiple angles. It may also set, for multiple drones 100 flying simultaneously, photography conditions under which an area containing the same photography range is shot with mutually different zoom amounts. This makes it possible to shoot a range of particular interest in the stadium F under multiple photography conditions, capturing important scenes more reliably.

 The photography condition determination unit 325 may analyze the captured images, predict the range that should be shot based on the analysis results, and determine the photography conditions accordingly. For example, it may predict the ball's travel distance after a predetermined time by analyzing the ball's direction of movement and its velocity or acceleration from the captured images, and determine photography conditions such that the ball's position after that time falls within the photography range. The ball's velocity may be taken as the velocity at the start of the prediction, that is, the initial velocity.
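Predicting the ball's position from its initial velocity and acceleration is the standard constant-acceleration kinematics p(t) = p₀ + v₀t + ½at². A minimal per-axis sketch (the disclosure does not specify the prediction model; constant acceleration is an assumption):

```python
def predict_ball_position(p0, v0, a, t):
    """Predict the ball position after t seconds, per axis, assuming constant
    acceleration and using the velocity at the start of the prediction (the
    initial velocity) as described in the text."""
    return tuple(p + v * t + 0.5 * ac * t * t for p, v, ac in zip(p0, v0, a))
```

The predicted position would then be used to choose conditions under which it falls within the photography range.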

 The photography condition determination unit 325 may determine the photography conditions according to a ball trajectory predicted by the event detection unit 320 as an event detection result. For example, triggered by detection of a fast break state M513 or M523, a long pass, or the like, it may predict the ball's trajectory and determine the photography conditions according to the predicted trajectory. It may change the photography direction toward the ball's direction of travel while zooming out the photography magnification; zooming out allows the advancing ball to be captured more reliably.
 Note that ball trajectory prediction is not limited to a mode in which a predicted trajectory is detected as an event by the event detection unit 320. For example, the trajectory prediction may instead be performed by the photography condition determination unit 325 or another functional unit, separately from the event detection unit 320, with the photography conditions determined based solely on the result of that prediction.

 In particular, when changing the photography direction in response to a ball trajectory prediction, it is desirable to control the photography direction of the photography camera 141 via the photography control unit 143 rather than by controlling the nose direction, because changes based on trajectory prediction must be made with high responsiveness. That is, for example, the photography direction may be changed by the photography control unit 143 controlling the photography camera 141 when the change is based on a trajectory prediction, and by controlling the nose direction of the drone 100 when it is not.

 When a change of offense and defense or a fast break is detected, the photography condition determination unit 325 may change the photography direction toward the ball's direction of travel and then zoom out the photography magnification.

(A-1-4-5. Flight mode switching unit 330)
 The flight mode switching unit 330 is a functional unit that switches the flight mode according to the detection results of the event detection unit 320.
 The flight mode switching unit 330 mainly includes a mode switching input acquisition unit 331, a flight permitted area switching unit 332, a geofence switching unit 333, and a flight path generation unit 334.

 The mode switching input acquisition unit 331 is a functional unit that acquires input information regarding flight mode switching. The flight modes are, for example, the outer edge flight mode M102, the fixed position flight mode M103 or M107, and the in-court flight mode M105 (see FIG. 8 for all). Whether fixed-position flight uses the off-court fixed position flight mode M103 or the in-court fixed position flight mode M107 is determined according to the position of the drone 100 at the time the flight mode selection input is received: the off-court fixed position flight mode M103 applies when the drone 100 is in the off-court area F200, and the in-court fixed position flight mode M107 applies when the drone 100 is inside the court F100.
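The position-dependent choice between the two fixed-position modes is a single conditional on the drone's location at the moment the selection input arrives. A sketch with illustrative mode labels:

```python
def fixed_position_mode(drone_in_court: bool) -> str:
    """Resolve a fixed-position flight request to the concrete mode based on
    where the drone 100 is when the selection input is received."""
    # In-court -> M107; off-court area F200 -> M103.
    return "M107_in_court" if drone_in_court else "M103_off_court"
```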

 The flight permitted area switching unit 332 is a functional unit that switches the flight permitted area in response to flight mode switching.

 The geofence switching unit 333 is a functional unit that switches geofences in response to flight mode switching. For example, the geofence switching unit 333 sets the geofence G100 when the flight mode is the outer edge flight mode M102, and the geofence G200 when the flight mode is the in-court flight mode M105. In the intermediate modes that mediate the transition between the outer edge flight mode M102 and the in-court flight mode M105, namely the court entry mode M104 and the court exit mode M106, it sets a third geofence distinct from the geofences G100 and G200 of those two modes.
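The mode-to-geofence assignment just described is a fixed mapping. A sketch using illustrative keys (the label for the third geofence is hypothetical, since the text does not name it):

```python
# Mapping from flight mode to active geofence, per the text above.
GEOFENCE_BY_MODE = {
    "outer_edge_M102": "G100",
    "in_court_M105": "G200",
    # The intermediate entry/exit modes share a third geofence distinct
    # from G100 and G200 (name "G3" is a placeholder).
    "court_entry_M104": "G3",
    "court_exit_M106": "G3",
}

def geofence_for(mode: str) -> str:
    """Return the geofence to enforce for the given flight mode."""
    return GEOFENCE_BY_MODE[mode]
```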

 The flight path generation unit 334 is a functional unit that generates the flight path of the drone 100 for movements involving flight mode switching. For example, the flight path generation unit 334 determines the photography position at which a transition to the court entry mode M104 or the court exit mode M106 occurs. It also determines the photography position at which the court entry mode M104 transitions to the in-court flight mode M105, or at which the court exit mode M106 transitions to the outer edge flight mode M102. Furthermore, it generates the specific flight path in the court entry mode M104 or the court exit mode M106; this flight path is, in principle, generated within the area of the third geofence.

(A-1-4-6. Outer edge flight control unit 340)
 The outer edge flight control unit 340 is a functional unit that controls the flight of the drone 100 in the outer edge flight mode. The outer edge flight control unit 340 includes a photography condition command unit 341 and a flight path generation unit 342.

 The photography condition command unit 341 is a functional unit that transmits commands regarding photography conditions to the drone 100. The photography conditions are, for example, a target photography position or a photography direction. The photography condition command unit 341 acquires from the photography condition determination unit 325 a target photography position located within the flyable area of the outer edge flight mode M102. It may also acquire a target photography position input by the user, received for example via the target position receiving unit 226 of the controller 200.

 The flight path generation unit 342 is a functional unit that generates the flight path along which the drone 100 moves in the outer edge flight mode M102; in other words, it generates flight paths within the flyable area of the outer edge flight mode M102. When the drone 100 is within the flyable area of the outer edge flight mode M102 and the acquired target position also belongs to that area, the flight path generation unit 342 generates a flight path from the current position to the target position. When the drone 100 makes a movement spanning the outer edge flight mode M102 and the in-court flight mode M105, the flight path generation unit 342 may generate the portion of the flight path within the flight range of the outer edge flight mode M102.

 When the ball crosses the touchline F111b along which the drone 100 is flying and exits into the off-court area F200, the flight path generation unit 342 moves the drone 100 into the off-court area F200, outside the touchline F111b. The flight path generation unit 342 may also have the drone shoot directly below itself, or from that point toward the court F100. With this configuration, the ball can still be tracked and photographed even when it rolls out into the off-court area F200. The geofence G100 of the outer edge flight mode M102 is preferably preset to extend beyond the touchline F111b to the outside of the court F100; this ensures that the drone 100 remains within the geofence G100 even when, as described above, it follows the ball and flies slightly outside the touchline F111b.

 When the flight path generation unit 342 detects an obstacle on the flight path or in the vicinity of the drone 100, it regenerates a flight path that detours around the obstacle toward the interior of the court F100. Alternatively, upon detecting an obstacle, the flight path generation unit 342 may have the drone hover for a predetermined time and then resume movement along the originally generated flight path. This is because, in the stadium F, the safety of the drone 100 is not ensured in the off-court area F200, whereas it is highly likely to be ensured inside the court F100. Obstacles may be detected, for example, by the obstacle detection unit 130 of the drone 100, or from information provided by the external system 700 or the like.
 Upon detecting an obstacle, the outer edge flight control unit 340 may also have the flight path generation unit 342 hover the drone for a predetermined time and then switch to manual piloting. Furthermore, upon detecting an obstacle, the outer edge flight control unit 340 may have the flight path generation unit 342 hover the drone for a predetermined time and then display, via the display control unit 210, a prompt urging the user to re-enter the target position.

(A-1-4-7. In-court flight control unit 350)
 The in-court flight control unit 350 is a functional unit that controls the flight of the drone 100 in the in-court flight mode M105. The in-court flight control unit 350 includes a photography condition command unit 351 and a flight path generation unit 352. The photography condition command unit 351 is a functional unit that transmits to the drone 100 commands regarding photography conditions within the flyable area of the in-court flight mode M105. The photography conditions are, for example, a target photography position or a photography direction.

 The flight path generation unit 352 generates the flight path along which the drone 100 moves in the in-court flight mode M105; that is, it generates flight paths within the flyable area of the in-court flight mode M105. More specifically, in the in-court flight mode M105, the flight path generation unit 352 generates a flight path by connecting multiple preset photography positions. Like the flight path generation unit 342 of the outer edge flight control unit 340, the flight path generation unit 352 may, when the current position and the target photography position belong to the flyable areas of different flight modes, generate the portion of the flight path within the flight range of the in-court flight mode M105.

 The flight path generation unit 352 also changes which photography positions it connects according to the event detection status. That is, triggered by the detection of an event, the flight path generation unit 352 changes the connection relationships among the photography positions on the flight path and generates a flight path to the target photography position.

 When the flight path generation unit 352 detects an obstacle on the flight path or in the vicinity of the drone 100, it regenerates a flight path that detours around the obstacle. Obstacles are detected, for example, by the obstacle detection unit 130. The flight path generation unit 352 may regenerate the flight path by changing the connections among the preset photography positions, or may change to a flight path at a higher altitude while retaining the same path in plan view. Upon detecting an obstacle, the flight path generation unit 352 may also have the drone hover for a predetermined time and then resume movement along the originally generated flight path. Furthermore, upon detecting an obstacle, the in-court flight control unit 350 may have the flight path generation unit 352 hover the drone for a predetermined time and then switch to manual piloting, or hover for a predetermined time and then display, via the display control unit 210, a prompt urging the user to re-enter the target position. Obstacles are, for example, birds, fixed equipment, or players; they also include the ball.

 In the present embodiment, flight control in the outer edge flight mode M102 is performed by the outer edge flight control unit 340, and flight control in the in-court flight mode M105 is performed by the in-court flight control unit 350. The photography positions defined for each of the outer edge flight mode M102 and the in-court flight mode M105 are presented as options, and a flight path to the selected target position is generated. However, the technical scope of the present invention is not limited to this; the controller 200 may control the photography position and orientation of the drone 100 so that it can fly freely to any position within the areas of the geofences G100 and G200 set for the respective flight modes.

 The functional configuration of the flight path generation units 334, 342, and 352 is an example; for instance, a single, undivided flight path generation unit may generate the flight paths.

 FIG. 14 is a schematic diagram of the routes along which the drone 100 can transition, defined over the photography positions L101 to L105 and L206 to L215 and the evacuation point H200. The ground point L101g below the photography position L101 serves as the takeoff and landing point of the drone 100. The position transitions of the drone 100 begin with a step of taking off from the point L101g and reaching the photography position L101. The drone 100 also descends at the photography position L101 and lands at the point L101g to end shooting.

 At each of the photography positions L101 to L105 and L206 to L215, the drone 100 may be able to transition only to adjacent photography positions. For example, the point to which a drone 100 at the photography position L105 can transition while remaining in the outer edge flight mode is the photography position L104, while the points to which it can transition after switching to the in-court flight mode M105 are the photography positions L206 and L207. The flight path generation unit 334, 342, or 352 (see FIG. 6) generates the flight path of the drone 100 by referring to these available transitions.

 When the target position receiving unit 226 receives the selection of a photography position, the drone 100 transitions to the selected photography position by way of transitionable photography positions. For example, when the drone 100 is at the photography position L105 and the photography position L215 is selected, the flight path generation unit 334, 342, or 352 (see FIG. 6) generates a flight path that passes through the photography positions L105, L207, L208, L213, L212, and L215 in that order, and the drone 100 flies along this path.
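Route generation over transition-connected photography positions amounts to a shortest-path search on an adjacency graph. A sketch using breadth-first search; the adjacency below is hypothetical except for the links needed to reproduce the example route L105 → L207 → L208 → L213 → L212 → L215 given in the text.

```python
from collections import deque

# Partial, illustrative adjacency between photography positions.
ADJACENCY = {
    "L105": ["L207"],
    "L207": ["L105", "L208"],
    "L208": ["L207", "L213"],
    "L213": ["L208", "L212"],
    "L212": ["L213", "L215"],
    "L215": ["L212"],
}

def plan_route(start, goal):
    """Breadth-first search for a route through adjacent photography
    positions; returns the list of positions, or None if unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in ADJACENCY.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```

In practice the adjacency would also encode which transitions require a flight mode switch, so that mode changes can be inserted at the correct positions along the route.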

 Instead of a flight path generated by connecting the photography positions L101 to L215, the drone 100 may fly along a flight path that connects the current point to the target photography position in a straight line. The drone 100 may also transition to an adjacent photography position when the transition involves a flight mode switch, while being able to transition directly to a non-adjacent photography position when it does not. That is, for example, when the drone moves from the photography position L105 to the photography position L215, it may transition from the photography position L105 to the photography position L207 with a mode switch, and then move in a straight line from the photography position L207 to the photography position L215 within the area of the geofence G200.

 外縁飛行制御部340およびコート内飛行制御部350は、各飛行可能エリアにおいて飛行モードに応じてドローン100を自律飛行させる。例えば、外縁飛行制御部340およびコート内飛行制御部350は、各飛行可能エリアの内部においてドリー撮影、すなわちドローン100にボール又は指定の選手等の特定対象物の自動追従撮影を行わせてもよい。外縁飛行制御部340およびコート内飛行制御部350は、ドローン100の飛行高さを自動制御してもよい。また、自律飛行の態様は、飛行モードに応じて異なっていてよい。例えば、外縁飛行制御部340での制御時にはドリー撮影を行う一方、コート内飛行制御部350での制御時には撮影位置を固定した撮影方向のみの自動追従撮影、又は位置および撮影方向の自動追従撮影を行ってもよい。また、外縁飛行制御部340およびコート内飛行制御部350は、各飛行可能エリアにおいてユーザにより指定される目標位置にドローン100を移動させるための飛行経路をコート内(競技場F内)に生成してもよい。 The outer edge flight control unit 340 and the inner court flight control unit 350 fly the drone 100 autonomously in each flight area according to the flight mode. For example, the outer edge flight control unit 340 and the inner court flight control unit 350 may perform dolly shooting within each flight area, that is, the drone 100 may automatically follow and shoot a specific object such as a ball or a specified player. The outer edge flight control unit 340 and the inner court flight control unit 350 may automatically control the flight height of the drone 100. The autonomous flight mode may differ depending on the flight mode. For example, dolly shooting may be performed when controlled by the outer edge flight control unit 340, while automatic follow-up shooting of only the shooting direction with a fixed shooting position, or automatic follow-up shooting of the position and shooting direction may be performed when controlled by the inner court flight control unit 350. The outer edge flight control unit 340 and the inner court flight control unit 350 may also generate a flight path within the court (within the stadium F) for moving the drone 100 to a target position specified by the user in each flight area.

(A-1-4-8.定位置飛行制御部360)
 定位置飛行制御部360は、コート外定位置飛行モードM103およびコート内定位置飛行モードM107におけるドローン100の飛行制御を行う機能部である。定位置飛行制御部360は、定位置飛行モードでは、所定位置においてホバリングを行うとともに、機首方向又は撮影用カメラ141の方向を制御し、特定の選手又はボールに追従させて自動撮影を行う。なお、上述した「方向の制御」とは、左右方向(いわゆる「パン方向」)のみならず、上下方向(いわゆる「ティルト方向」)の制御も含む概念である。
(A-1-4-8. Fixed position flight control unit 360)
The fixed position flight control unit 360 is a functional unit that controls the flight of the drone 100 in the off-court fixed position flight mode M103 and the on-court fixed position flight mode M107. In the fixed position flight mode, the fixed position flight control unit 360 hovers at a predetermined position and controls the nose direction or the direction of the shooting camera 141 to follow a specific player or the ball and perform automatic shooting. Note that the above-mentioned "control of direction" is a concept that includes control not only in the left-right direction (so-called "pan direction") but also in the up-down direction (so-called "tilt direction").

 定位置飛行制御部360は、撮影条件指令部361を備える。撮影条件指令部361は、定位置飛行モードM103又はM107において目標位置および目標撮影方向の指令を送信する機能部である。撮影方向は、撮影条件決定部325により決定される情報であってもよいし、操縦器200を介してユーザにより入力される情報であってもよい。 The fixed position flight control unit 360 includes an image capture condition command unit 361. The image capture condition command unit 361 is a functional unit that transmits commands for the target position and the target image capture direction in the fixed position flight mode M103 or M107. The image capture direction may be information determined by the image capture condition determination unit 325, or may be information input by the user via the controller 200.

(A-1-4-9.通信部370)
 通信部370は、図示しないモデム等を有し、通信ネットワーク400を介してドローン100、操縦器200等との通信が可能である。通信部370は、例えばドローン100及びその周囲の状態を監視し、操縦器200に通知してもよい。
(A-1-4-9. Communication unit 370)
The communication unit 370 has a modem or the like (not shown) and is capable of communicating with the drone 100, the controller 200, and the like via the communication network 400. The communication unit 370 may, for example, monitor the state of the drone 100 and its surroundings and notify the controller 200.

(A-1-4-10.記憶部380)
 記憶部380は、ドローン100の飛行制御に係る情報を記憶する機能部であり、例えばデータベースである。記憶部380は、例えば競技場Fにおける複数の撮影位置L101~L105、L210~L215の座標を記憶する。この座標は、平面上の2次元座標の他、高さ方向の情報を含む3次元座標であってもよい。また、記憶部380は、図13に示すイベント-撮影条件テーブルT1を記憶している。前述の通り、イベント-撮影条件テーブルT1は書換可能に記録されている。また、イベント-撮影条件テーブルT1は複数記憶されていてもよい。
(A-1-4-10. Storage unit 380)
The storage unit 380 is a functional unit that stores information related to flight control of the drone 100, and is, for example, a database. The storage unit 380 stores, for example, the coordinates of multiple shooting positions L101 to L105, L210 to L215 in the stadium F. These coordinates may be two-dimensional coordinates on a plane or three-dimensional coordinates including information in the height direction. The storage unit 380 also stores an event-shooting condition table T1 shown in FIG. 13. As described above, the event-shooting condition table T1 is recorded in a rewritable manner. Furthermore, multiple event-shooting condition tables T1 may be stored.
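A minimal sketch of how a rewritable event-shooting condition table T1 might be structured is shown below. The event names, field names, and values are illustrative assumptions, not the actual contents of table T1.

```python
# Hypothetical event-shooting condition table (stand-in for table T1).
# Keys are detected event types; values are target shooting conditions.
event_condition_table = {
    "free_kick":   {"position": "L210", "direction": "goal", "zoom": 2.0},
    "fast_attack": {"position": "L101", "direction": "ball", "zoom": 1.0},
}

def decide_shooting_condition(event, table=event_condition_table):
    """Look up the target shooting conditions for a detected event."""
    return table.get(event)  # None if the event has no entry

# The table is recorded in a rewritable manner, so entries can be
# updated at runtime (e.g. tuning the zoom for free kicks).
event_condition_table["free_kick"]["zoom"] = 1.5
```

Because the storage unit may hold multiple tables, `decide_shooting_condition` accepts the table as a parameter so a different table could be selected per match or per drone.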

●フローチャート
 図15は、本実施形態における空中撮影制御の全体的な流れを示すフローチャートである。図16は、図15の飛行制限処理S1002のサブルーチンである。図17は、図15の撮影条件切替処理S1010のサブルーチンである。
● Flowcharts Fig. 15 is a flowchart showing the overall flow of aerial photography control in this embodiment. Fig. 16 is a subroutine of the flight restriction process S1002 in Fig. 15. Fig. 17 is a subroutine of the photography condition switching process S1010 in Fig. 15.

 図15に示すフローチャートに記載する制御は、定期的にループして実行される。図15に示すように、ドローン100の飛行中に、ジオフェンスG100、G200の近傍に近づいたことを検出した場合(ステップS1001でYES)、ステップS1002の飛行制限処理に移行する。飛行制限処理S1002のサブルーチンは、図16で説明する。 The control described in the flowchart shown in FIG. 15 is executed in a regular loop. As shown in FIG. 15, if it is detected that the drone 100 is approaching the vicinity of geofences G100, G200 while flying (YES in step S1001), the process proceeds to flight restriction processing in step S1002. The subroutine of the flight restriction processing S1002 is explained in FIG. 16.

 ステップS1001でジオフェンスG100、G200の近傍に近づいたことが検出されない場合(ステップS1001でNO)、進路又はドローン100の機体近傍に障害物の有無を検出する(ステップS1003)。ステップS1003で障害物が検出された場合(ステップS1003でYES)、ドローン100をホバリングさせる、又は迂回経路を生成し、ドローン100の飛行経路を当該迂回経路に変更する(ステップS1004)。 If it is not detected in step S1001 that the drone is approaching the vicinity of geofences G100, G200 (NO in step S1001), the presence or absence of an obstacle in the path or near the drone 100 is detected (step S1003). If an obstacle is detected in step S1003 (YES in step S1003), the drone 100 is caused to hover or a detour route is generated, and the flight route of the drone 100 is changed to the detour route (step S1004).

 ステップS1003で障害物が検出されない場合、(ステップS1003でNO)、機体状態のアクション判定の有無を検出する(ステップS1005)。ステップS1005において機体状態のアクション判定が検出された場合(ステップS1005でYES)、ステップS1006に移行し、イベント種別の判定処理を実行する(ステップS1006)。 If no obstacle is detected in step S1003 (NO in step S1003), the presence or absence of an action determination of the aircraft state is detected (step S1005). If an action determination of the aircraft state is detected in step S1005 (YES in step S1005), the process proceeds to step S1006, where an event type determination process is executed (step S1006).

 ステップS1005でアクション判定が検出されない場合(ステップS1005でNO)、ユーザによる操縦器200からの入力の有無を判定する(ステップS1007)。ステップS1007において操縦器200からの入力が検出された場合(ステップS1007でYES)、入力に基づく指令を実行する(ステップS1008)。 If no action determination is detected in step S1005 (NO in step S1005), it is determined whether or not there is an input from the controller 200 by the user (step S1007). If an input from the controller 200 is detected in step S1007 (YES in step S1007), a command based on the input is executed (step S1008).

 ステップS1007において操縦器200からの入力が検出されない場合(ステップS1007でNO)、イベントの有無を判定する(ステップS1009)。イベントが検出された場合(ステップS1009でYES)、ステップS1010の撮影条件切替処理に移行する(ステップS1010)。ステップS1009においてイベントが検出されない場合(ステップS1009でNO)、ステップS1001に戻りステップS1001~S1009を繰り返す。 If no input from the controller 200 is detected in step S1007 (NO in step S1007), the presence or absence of an event is determined (step S1009). If an event is detected (YES in step S1009), the process proceeds to the shooting condition switching process in step S1010 (step S1010). If no event is detected in step S1009 (NO in step S1009), the process returns to step S1001 and steps S1001 to S1009 are repeated.

 図15に示す通り、空中撮影制御の全体的な処理は、ジオフェンス制限、障害物検知、機体状態に基づく制御、ユーザの入力に基づく制御、試合状態又は攻守状態といった試合内のイベントに基づく制御、の順に行われる。すなわち、試合内のイベントに基づく制御より先に、各制御が実行される。この順序は、安全な制御処理を行う優先度の高い順となっている。このような構成によれば、ドローン100の飛行に関する安全性をより確実に担保することができる。 As shown in FIG. 15, the overall processing of aerial photography control is performed in the following order: geofence restriction, obstacle detection, control based on aircraft status, control based on user input, and control based on in-game events such as the game status or offensive and defensive status. In other words, each control is executed before control based on in-game events. This order is based on the priority of performing safe control processing. With this configuration, the safety of flying the drone 100 can be more reliably guaranteed.

 図16に示すように、まず、ドローン100の飛行制御部123により、ドローン100がジオフェンスの外へ進出しないよう制限する動作指令を行う(ステップS1101)。ステップS1101では、手動でドローン100を操縦する際にも、ジオフェンスの外側にドローン100が進出しないように、飛行目標位置の制限を設ける。 As shown in FIG. 16, first, the flight control unit 123 of the drone 100 issues an operation command to restrict the drone 100 from advancing outside the geofence (step S1101). In step S1101, a restriction is set on the flight target position so that the drone 100 does not advance outside the geofence even when the drone 100 is manually operated.

 次いで、ジオフェンスの外にドローン100が進出しない場合には(ステップS1102でNO)、処理を終了する。 Next, if the drone 100 does not advance outside the geofence (NO in step S1102), the process ends.

 ドローン100がジオフェンスの外に進出した場合(ステップS1102でYES)、ステップS1103に進む。この状況の原因としては、例えば風が強すぎて機体が流される場合や、ドローン100の故障により意図する方向へ飛行しない場合等が考えられる。ステップS1103では、飛行制御部123により、ジオフェンス内に戻る動作指令を行う。より具体的には、ステップS1103では、ジオフェンス内に戻るような飛行目標位置の指令、すなわちジオフェンスの内部の所定地点を飛行目標位置とする動作指令をドローン100に与える。 If the drone 100 advances outside the geofence (YES in step S1102), proceed to step S1103. Possible causes of this situation include, for example, the wind being too strong and causing the aircraft to be swept away, or a malfunction in the drone 100 preventing it from flying in the intended direction. In step S1103, the flight control unit 123 issues an operation command to return to the geofence. More specifically, in step S1103, a flight target position command to return to the geofence, that is, an operation command to set the flight target position to a specified point inside the geofence, is given to the drone 100.

 次いで、ドローン100の測定部110により測定される情報、例えば位置、方位、高度又は速度の情報を参照し、ドローン100がジオフェンスの内側へ接近しているか判定する(ステップS1104)。この場合、ステップS1104はステップS1103の所定時間後に実行される。なお、ステップS1104においては、ステップS1103の時点よりもジオフェンスの内側に接近していれば足り、ジオフェンスの内側に位置するか否かを判定しなくてよい。 Next, by referring to information measured by the measurement unit 110 of the drone 100, such as information on position, direction, altitude, or speed, it is determined whether the drone 100 is approaching the inside of the geofence (step S1104). In this case, step S1104 is executed a predetermined time after step S1103. Note that in step S1104, it is sufficient if the drone 100 is closer to the inside of the geofence than at the time of step S1103, and it is not necessary to determine whether the drone 100 is located inside the geofence.

 ステップS1104において、ドローン100がジオフェンス内へ接近していないと判定される場合(ステップS1104でNO)、上記動作指令の効果がないと判断して、飛行制御部123により、ドローン100を強制着陸させる(ステップS1105)。 If it is determined in step S1104 that the drone 100 is not approaching the geofence (NO in step S1104), it is determined that the above operation command is ineffective, and the flight control unit 123 forces the drone 100 to land (step S1105).

 ステップS1104においてドローン100がジオフェンス内へ接近している場合、(ステップS1104でYES)、ステップS1106に進む。ステップS1106では、ドローン100の測定部110により測定される情報、例えば位置および高度を参照し、ドローン100がジオフェンス内に位置しているか判定する。ドローン100がジオフェンス内に位置している場合(S1106でYES)、処理を終了する。ドローン100がジオフェンス内に位置していない場合(S1106でNO)、ステップS1104に戻り、ジオフェンス内にドローン100が帰還するまで、ジオフェンス内へ戻る動作指令に基づく動作を継続する。 If the drone 100 is approaching the geofence in step S1104 (YES in step S1104), proceed to step S1106. In step S1106, information measured by the measurement unit 110 of the drone 100, such as the position and altitude, is referenced to determine whether the drone 100 is located within the geofence. If the drone 100 is located within the geofence (YES in S1106), end the process. If the drone 100 is not located within the geofence (NO in S1106), return to step S1104 and continue operation based on the operation command to return to the geofence until the drone 100 returns within the geofence.
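The recovery flow of FIG. 16 (steps S1103 to S1106) can be outlined as follows. The `SimDrone` class and its method names are illustrative stand-ins for the real interfaces of measurement unit 110 and flight control unit 123; they are assumptions, not the actual API.

```python
class SimDrone:
    """Minimal simulated drone for illustration (not the real interface)."""
    def __init__(self, steps_to_return):
        self.steps = steps_to_return  # >= 0: making progress toward the fence
        self.landed = False

    def command_return_inside(self):
        pass  # would command a target position inside the geofence (S1103)

    def approaching_inside(self):
        return self.steps >= 0  # still closing in on the geofence (S1104)

    def inside_geofence(self):
        self.steps -= 1
        return self.steps < 0   # back inside once progress completes (S1106)

    def force_landing(self):
        self.landed = True      # forced landing (S1105)

def geofence_recovery(drone, max_checks=10):
    """Sketch of the return-to-geofence flow of FIG. 16."""
    drone.command_return_inside()           # S1103: target inside the fence
    for _ in range(max_checks):
        if not drone.approaching_inside():  # S1104: command has no effect
            drone.force_landing()           # S1105
            return "landed"
        if drone.inside_geofence():         # S1106: back inside, done
            return "recovered"
    return "retrying"
```

A drone blown outside that keeps approaching the fence eventually returns "recovered"; one that makes no progress (e.g. due to strong wind or a fault) is force-landed.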

 図17は、撮影条件の切替における切替処理フローの1例を示す図である。まず、イベント検出部320により所定のイベントが検出されると、撮影条件決定部325により、イベント-撮影条件テーブルT1を参照して撮影位置および撮影方向の目標値を決定し、ドローン100に制御指令を送信する(ステップS1301)。次いで、ユーザの手動介入による操作指令を操縦器200から受信したかを判定し(ステップS1302)、操縦器200からの操作指令を受信していない場合には(ステップS1302でNO)、目標撮影位置および目標撮影方向に到達するまで自動制御による飛行を継続する(ステップS1303)。操縦器200からの操作指令を受信した場合には、手動介入による操作指令に基づく動作を実行する(ステップS1304)。 FIG. 17 is a diagram showing an example of a switching process flow for switching shooting conditions. First, when a specific event is detected by the event detection unit 320, the shooting condition determination unit 325 refers to the event-shooting condition table T1 to determine the target values for the shooting position and shooting direction, and transmits a control command to the drone 100 (step S1301). Next, it is determined whether an operation command due to manual intervention by the user has been received from the controller 200 (step S1302). If an operation command has not been received from the controller 200 (NO in step S1302), flight by automatic control continues until the target shooting position and target shooting direction are reached (step S1303). If an operation command has been received from the controller 200, an operation based on the operation command due to manual intervention is executed (step S1304).

 ステップS1303において目標撮影位置および目標撮影方向に到達すると、ステップS1305に進む。ステップS1305では、手動制御モードに移行し、手動操縦が許可されている旨が操作画面G3(図20参照)に表示される。 When the target shooting position and the target shooting direction are reached in step S1303, the process proceeds to step S1305. In step S1305, the mode transitions to manual control mode, and a message is displayed on the operation screen G3 (see FIG. 20) indicating that manual operation is permitted.

●表示部201の表示例
 図18および図19は、操縦器200の表示部201に表示される画面G1、G2の例である。
Display Examples of the Display Unit 201 FIGS. 18 and 19 are examples of screens G1 and G2 displayed on the display unit 201 of the controller 200.

 図18に示す画面G1には、競技場Fおよび撮影位置L101~L215を俯瞰的に模式的に示すフィールドマップG10、ドローン100の位置情報を示すアイコンG11、撮影用カメラ141が撮影している撮影範囲G12、ドローン100が属する飛行モードの表示欄G21、機体状態、機体行動状態、試合状態および攻守状態といったイベント検出部320により検出される状態を示す状態表示欄G22、ドローン100を着陸させる着陸ボタンG30、ドローン100により撮影されている画像が表示される映像欄G40等が表示されている。図18の例では、表示欄G21には、自動制御モードか手動制御モードかに大別されて表示されている。ドローンは撮影位置L101において外縁飛行モードで飛行している。  The screen G1 shown in FIG. 18 displays a field map G10 that shows a schematic bird's-eye view of the stadium F and the shooting positions L101 to L215, an icon G11 that shows the position information of the drone 100, the shooting range G12 captured by the shooting camera 141, a display field G21 of the flight mode to which the drone 100 belongs, a status display field G22 that shows the status detected by the event detection unit 320, such as the aircraft status, aircraft behavior status, match status, and offensive and defensive status, a landing button G30 for landing the drone 100, and a video field G40 in which images captured by the drone 100 are displayed. In the example of FIG. 18, the display field G21 indicates which of the two broad categories of control mode, automatic control mode or manual control mode, is active. The drone is flying in the outer edge flight mode at shooting position L101.

 なお、ドローン100の位置と撮影方向の制御は、手動であってもよいし、ボール又は特定の選手への自動追従制御が行われてもよい。自動追従制御が行われている場合には、追従対象となるボール又は特定の選手の情報を画面G1上に表示するものとしてもよい。 The position and shooting direction of the drone 100 may be controlled manually, or automatic tracking control of the ball or a specific player may be performed. When automatic tracking control is performed, information about the ball or specific player being tracked may be displayed on the screen G1.

 ドローン100を示すアイコンG11には、ドローン100の進行方向を示す矢印が表示されている。なお、ドローン100の機首方向は、ドローン100の進行方向とは限らず、任意の方向を向いていてよい。移動中においてドローン100の機首方向は一定でなくてもよく、例えばヨー回転により選手又はボールを撮影しながら移動してもよい。 The icon G11 representing the drone 100 displays an arrow indicating the direction of travel of the drone 100. Note that the direction of the nose of the drone 100 is not limited to the direction of travel of the drone 100, and may be pointing in any direction. The direction of the nose of the drone 100 does not have to be constant while moving, and for example, the drone 100 may move while photographing a player or the ball by rotating in a yaw motion.

 図19には、複数のドローン100が1個の競技場Fを撮影する場合における画面G2の表示例を示している。図19では、特に、フリーキック状態M441を検出した場合の表示例を示している。画面G2では、フィールドマップG10上に2個のドローンのアイコン11a、11bが表示されている。また、複数のドローン100がそれぞれ撮影する撮影範囲G13a、G13b、および撮影画像を示す映像欄G40a、G40bが、対応するドローン100のアイコン11a、11bと紐づけられて表示されている。 FIG. 19 shows an example of the display on screen G2 when multiple drones 100 are photographing one stadium F. FIG. 19 particularly shows an example of the display when a free kick state M441 is detected. On screen G2, icons 11a and 11b of two drones are displayed on a field map G10. In addition, the shooting ranges G13a and G13b photographed by each of the multiple drones 100, and video columns G40a and G40b showing the captured images are displayed in association with the icons 11a and 11b of the corresponding drones 100.

 図19の例では、アイコン11aに対応する第1のドローン100がゴール120a付近の局所撮影を行っている一方、アイコン11bに対応する第2のドローン100は、外縁の撮影位置L101から俯瞰撮影を行っている。また、第1のドローンと第2のドローンの撮影角度は、互いに異なっている。第1のドローンと第2のドローンは、互いに撮影できない位置を補完するような撮影条件となっていることが好ましい。このように、複数のドローン100が互いに異なる撮影条件で撮影する構成によれば、競技場Fを多面的に撮影することができる。 In the example of FIG. 19, the first drone 100 corresponding to icon 11a is taking localized shots near the goal 120a, while the second drone 100 corresponding to icon 11b is taking bird's-eye shots from a shooting position L101 on the outer edge. The shooting angles of the first drone and the second drone are different from each other. It is preferable that the shooting conditions of the first drone and the second drone are such that they complement each other's positions where they cannot shoot. In this way, with a configuration in which multiple drones 100 shoot under different shooting conditions, the stadium F can be photographed from multiple angles.

 図20は、イベント検出部320によりイベントが未検出である場合の、表示部201に表示される画面G3の例である。イベントが未検出であるためドローン100の自動制御は行われず、表示欄G21には「手動制御モード」である旨が表示されている。 FIG. 20 is an example of screen G3 displayed on the display unit 201 when an event has not been detected by the event detection unit 320. Because an event has not been detected, automatic control of the drone 100 is not performed, and the display field G21 indicates that the mode is "manual control mode."

 図21は、画面G3の状態においてイベント検出部320がイベントを検出した場合の画面G4の例である。この場合、イベントの検出に応じてドローン100が自動制御に切り替わり、表示欄G21には「自動制御モード」である旨が表示されている。また、状態表示欄G22には、攻守状態としてAチーム速攻状態M513が検出されていることが表示されている。 FIG. 21 is an example of screen G4 when the event detection unit 320 detects an event in the state of screen G3. In this case, the drone 100 switches to automatic control in response to the detection of the event, and the display field G21 indicates that it is in "automatic control mode." In addition, the status display field G22 indicates that a team A quick attack state M513 has been detected as the offensive and defensive state.

 速攻状態M513、M523が検出されると、図21では、フィールドマップG10上に、ボールの予測される軌跡を示す矢印G15、および軌跡に基づく移動後のボールG16が表示される。また、撮影用カメラ141が推移する撮影方向の変化を示す矢印G17が表示される。この構成によれば、自動制御モードにおいてもドローン100の動作の理由がユーザにとって明確であり、ユーザに安心感を与えることができる。 When a fast attack state M513 or M523 is detected, in FIG. 21, an arrow G15 indicating the predicted trajectory of the ball and the ball G16 after movement based on the trajectory are displayed on the field map G10. Also, an arrow G17 indicating the change in the shooting direction of the shooting camera 141 is displayed. With this configuration, the reason for the operation of the drone 100 is clear to the user even in automatic control mode, providing the user with a sense of security.

[A-2.本実施形態の効果]
 本実施形態によれば、撮影作業を省力化するとともに、撮影対象の状況に応じた適切な撮影が可能である。
[A-2. Effects of this embodiment]
According to this embodiment, it is possible to reduce the labor required for photographing and to photograph an object appropriately according to the situation of the object.

 なお、本発明は、上記実施形態に限らず、本明細書の記載内容に基づき、種々の構成を採り得ることはもちろんである。 The present invention is not limited to the above embodiment, and various configurations can be adopted based on the contents of this specification.

 上記実施形態に関連した説明した一連の処理は、ソフトウェア、ハードウェア並びにソフトウェア及びハードウェアの組合せのいずれを用いて実現されてもよい。本実施形態に係るサーバ300の各機能を実現するためのコンピュータプログラムを作製し、PC等に実装することが可能である。また、このようなコンピュータプログラムが格納された、コンピュータで読み取り可能な記録媒体も提供することができる。記録媒体は、例えば、磁気ディスク、光ディスク、光磁気ディスク、フラッシュメモリ等である。また、上記のコンピュータプログラムは、記録媒体を用いずに、例えば通信ネットワーク400を介して配信されてもよい。 The series of processes described in relation to the above embodiment may be implemented using software, hardware, or a combination of software and hardware. A computer program for implementing each function of the server 300 according to this embodiment may be created and implemented in a PC or the like. A computer-readable recording medium on which such a computer program is stored may also be provided. Examples of the recording medium include a magnetic disk, an optical disk, a magneto-optical disk, and a flash memory. The above computer program may also be distributed, for example, via the communication network 400, without using a recording medium.

 上記実施形態で用いたフローチャートに関し、必ずしも図示された順序で実行されなくてもよい。いくつかの処理ステップは、並列的に実行されてもよい。また、追加的な処理ステップが採用されてもよく、一部の処理ステップが省略されてもよい。 The flowcharts used in the above embodiments do not necessarily have to be executed in the order shown. Some processing steps may be executed in parallel. In addition, additional processing steps may be employed, and some processing steps may be omitted.

1     空中撮影システム
100   ドローン(移動体)
 141  撮影用カメラ
200   操縦器
 220  入力制御部
300   サーバ
320   イベント検出部
330   飛行モード切替部
 331  モード切替判定部
380   記憶部
F     競技場
 F100 コート
 F200 コート外領域
M102  外縁飛行モード
M105  コート内飛行モード
1 Aerial photography system
100 Drone (mobile body)
141 Shooting camera
200 Controller
220 Input control unit
300 Server
320 Event detection unit
330 Flight mode switching unit
331 Mode switching determination unit
380 Storage unit
F Stadium
F100 Court
F200 Area outside the court
M102 Outer edge flight mode
M105 Inner court flight mode

Claims (15)

 対象エリアを飛行する移動体と、
 前記移動体に搭載される、前記対象エリアを撮影するカメラと、
 前記カメラにより取得される撮影画像又は外部システムからの入力に基づいてイベントを検出するイベント検出部と、
 検出される前記イベントに応じて、前記移動体の目標撮影位置および目標撮影方向の少なくともいずれかを含む撮影条件を決定する撮影条件決定部と、
を備える、
 空中撮影システム。
 
A moving object flying in a target area;
A camera mounted on the moving object for photographing the target area;
an event detection unit that detects an event based on a captured image acquired by the camera or an input from an external system;
an image capturing condition determination unit that determines image capturing conditions including at least one of a target image capturing position and a target image capturing direction of the moving object in response to the detected event;
Equipped with
Aerial photography system.
 前記撮影条件決定部は、検出される前記イベントの種類に応じて、前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
the photographing condition determination unit determines the photographing condition according to a type of the detected event.
2. The aerial photography system according to claim 1.
 前記目標撮影方向は、前記移動体の機首方向および前記カメラの前記移動体に対する角度の少なくともいずれかの制御により達成される、
請求項1記載の空中撮影システム。
 
The target shooting direction is achieved by controlling at least one of a nose direction of the moving body and an angle of the camera relative to the moving body.
2. The aerial photography system according to claim 1.
 前記撮影条件は、前記カメラの目標ズーム量を含む、
請求項1記載の空中撮影システム。
 
The photographing conditions include a target zoom amount of the camera.
2. The aerial photography system according to claim 1.
 前記撮影条件決定部は、前記移動体の操縦器又は前記外部システムから前記撮影条件の入力を受け付けた場合には、前記イベント検出部により前記イベントを検出している場合であっても、当該前記操縦器又は前記外部システムを介して受け付けられる操作に基づいて前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
When the photographing condition determination unit receives an input of the photographing conditions from a controller of the moving body or the external system, even if the event detection unit detects the event, the photographing condition determination unit determines the photographing conditions based on an operation received via the controller or the external system.
2. The aerial photography system according to claim 1.
 ユーザによる前記撮影条件の入力を受け付ける操縦器をさらに備え、
 前記撮影条件決定部は、前記イベント検出部が前記イベントを検出していない場合には、前記操縦器を介する入力に基づいて前記撮影条件を決定し、前記イベント検出部が前記イベントを検出した場合には、当該イベントに基づいて前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
Further comprising a controller for receiving input of the photographing conditions by a user,
The photographing condition determination unit determines the photographing condition based on an input via the controller when the event detection unit does not detect the event, and determines the photographing condition based on the event when the event detection unit detects the event.
2. The aerial photography system according to claim 1.
 前記移動体を複数備え、1個の前記対象エリアに複数の前記移動体を同時に飛行させることで当該対象エリアを撮影する空中撮影システムであって、
 前記撮影条件決定部は、複数の前記移動体にそれぞれ異なる前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
An aerial photography system including a plurality of the moving bodies, the aerial photography system photographing one target area by flying the plurality of the moving bodies simultaneously in the target area,
The photographing condition determination unit determines different photographing conditions for each of the plurality of moving objects.
2. The aerial photography system according to claim 1.
 前記撮影条件決定部は、同時に飛行する複数の前記移動体に対し、互いに同一の撮影範囲を互いに異なる前記目標撮影位置から撮影する前記撮影条件、又は前記同一の撮影範囲を含むエリアを互いに異なるズーム量で撮影する前記撮影条件を設定する、
請求項7記載の空中撮影システム。
 
The photographing condition determination unit sets, for the plurality of moving bodies flying simultaneously, the photographing condition for photographing the same photographing range from the different target photographing positions, or the photographing condition for photographing an area including the same photographing range with different zoom amounts,
8. The aerial photography system according to claim 7.
 前記撮影条件決定部は、前記イベント検出部がイベントの検出結果として予測したボールの軌跡の予測結果に応じて、前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
the photographing condition determination unit determines the photographing condition in accordance with a prediction result of a ball trajectory predicted by the event detection unit as a detection result of the event.
2. The aerial photography system according to claim 1.
 前記撮影条件決定部は、前記対象エリアで行われる競技において反則が発生した旨のイベントが検出された場合に、前記競技で使用されるボール又は前記競技の審判の地点周辺を撮影範囲とする前記撮影条件を決定する、
請求項1記載の空中撮影システム。
 
the photographing condition determination unit, when an event indicating that a foul has occurred in a game held in the target area is detected, determines the photographing condition to set a photographing range around a ball used in the game or a position of a referee of the game;
2. The aerial photography system according to claim 1.
 前記移動体を複数備え、1個の前記対象エリアに複数の前記移動体を同時に飛行させることで当該対象エリアを撮影する空中撮影システムであって、
 前記撮影条件決定部は、前記競技において前記反則が発生した旨のイベントが検出された場合に、複数の前記移動体により、互いに異なる前記目標撮影位置、前記目標撮影方向又はズーム量で、前記ボール又は前記競技の審判の地点周辺を撮影範囲とする前記撮影条件を決定する、
請求項10記載の空中撮影システム。
 
An aerial photography system including a plurality of the moving bodies, the aerial photography system photographing one target area by flying the plurality of the moving bodies simultaneously in the target area,
the photographing condition determination unit, when an event indicating that the foul has occurred in the game is detected, determines the photographing conditions in which the photographing range is the periphery of the ball or a point of a referee of the game at the target photographing positions, the target photographing directions, or the zoom amounts different from one another by the multiple moving objects;
The aerial imaging system according to claim 10.
 前記移動体の飛行経路を生成する飛行経路生成部を備え、
 前記飛行経路生成部は、前記撮影画像により検出される前記イベントに基づいて決定される前記目標撮影位置までの前記飛行経路を自動生成する、
請求項1記載の空中撮影システム。
 
A flight path generating unit that generates a flight path of the moving object,
The flight path generation unit automatically generates the flight path to the target shooting position that is determined based on the event detected from the captured image.
2. The aerial photography system according to claim 1.
 前記飛行経路生成部は、前記対象エリア内に構成されるコート内における前記飛行経路を生成し、
 前記飛行経路生成部は、あらかじめ設定された複数の撮影位置を接続して前記目標撮影位置までの前記飛行経路を生成し、前記イベントの検出状況に応じて、接続する前記撮影位置を変更する、
請求項12記載の空中撮影システム。
 
The flight path generation unit generates the flight path within a court configured within the target area,
the flight path generation unit generates the flight path to the target shooting position by connecting a plurality of shooting positions set in advance, and changes the shooting positions to be connected depending on the detection status of the event.
13. The aerial imaging system according to claim 12.
 対象エリアを撮影するカメラにより取得される撮影画像又は外部システムからの入力に基づいてイベントを検出するイベント検出ステップと、
 検出される前記イベントに応じて、前記カメラを搭載する移動体の目標撮影位置および目標撮影方向の少なくともいずれかを含む撮影条件を決定する撮影条件決定ステップと、
を備える、
 空中撮影方法。
 
An event detection step of detecting an event based on an image captured by a camera capturing an image of a target area or an input from an external system;
a photographing condition determination step of determining photographing conditions including at least one of a target photographing position and a target photographing direction of a moving body equipped with the camera in response to the detected event;
Equipped with
Aerial photography method.
 対象エリアを撮影するカメラにより取得される撮影画像又は外部システムからの入力に基づいてイベントを検出するイベント検出命令と、
 検出される前記イベントに応じて、前記カメラを搭載する移動体の目標撮影位置および目標撮影方向の少なくともいずれかを含む撮影条件を決定する撮影条件決定命令と、
をコンピュータにより実行させる、
 空中撮影プログラム。
 

 
An event detection command for detecting an event based on an image captured by a camera capturing an image of a target area or an input from an external system;
a photographing condition determination command for determining photographing conditions including at least one of a target photographing position and a target photographing direction of a moving body equipped with the camera in response to the detected event;
Executing the above by a computer,
Aerial photography program.


PCT/JP2022/036120 2022-09-28 2022-09-28 Aerial imaging system, aerial imaging method, and aerial imaging program Ceased WO2024069789A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/036120 WO2024069789A1 (en) 2022-09-28 2022-09-28 Aerial imaging system, aerial imaging method, and aerial imaging program
JP2024548908A JPWO2024069789A1 (en) 2022-09-28 2022-09-28

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/036120 WO2024069789A1 (en) 2022-09-28 2022-09-28 Aerial imaging system, aerial imaging method, and aerial imaging program

Publications (1)

Publication Number Publication Date
WO2024069789A1 true WO2024069789A1 (en) 2024-04-04

Family

ID=90476657

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036120 Ceased WO2024069789A1 (en) 2022-09-28 2022-09-28 Aerial imaging system, aerial imaging method, and aerial imaging program

Country Status (2)

Country Link
JP (1) JPWO2024069789A1 (en)
WO (1) WO2024069789A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017011469A (en) * 2015-06-22 2017-01-12 カシオ計算機株式会社 Photographing device, photographing method, and program
WO2017057157A1 (en) * 2015-09-30 2017-04-06 株式会社ニコン Flight device, movement device, server, and program
US20180129212A1 (en) * 2016-11-09 2018-05-10 Samsung Electronics Co., Ltd. Unmanned aerial vehicle and method for photographing subject using the same
JP2019161486A (en) * 2018-03-14 2019-09-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Dynamic body detection device, control device, moving body, dynamic body detection method, and program
JP2020115642A (en) * 2016-02-03 2020-07-30 ソニー株式会社 System and method for capturing stationary scene and/or moving scene by using plural camera networks


Also Published As

Publication number Publication date
JPWO2024069789A1 (en) 2024-04-04

Similar Documents

Publication Publication Date Title
US12416918B2 (en) Unmanned aerial image capture platform
US12443186B2 (en) Fitness and sports applications for an autonomous unmanned aerial vehicle
US11755041B2 (en) Objective-based control of an autonomous unmanned aerial vehicle
US10816967B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
US10377484B2 (en) UAV positional anchors
US10336469B2 (en) Unmanned aerial vehicle movement via environmental interactions
US10357709B2 (en) Unmanned aerial vehicle movement via environmental airflow
JP6816156B2 (en) Systems and methods for adjusting UAV orbits
CN107087427A (en) Aircraft control method, device and equipment, and aircraft
US20220137647A1 (en) System and method for operating a movable object based on human body indications
CN110325939A (en) System and method for operating unmanned vehicle
CN107077152A (en) Control method, device, system, unmanned aerial vehicle and mobile platform
US12007763B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
CN108268050A (en) Motion control device wears display equipment, unmanned plane and flight system
WO2024069789A1 (en) Aerial imaging system, aerial imaging method, and aerial imaging program
WO2024069788A1 (en) Mobile body system, aerial photography system, aerial photography method, and aerial photography program
WO2024069790A1 (en) Aerial photography system, aerial photography method, and aerial photography program
WO2024189898A1 (en) Imaging system, imaging method, and imaging program
WO2024166318A1 (en) Imaging system, imaging method, and imaging program
JP7777368B2 (en) Flight control system and flight control method
WO2024252444A1 (en) Determination system, determination method, and determination program
WO2023238208A1 (en) Aerial photography system, aerial photography method, and aerial mobile body management device
WO2024180639A1 (en) Imaging system, imaging method, moving body control device, and program
WO2024018643A1 (en) Imaging system, imaging method, imaging control device and program
US12498714B2 (en) Systems and methods for UAV flight control

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22960852
    Country of ref document: EP
    Kind code of ref document: A1
WWE Wipo information: entry into national phase
    Ref document number: 2024548908
    Country of ref document: JP
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22960852
    Country of ref document: EP
    Kind code of ref document: A1