WO2025177925A1 - Information processing device, information processing method, and information processing system - Google Patents
Information processing device, information processing method, and information processing system
- Publication number
- WO2025177925A1 (PCT/JP2025/004674, JP2025004674W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- camera
- unit
- photographing
- information processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/36—Other airport installations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C7/00—Tracing profiles
- G01C7/02—Tracing profiles of land surfaces
- G01C7/04—Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/22—Command input arrangements
- G05D1/221—Remote-control arrangements
- G05D1/222—Remote-control arrangements operated by humans
- G05D1/224—Output arrangements on the remote controller, e.g. displays, haptics or speakers
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/247—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
- G05D1/248—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons generated by satellites, e.g. GPS
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/40—Control within particular dimensions
- G05D1/46—Control of position or course in three dimensions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
Definitions
- This technology relates to an information processing device, an information processing method, and an information processing system.
- Prior art includes technology for creating point clouds and orthoimages representing terrain from multiple pieces of image data captured by a camera mounted on a mobile object such as a drone. Technology has also been proposed for re-photographing a subject that could not be captured successfully by such a mobile object (Patent Document 1).
- The second technology is an information processing method that displays, on a display unit, geographic information created based on images captured by a camera carried by a moving object, associates re-photographing information for re-photographing with the geographic information based on input from the user, and creates a movement plan for the moving object to re-photograph with the camera based on the re-photographing information.
- The third technology is an information processing system comprising a geographic information creation device that creates geographic information based on images captured by a camera mounted on a moving object, and an information processing device that includes a display processing unit that displays the geographic information on a display unit, an association unit that associates, with the geographic information, re-photographing information for re-photographing with the camera based on input from a user, and a movement plan creation unit that creates a movement plan for the moving object to re-photograph with the camera based on the re-photographing information.
- a geographic information creation device that creates geographic information based on images captured by a camera mounted on a moving object
- a display processing unit that displays the geographic information on a display unit
- an association unit that associates, with the geographic information, re-photographing information for re-photographing with the camera based on input from a user
- a movement plan creation unit that creates a movement plan for the moving object to re-photograph with the camera based on the re-photographing information
- FIG. 10 is an explanatory diagram of re-photographing information associated with geographic information.
- FIG. 1 is an explanatory diagram of oblique photography.
- FIG. 10 is an explanatory diagram of association of re-photographing information with an orthoimage.
- FIG. 1 is an explanatory diagram of a movement plan.
- FIG. 10 is an explanatory diagram of the reliability of geographic information. This is an example of how importance is displayed when using this technology in as-built management.
- The information processing system 1000 includes a mobile object 100, a geographic information creation device 200, and an information processing device 300.
- Wired connection methods include, for example, HDMI (registered trademark) (High-Definition Multimedia Interface) and USB (Universal Serial Bus), while wireless connection methods include, for example, Wi-Fi, wireless LAN (Local Area Network), networks such as 4G (fourth generation mobile communication system) and 5G (fifth generation mobile communication system), Bluetooth (registered trademark), NFC (Near Field Communication), and Ethernet (registered trademark).
- The mobile object 100 is an unmanned aerial vehicle such as a drone that flies in the sky.
- The mobile object 100 is equipped with a camera 105, and can perform automatic flight and automatic photographing based on a pre-set movement route (pre-movement route) or a movement plan created by the information processing device 300.
- The pre-set movement route can be set using existing mobile object control software, etc.
- When performing automatic flight and automatic photography, the route information, photography position, photography direction, photography timing, etc. are set in advance, and the mobile body 100 performs flight control and photography control in accordance with these settings.
- The mobile body 100 flies above the photography range, i.e., the range for which geographic information is to be created (geographic information creation range), and photographs the ground surface with the camera 105 while flying.
- The mobile body 100 also obtains camera position information, which is the position of the camera 105 at the time of photography, and transmits the image and camera position information to the geographic information creation device 200 (see the sketch below).
- The mobile body 100 may also be capable of being flown manually by a pilot.
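- The following is a minimal sketch of such a periodic photographing loop that records the camera position at the moment of each shot. The `camera` and `gnss` objects and their methods are hypothetical placeholders used only for illustration; they are not part of this disclosure.

```python
# Minimal sketch (hypothetical camera/GNSS interfaces): photograph at a fixed
# interval while recording the camera position at the moment of each shot.
import time

def capture_with_positions(camera, gnss, interval_s=2.0, duration_s=600.0):
    """Return a list of (image, camera_position) pairs taken during the flight."""
    records = []
    end_time = time.time() + duration_s
    while time.time() < end_time:
        image = camera.capture()          # assumed: returns one still image
        position = gnss.read_position()   # assumed: (latitude, longitude, altitude)
        records.append((image, position))
        time.sleep(interval_s)            # predetermined photographing interval
    return records
```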
- The geographic information creation device 200 creates geographic information for the geographic information creation range based on images captured by the camera 105 of the mobile object 100.
- The geographic information may be a three-dimensional point cloud, orthoimages, etc.
- Actuator 102 is a drive source for driving the rotor blades, and is provided at the tip of the rotor blade support shaft, etc. Actuator 102 operates under the control of UAV control unit 101.
- Camera 105 can capture a subject and obtain an image as digital data. Camera 105 may also have the function of performing AI-based processing on the image. Examples of this processing include image recognition processing and image detection processing.
- Camera 105 is mounted, for example, on the bottom of the housing of mobile body 100 so as to be suspended via gimbal 103 in order to capture images of the ground surface within the geographic information creation range. By driving gimbal 103, camera 105 is able to point its lens in any direction from 360 degrees horizontally to vertically downwards to capture images. Note that camera 105 may be mounted on mobile body 100 in any manner as long as it is able to point its lens in any direction from 360 degrees horizontally to vertically downwards to capture images.
- The CPU 301, ROM 302, RAM 303, and non-volatile memory unit 304 are interconnected via a bus.
- The bus is also connected to an input/output interface 305.
- These functions may be implemented by circuitry or processing circuitry including general-purpose processors, application-specific processors, integrated circuits, ASICs (Application Specific Integrated Circuits), CPUs (Central Processing Units), conventional circuits, and/or combinations thereof, programmed to provide the described functions.
- Processors include transistors and other circuits and are considered to be circuitry or processing circuitry.
- A processor may also be a programmed processor that executes a program stored in memory.
- A circuit, unit, or means is hardware that is programmed to provide, or that executes, the described functions.
- The hardware may be any hardware disclosed herein or any hardware that is programmed to provide, or is known to execute, the described functions.
- The circuitry, means, or unit may also be a combination of hardware and software, the software being used to configure the hardware and/or the processor.
- In step S11, the mobile object 100 moves (flies) above the geographic information creation range based on a pre-created preliminary movement route, and photographs are taken by the camera 105. While the mobile object 100 is moving along the pre-created movement route, the camera 105 continues to take photographs repeatedly at predetermined intervals.
- The position information acquisition unit 106 acquires the camera position information at the time of image capture.
- The mobile object 100 then transmits the image and camera position information to the geographic information creation device 200.
- In step S13, the geographic information creation unit 202 of the geographic information creation device 200 creates geographic information based on the images.
- The geographic information creation device 200 transmits the geographic information, camera position information, and camera attitude information to the information processing device 300.
- The anomaly detection unit 322 of the information processing device 300 detects an anomaly in the geographic information and supplies information on the location where the anomaly exists in the geographic information (anomaly location information) to the display processing unit 323 as the anomaly detection result.
- Anomalies in a three-dimensional point cloud as geographic information can be detected, for example, by measuring the distance between each point that makes up the point cloud and determining whether that distance is equal to or greater than a predetermined threshold.
- If the point cloud is created in advance so as to satisfy a predetermined density, areas where the density of the created point cloud is equal to or less than that predetermined density can be detected as an anomaly.
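- A minimal sketch of the distance-based check above, assuming the point cloud is an (N, 3) NumPy array; the use of SciPy's cKDTree and the `max_gap` threshold are illustrative assumptions, not the disclosed implementation. Clusters of flagged points correspond to low-density areas.

```python
# Flag points whose nearest neighbour is farther than a threshold; such points
# indicate sparse (potentially anomalous) regions of the point cloud.
import numpy as np
from scipy.spatial import cKDTree

def detect_sparse_points(points: np.ndarray, max_gap: float) -> np.ndarray:
    """Return a boolean mask over `points` marking likely anomaly locations."""
    tree = cKDTree(points)
    # k=2 because the closest hit of each query is the point itself (distance 0).
    distances, _ = tree.query(points, k=2)
    return distances[:, 1] >= max_gap
```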
- In step S15, the display processing unit 323 of the information processing device 300 processes the geographic information to display it on the display unit 307.
- The geographic information displayed on the display unit 307 is as shown in Figure 7.
- The geographic information is a three-dimensional point cloud.
- The ground surface photographed by the camera 105 includes the ground, roads, embankments, buildings, and trees.
- This configuration of the ground surface is merely an example, and the present technology is not limited to such a configuration of the ground surface.
- The display processing unit 323 displays on the display unit 307 the camera position information at the time of shooting, which was acquired in synchronization with the shooting of the images used to create the geographic information.
- The camera position information is displayed as a point indicating the position on the geographic information. This allows the user to easily understand the position of the camera 105 at which each image was taken.
- The display processing unit 323 also displays, in addition to the geographical information, the anomaly location information, which is the anomaly detection result, on the display unit 307. This allows the user to understand where anomalies are in the geographical information, identify locations or areas within the geographic information creation range that require re-photographing, and input information to be associated as re-photographing information.
- Camera position information, camera attitude information, the movement path, and anomaly location information are displayed along with the geographic information, but it is not necessary to display all of this information at the same time.
- The user may be able to select which information to display. The display or non-display of the selected information may also be switched depending on selection input from the user.
- Geographical information can also be displayed in 3D.
- The geographical information displayed on the display unit 307 will be as shown in Figure 8.
- The geographical information in Figure 8 is also a three-dimensional point cloud.
- Camera position information, camera attitude information, the movement path, and anomaly position information are displayed superimposed on the geographical information, just as in the 2D display of Figure 7. The user may be able to select and switch between 2D and 3D display.
- In step S16, the association unit 324 of the information processing device 300 performs a process of associating the re-photographing information with the geographic information based on the user's input results.
- The re-photographing information associated with the geographic information based on the input results shown in FIG. 9 is as shown in FIG. 10.
- In step S17, the coordinate conversion unit 325 converts the designated route, priority areas, and unnecessary areas indicated in the re-photographing information associated with the geographic information from the site coordinate system to the drone coordinate system.
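- The form of this conversion is not specified here; the sketch below assumes the site and drone coordinate systems differ by a known rigid transform (rotation matrix R and translation t), which is a common situation in practice.

```python
# Convert site-coordinate points to the drone coordinate system under an
# assumed rigid transform: x_drone = R @ x_site + t.
import numpy as np

def site_to_drone(points_site: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Apply the rotation R and offset t to (N, 3) site-coordinate points."""
    return points_site @ R.T + t
```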
- The movement plan creation unit 326 creates a movement plan based on the re-photographing information associated with the geographic information.
- The movement plan is created so that the mobile object 100 moves along the designated route in the designated route information and through the priority areas in the priority area information, and does not move through the unnecessary areas in the unnecessary area information.
- The movement plan is also created as a route that connects the designated route and the priority areas via the shortest distance.
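- As one way to picture the shortest-distance connection, the sketch below visits targets (priority areas and designated-route endpoints, represented as 2D points) in greedy nearest-neighbour order from the start position; it is an illustrative heuristic, not the planner itself.

```python
# Greedy nearest-neighbour ordering of re-photographing targets from a start point.
import numpy as np

def order_targets(start: np.ndarray, targets: list[np.ndarray]) -> list[int]:
    """Return indices of `targets` in the order a greedy shortest-hop tour visits them."""
    remaining = list(range(len(targets)))
    order, current = [], start
    while remaining:
        nearest = min(remaining, key=lambda i: np.linalg.norm(targets[i] - current))
        order.append(nearest)
        current = targets[nearest]
        remaining.remove(nearest)
    return order
```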
- Movement plans can be created using multiple movement methods. These methods include round-trip movement, up-and-down movement, and movement at reduced speed.
- The object photographed by up-and-down movement may be a surface perpendicular to the ground, or any object higher than the ground surface, such as terrain or a building, even if it is not perpendicular.
- Slower speed movement is a method of capturing images by slowing the speed of the moving body 100 moving above the priority area compared to when moving above other areas. This increases the number of times images are captured, increasing the image overlap rate and improving the accuracy of the geographic information, allowing for the creation of geographic information without any abnormalities.
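- To make the overlap argument concrete, the sketch below computes the forward overlap between consecutive images from the movement speed, the photographing interval, and an assumed ground footprint length; all numbers are illustrative.

```python
# Forward overlap between consecutive images at a fixed photographing interval:
# overlap = 1 - (distance travelled between shots) / (ground footprint length).
def forward_overlap(speed_mps: float, interval_s: float, footprint_m: float) -> float:
    """Fraction of each image that overlaps the next one, clamped to [0, 1]."""
    advance_m = speed_mps * interval_s
    return max(0.0, min(1.0, 1.0 - advance_m / footprint_m))

# Example: at a 2 s interval with a 50 m footprint, slowing from 10 m/s to
# 5 m/s raises the forward overlap from 0.6 to 0.8.
```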
- The method for moving through a priority area and taking photographs may be set in advance by the user, or the information processing device 300 may decide it automatically using an algorithm. Furthermore, the method is not limited to one; a movement plan may be created by combining multiple methods. For example, photographs may be taken while moving back and forth at reduced speed.
- The start position of the moving body 100 in the movement plan may be the current position of the moving body 100, or any position specified by the user. If the user specifies the movement start position, it is preferable that the user be able to input it along with the re-photographing information. If the user does not specify a position, it is preferable that the current position of the moving body 100, or a position set in advance as a default, be used as the movement start position.
- The moving body 100 first moves from the movement start position to priority area A, and moves back and forth so as to focus on photographing within priority area A. Next, the moving body 100 moves from priority area A to one end of the designated route, moves along the designated route, and then moves from the other end of the designated route to priority area B. The moving body 100 then moves so as to focus on photographing within priority area B. Because priority area B is an area with height, such as a wall, the moving body 100 moves up and down within priority area B to photograph it. The movement of the moving body 100 then ends.
- The movement plan creation unit 326 creates a movement plan based on the re-photographing information associated with the geographical information. Therefore, if the only re-photographing information associated with the geographical information is designated route information, the movement plan creation unit 326 creates a movement plan that reflects only that designated route information. Likewise, if the only re-photographing information associated with the geographical information is priority area information, the movement plan creation unit 326 creates a movement plan that reflects only that priority area information.
- The user may be able to select, from that information, the re-photographing information to be reflected in the movement plan.
- The movement plan can also be adjusted or changed.
- Multiple different movement plans can be created based on the re-photographing information associated with the geographic information.
- If the user does not input the movement speed and camera attitude in the designated route information, they are automatically set to a predetermined movement speed and camera attitude.
- A notification prompting the user to input the movement speed and camera attitude may also be displayed on the display unit 307.
- High-priority designated route information or priority area information is reflected in the movement plan regardless of the remaining battery charge of the mobile object 100.
- Low-priority designated route information or priority area information is reflected in the movement plan only if the remaining battery charge of the mobile object 100 is equal to or greater than a predetermined amount.
- The movement plan creation unit 326 may also create a movement plan based on the time required for photographing. For example, a photographing time such as 30 minutes is set in advance, and a movement plan is created that prioritizes re-photographing information set at positions close to the movement start position of the moving body 100, so that the time it takes the moving body 100 to travel the route in the movement plan is within 30 minutes. Re-photographing information that does not fit within the 30 minutes is not included in the movement plan.
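- The battery and time-budget rules above can be combined into a simple selection step, sketched below with hypothetical item fields, costs, and thresholds; the actual plan creation is not limited to this logic.

```python
# Pick re-photographing items under a battery rule (low-priority items need
# enough remaining charge) and a total-time budget, preferring items close to
# the movement start position.
from dataclasses import dataclass

@dataclass
class RephotoItem:
    name: str
    high_priority: bool
    distance_from_start_m: float
    flight_time_min: float  # estimated time needed to cover this item

def select_items(items, battery_pct, battery_threshold_pct, time_budget_min):
    """Return the items to reflect in the movement plan."""
    eligible = [i for i in items
                if i.high_priority or battery_pct >= battery_threshold_pct]
    eligible.sort(key=lambda i: i.distance_from_start_m)  # closer items first
    plan, used_min = [], 0.0
    for item in eligible:
        if used_min + item.flight_time_min <= time_budget_min:
            plan.append(item)
            used_min += item.flight_time_min
    return plan
```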
- The information processing device 300 can obtain weather information from the Internet, etc., in order to create the movement plan.
- The movement plan created by the movement plan creation unit 326 is transmitted from the information processing device 300 to the mobile body 100.
- In step S19, the mobile body 100 moves above the geographic information creation range based on the movement plan and re-photographs using the camera 105.
- The UAV control unit 101 of the mobile body 100 controls the output of the actuator 102 while comparing the current position of the mobile body 100 with the movement plan, thereby controlling the mobile body 100 to move in accordance with the movement plan.
- The gimbal control unit 104 controls the gimbal 103 based on the movement plan to adjust the attitude of the camera 105 and take images. This makes it possible to re-photograph based on the movement plan.
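- A minimal sketch of this control loop is shown below: the current position is compared with each waypoint in the movement plan and the gimbal pitch is set per waypoint. The `drone` and `gimbal` interfaces and their methods are hypothetical.

```python
# Follow a movement plan given as (position_xyz, camera_pitch_deg) waypoints.
import numpy as np

def follow_plan(drone, gimbal, waypoints, tolerance_m=1.0):
    """Fly through the waypoints, re-photographing at each one."""
    for target_xyz, camera_pitch_deg in waypoints:
        gimbal.set_pitch(camera_pitch_deg)              # adjust camera attitude
        while True:
            current = np.asarray(drone.position())      # assumed: returns (x, y, z)
            error = np.asarray(target_xyz) - current
            if np.linalg.norm(error) < tolerance_m:
                break                                   # waypoint reached
            drone.set_velocity(error / np.linalg.norm(error))  # head toward target
        drone.capture_image()                           # re-photograph at the waypoint
```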
- The mobile body 100 then transmits the images acquired during re-photographing to the geographic information creation device 200.
- In step S20, the geographic information creation unit 202 of the geographic information creation device 200 recreates the geographic information based on the images.
- The flowchart in Figure 6 shows re-photographing based on a movement plan and the re-creation of geographic information using the images obtained from that re-photographing.
- In this way, the user can reliably recognize anomalies in the geographical information and create a movement plan to resolve them.
- This technology is unique in that it displays geographic information and associates it with re-photography information, so its implementation in other companies' products can be verified by checking the GUI display of those products.
- The number of moving bodies 100 moving based on the movement plan is not limited to one; there may be multiple. In other words, movement and re-photographing based on the movement plan may be performed using multiple moving bodies 100. If there are multiple moving bodies 100, the re-photographing information may be separated by type of information and a movement plan created for each moving body 100, or one movement plan may be created and divided among the moving bodies 100.
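- One simple way to divide the work among several moving bodies is a round-robin split of the re-photographing items, sketched below; splitting by type of information would use a grouping key instead. This is an illustrative partition, not the disclosed scheme.

```python
# Deal re-photographing items out round-robin so each drone gets a similar share.
def split_plan(items: list, num_drones: int) -> list[list]:
    """Return one item list per moving body."""
    plans = [[] for _ in range(num_drones)]
    for index, item in enumerate(items):
        plans[index % num_drones].append(item)
    return plans
```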
- A movement plan may be created for a portion of the geographic information creation range, and that portion may be re-photographed based on that movement plan.
- Alternatively, a movement plan may be created for a portion of the geographic information creation range, and the entire geographic information creation range may be re-photographed based on the movement plan for that portion.
- The reliability of geographic information decreases around objects of a certain height, such as building walls. For example, if a blind spot occurs due to a tall object, as shown in Figure 14, the overlap rate of the multiple images captured by the camera 105 of the mobile object 100 decreases, and the reliability of the three-dimensional point cloud decreases.
- The reliability of the 3D point cloud can thus be considered low around tall objects, such as building walls, due to the reduced overlap rate caused by blind spots and the influence of changing shadows.
- These low-reliability areas may be calculated from the position of the sun or the attitude of the camera 105, and displayed together with the geographic information on the display unit 307.
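- As one concrete example of using the sun position, the ground shadow cast by a vertical object is roughly its height divided by the tangent of the sun elevation; the strip of that length next to the object could be marked as a low-reliability area. The geometry below is a simplified assumption.

```python
# Approximate ground-shadow length of a vertical object from the sun elevation.
import math

def shadow_length_m(object_height_m: float, sun_elevation_deg: float) -> float:
    """Length of the shadow cast on flat ground by a vertical object."""
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

# Example: a 10 m wall with the sun at 30 degrees elevation casts a shadow of
# roughly 17.3 m, which could be displayed as a low-reliability area.
```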
- A user viewing this display can input re-photographing information so that the low-reliability areas are associated with re-photographing information.
- The movement plan creation unit 326 then creates a movement plan based on the re-photographing information, making it possible to create highly accurate geographic information even for the low-reliability areas.
- The importance can be determined based on the results of terrain recognition within the geographic information creation range, the results of object detection within the photographing range, the overlap rate of the captured images, the density of the point cloud created as geographic information, and the photographing camera position information (information on whether it is in the center or periphery of the area).
- The importance can be determined based on any one of these factors, or on a combination of several of them.
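- A minimal sketch of combining several of these factors into one importance value as a weighted sum; the factor names and weights are assumptions for illustration only, and any subset of factors could be used.

```python
# Weighted combination of normalised importance factors (each in [0, 1]).
def importance_score(factors: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weighted sum of the provided factor values."""
    return sum(weights.get(name, 0.0) * value for name, value in factors.items())

# Example with hypothetical factor values and weights:
# importance_score(
#     {"overlap_deficit": 0.7, "point_density_deficit": 0.4, "edge_of_area": 1.0},
#     {"overlap_deficit": 0.5, "point_density_deficit": 0.3, "edge_of_area": 0.2},
# )  # -> 0.67
```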
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Navigation (AREA)
Abstract
The present invention relates to an information processing device comprising: a display processing unit that displays, on a display unit, geographic information created based on images captured by a camera disposed on a mobile body; an association unit that associates re-photographing information for re-photographing by the camera with the geographic information based on input from a user; and a movement plan creation unit that creates a movement plan for the mobile body to perform the re-photographing by means of the camera based on the re-photographing information.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024024312 | 2024-02-21 | ||
| JP2024-024312 | 2024-02-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025177925A1 (fr) | 2025-08-28 |
Family
ID=96847047
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/004674 (WO2025177925A1, fr, pending) | Information processing device, information processing method, and information processing system | 2024-02-21 | 2025-02-13 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025177925A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180109767A1 (en) * | 2015-02-13 | 2018-04-19 | Unmanned Innovation, Inc. | Unmanned aerial vehicle sensor activation and correlation system |
| JP2020043543A (ja) * | 2018-09-13 | 2020-03-19 | SZ DJI Technology Co., Ltd. | Information processing device, flight path generation method, program, and recording medium |
| JP2022108823A (ja) * | 2021-01-14 | 2022-07-27 | 株式会社ロックガレッジ | Search support system and rescue support program |
- 2025-02-13: WO application PCT/JP2025/004674 (WO2025177925A1, fr), active, pending
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25758021; Country of ref document: EP; Kind code of ref document: A1 |