WO2019093692A1 - Method and electronic device for controlling an unmanned aerial vehicle comprising a camera - Google Patents
Method and electronic device for controlling an unmanned aerial vehicle comprising a camera
- Publication number
- WO2019093692A1 WO2019093692A1 PCT/KR2018/012625 KR2018012625W WO2019093692A1 WO 2019093692 A1 WO2019093692 A1 WO 2019093692A1 KR 2018012625 W KR2018012625 W KR 2018012625W WO 2019093692 A1 WO2019093692 A1 WO 2019093692A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- camera
- electronic device
- image
- aerial vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/22—Command input arrangements
- G05D1/221—Remote-control arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U40/00—On-board mechanical arrangements for adjusting control surfaces or rotors; On-board mechanical arrangements for in-flight adjustment of the base configuration
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/20—UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/20—Aircraft, e.g. drones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
Definitions
- Various embodiments of the present invention are directed to a method and an electronic device for navigating an unmanned aerial vehicle including a camera.
- An unmanned aerial vehicle may be an aircraft designed to fly in the air and perform an assigned mission without a pilot on board.
- the unmanned aerial vehicle may be referred to by various names, such as a drone or an unmanned aircraft system.
- the unmanned aerial vehicle can be wirelessly connected to the electronic device and controlled remotely by it.
- the unmanned aerial vehicle may have a processor mounted therein and communicate wirelessly with the electronic device.
- the electronic device can remotely control the movement and function of the unmanned aerial vehicle.
- the unmanned aerial vehicle may be equipped with a camera capable of three-dimensional modeling with respect to a nearby object, and may analyze the positions of the objects and fly so as not to collide with the objects.
- the unmanned aerial vehicle can move up and down (e.g., throttle), tilt back and forth (e.g., pitch), rotate left and right (e.g., yaw), and tilt side to side (e.g., roll) in three-dimensional space.
- the unmanned aerial vehicle can be connected wirelessly with the electronic device and can be controlled under the control of the electronic device.
- a user can control an unmanned aerial vehicle using an electronic device, but it can be extremely difficult for the user to steer the unmanned aerial vehicle to a desired position and orientation.
- images photographed through the camera can be displayed on the screen of the electronic device. Since the user must steer the unmanned aerial vehicle while gazing at the electronic device screen in order to photograph a desired subject, the convenience of shooting may be very low.
- Various embodiments of the present invention can perform a three-dimensional modeling function for at least one subject positioned in the vicinity of the unmanned aerial vehicle, and display on the electronic device screen an expected image corresponding to the movement of the unmanned aerial vehicle.
- Various embodiments of the present invention can provide a method of easily controlling an unmanned aerial vehicle so that a user can take a desired image based on an expected image displayed on the electronic device screen.
- An electronic device according to various embodiments includes a communication module, a display device, a memory, and a processor electrically connected to the communication module, the display device, and the memory. The processor may receive, through the communication module, three-dimensional modeling information obtained using at least one camera mounted on an unmanned aerial vehicle, display a virtual space implemented based on the received three-dimensional modeling information using the display device, detect a user input for controlling the unmanned aerial vehicle in the virtual space, and display an expected image of a point corresponding to the detected user input using the display device.
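- As a rough illustration of the control flow described above, the following sketch shows one way the electronic-device side could be structured; every class, function, and field name here is hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in metres (assumed units)

@dataclass
class ModelingInfo:
    """Hypothetical container for the three-dimensional modeling
    information received from the unmanned aerial vehicle."""
    subject_positions: List[Point]
    vehicle_position: Point

def render_virtual_space(info: ModelingInfo) -> None:
    # Stand-in renderer: a real implementation would draw icons on the
    # display device instead of printing.
    for i, pos in enumerate(info.subject_positions):
        print(f"subject icon {i} at {pos}")
    print(f"vehicle icon at {info.vehicle_position}")

def on_drag(info: ModelingInfo, drag_target: Point) -> None:
    # A drag on the vehicle icon selects the candidate camera position;
    # the expected image for that point would then be rendered.
    print(f"render expected image as seen from {drag_target}")

info = ModelingInfo(subject_positions=[(4.0, 0.0, 1.7)],
                    vehicle_position=(0.0, 0.0, 10.0))
render_virtual_space(info)
on_drag(info, (2.0, 3.0, 8.0))
```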
- An unmanned flight device according to various embodiments includes at least one camera, a communication module, and a processor electrically connected to the at least one camera and the communication module. The processor may obtain three-dimensional modeling information using the at least one camera, and transmit the obtained three-dimensional modeling information to an electronic device connected through the communication module.
- A method according to various embodiments includes wirelessly connecting an electronic device and an unmanned flight device, receiving from the unmanned flight device three-dimensional modeling information acquired using at least one camera mounted on the unmanned flight device, displaying a virtual space implemented based on the received three-dimensional modeling information, detecting a user input for controlling the unmanned flight device in the virtual space, and displaying an expected image of a point corresponding to the detected user input.
- Various embodiments of the present invention can model nearby objects in three dimensions using at least one camera mounted on an unmanned aerial vehicle, and can display a virtual space and an expected image implemented by the three-dimensional modeling.
- the electronic device can display an expected image to be photographed according to the movement of the unmanned flight device on the display device based on the three-dimensional modeling function, and can display a user interface capable of photographing a desired image. The user can easily manipulate the unmanned aerial vehicle based on the expected image.
- Various embodiments of the present invention may display an anticipated image and easily manipulate the unmanned aerial vehicle in response to the anticipated image, in order to capture a scene desired by the user.
- the convenience of photographing using the unmanned aerial vehicle can be improved.
- FIG. 1 is a block diagram of an electronic device in a networked environment for navigating an unmanned aerial vehicle including a camera, in accordance with various embodiments.
- FIG. 2 is a view showing the appearance of an unmanned aerial vehicle according to various embodiments.
- FIG. 3 is a diagram illustrating a situation in which an unmanned aerial vehicle is steered using an electronic device according to various embodiments.
- FIG. 4 is a block diagram of an unmanned aerial vehicle according to various embodiments.
- FIG. 5 is a flowchart illustrating a process of controlling an unmanned aerial vehicle under the control of an electronic device according to various embodiments.
- FIGS. 6A and 6B are flowcharts illustrating a method of controlling an unmanned aerial vehicle according to various embodiments.
- FIGS. 7A to 7C are views illustrating a process of performing a three-dimensional scanning function using the UAV according to various embodiments.
- FIGS. 8A and 8B are views showing the moving radius of the unmanned aerial vehicle around a center subject according to various embodiments.
- FIGS. 9A and 9B are diagrams showing predicted images according to various angles of view of a main camera mounted on the UAV according to various embodiments.
- FIGS. 10A and 10B are diagrams illustrating a process of changing an expected image corresponding to the movement of the UAV according to various embodiments.
- FIGS. 11A and 11B are diagrams illustrating operations performed in an electronic device and operations performed in an unmanned aerial vehicle, corresponding to the movement of the UAV according to various embodiments.
- FIG. 12 is a diagram illustrating a process of displaying an expected image corresponding to the position of the UAV according to various embodiments.
- FIG. 13 is a diagram showing a range in which the unmanned aerial vehicle can be moved within a set background range according to various embodiments.
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 for steering an unmanned aerial vehicle including a camera, in accordance with various embodiments.
- an electronic device 101 in a network environment 100 may communicate with an electronic device 102 via a first network 198 (e.g., a short-range wireless communication network), or with an electronic device 104 via a second network 199 (e.g., a long-range wireless communication network).
- according to one embodiment, the electronic device 101 is capable of communicating with the electronic device 104 through the server 108.
- the electronic device 101 includes a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, and an antenna module 197.
- at least one (e.g., display 160 or camera module 180) of these components may be omitted from the electronic device 101, or other components may be added.
- in some embodiments, some components, such as a sensor module 176 (e.g., a fingerprint sensor, an iris sensor, or an illuminance sensor), may be embedded and integrated in the display device 160.
- Processor 120 may drive software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and can perform various data processing and operations.
- Processor 120 may load commands or data received from other components (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process them, and store the resulting data in the nonvolatile memory 134.
- the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and a coprocessor 123 (e.g., a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of the main processor 121 and, additionally or alternatively, uses less power than the main processor 121 or is specialized for a designated function.
- the coprocessor 123 may be operated separately from or embedded in the main processor 121.
- the coprocessor 123 may control at least some of the functions or states associated with at least one component (e.g., the display device 160, the sensor module 176, or the communication module 190) of the electronic device 101, in place of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active state.
- the coprocessor 123 (e.g., an image signal processor or a communication processor) may be implemented as a component of another functionally related component (e.g., the camera module 180 or the communication module 190).
- Memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101, for example, software (e.g., the program 140) and input or output data for associated commands.
- the memory 130 may include a volatile memory 132 or a non-volatile memory 134.
- the program 140 may be software stored in the memory 130 and may include, for example, an operating system 142, middleware 144, or an application 146.
- the input device 150 is an apparatus for receiving commands or data to be used by a component (e.g., the processor 120) of the electronic device 101 from outside the electronic device 101 (e.g., from a user), and may include, for example, a microphone, a mouse, or a keyboard.
- the sound output device 155 is a device for outputting sound signals to the outside of the electronic device 101, and may include, for example, a speaker used for general purposes such as multimedia playback or recording playback, and a receiver used exclusively for receiving incoming calls. According to one embodiment, the receiver may be formed integrally with, or separately from, the speaker.
- Display device 160 may be an apparatus for visually providing information to a user of the electronic device 101 and may include, for example, a display, a hologram device, or a projector, and control circuitry for controlling the corresponding device. According to one embodiment, the display device 160 may include a touch sensor, or a pressure sensor capable of measuring the intensity of pressure on a touch.
- the audio module 170 can convert sound and electrical signals bidirectionally. According to one embodiment, the audio module 170 may acquire sound through the input device 150, or output sound through the sound output device 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected to the electronic device 101 by wire or wirelessly.
- the sensor module 176 may generate an electrical signal or data value corresponding to an internal operating state (e.g., power or temperature) of the electronic device 101, or an external environmental condition.
- the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, or an illuminance sensor.
- the interface 177 may support a designated protocol that may be wired or wirelessly connected to an external electronic device (e.g., the electronic device 102).
- the interface 177 may include a high-definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
- the connection terminal 178 may include a connector that can physically connect the electronic device 101 and an external electronic device (e.g., the electronic device 102), such as an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
- the haptic module 179 may convert electrical signals into mechanical stimuli (e.g., vibrations or movements) or electrical stimuli that the user may perceive through tactile or kinesthetic sensations.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 180 can capture a still image and a moving image.
- the camera module 180 may include one or more lenses, an image sensor, an image signal processor, or a flash.
- the power management module 188 is a module for managing the power supplied to the electronic device 101, and may be configured as at least a part of, for example, a power management integrated circuit (PMIC).
- the battery 189 is an apparatus for supplying power to at least one component of the electronic device 101 and may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
- the communication module 190 can support establishment of a wired or wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel. The communication module 190 may include one or more communication processors that operate independently of the processor 120 (e.g., an application processor) and support wired or wireless communication.
- the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module), and may use the corresponding module to communicate with an external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth or WiFi direct) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or WAN)).
- the wireless communication module 192 may use the user information stored in the subscriber identification module 196 to identify and authenticate the electronic device 101 within the communication network.
- the antenna module 197 may include one or more antennas for externally transmitting or receiving signals or power.
- the communication module 190 (e.g., the wireless communication module 192) may transmit signals to, or receive signals from, an external electronic device via an antenna suitable for the communication method.
- some of the components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input/output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and exchange signals (e.g., commands or data) with each other.
- the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 via the server 108 connected to the second network 199.
- Each of the electronic devices 102 and 104 may be the same or a different kind of device as the electronic device 101.
- all or a portion of the operations performed in the electronic device 101 may be performed in another or a plurality of external electronic devices.
- when the electronic device 101 has to perform a function or service automatically or upon request, the electronic device 101 may, instead of or in addition to executing the function or service itself, request an external electronic device to perform at least some functions associated with it.
- the external electronic device receiving the request can execute the requested function or additional function and transmit the result to the electronic device 101.
- the electronic device 101 can directly or additionally process the received result to provide the requested function or service.
- cloud computing, distributed computing, or client-server computing technology may be used.
- the processor 120 of the electronic device 101 may be wirelessly connected to an external electronic device 102 (e.g., an unmanned aerial vehicle) via the communication module 190 and control the operation of the unmanned aerial vehicle.
- the processor 120 may transmit data for controlling the unmanned aerial vehicle (e.g., its altitude, moving speed, and moving direction, or control of the camera mounted on it) to the unmanned aerial vehicle.
- the processor 120 can receive a photographed image using a camera mounted on the unmanned aerial vehicle and output the received image on the display device 160.
- the processor 120 may implement a three-dimensional modeling image based on a sub-camera mounted on the unmanned aerial vehicle.
- the processor 120 may generate an expected image corresponding to the three-dimensional modeling image, based on coordinate information of the unmanned aerial vehicle and photographing information (e.g., photographing angle, photographing direction, or zoom in/out).
- the processor 120 may display the generated predicted image on the display device 160 and calculate the movement information of the unmanned aerial vehicle for capturing an image corresponding to the generated predicted image.
- the processor 120 may move the unmanned aerial vehicle based on the calculated movement information.
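- A minimal sketch of how such movement information might be derived from an expected-image point; the dictionary keys, units, and the default speed are assumptions, not part of the disclosure.

```python
import math

def movement_info(current, target, subject, speed_mps=2.0):
    """Movement data the electronic device could send to the vehicle so
    that it reaches the point behind an expected image. `current`,
    `target`, and `subject` are (x, y, z) tuples in metres (assumed)."""
    dx, dy = subject[0] - target[0], subject[1] - target[1]
    yaw_deg = math.degrees(math.atan2(dy, dx))  # aim the camera at the subject
    distance = math.dist(current, target)       # straight-line path length
    return {"destination": target,
            "speed_mps": speed_mps,
            "eta_s": distance / speed_mps,
            "camera_yaw_deg": yaw_deg}

# Example: vehicle at the origin, preview point 5 m away, subject to the east.
print(movement_info((0, 0, 10), (3, 4, 10), (10, 4, 1.7)))
```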
- An electronic device according to various embodiments includes a communication module, a display device, a memory, and a processor electrically connected to the communication module, the display device, and the memory. The processor may receive, through the communication module, three-dimensional modeling information obtained using at least one camera mounted on an unmanned aerial vehicle, display a virtual space implemented based on the received three-dimensional modeling information using the display device, detect a user input for controlling the unmanned aerial vehicle in the virtual space, and display an expected image of a point corresponding to the detected user input using the display device.
- the processor of the electronic device may transmit movement information corresponding to the expected image to the unmanned flight device in response to a command to photograph the expected image, and the movement of the unmanned flight device can be controlled based on the movement information.
- the virtual space includes at least one of a virtual space image or an expected image; the virtual space image may include a subject icon corresponding to at least one subject and an unmanned flight device icon corresponding to the unmanned flight device, and the expected image may include an image to be photographed using the at least one camera at a point where the unmanned flight device icon is located.
- the virtual space image may indicate the movable range of the UAV while the distance between the at least one subject and the UAV is kept constant.
- the processor may sense a dragging input to the unmanned flight device icon contained in the virtual space and, in response to the dragging input, acquire position information corresponding to the unmanned flight device at predetermined time intervals.
- the predicted image may be an image expected when taken with the at least one camera mounted on the UAV.
- the 3D modeling information may be obtained based on the at least one camera, and may include position information corresponding to at least one subject and position information corresponding to the unmanned flight device.
- the processor may set a center subject among the at least one subject, and control the unmanned aerial device such that the center subject is displayed at the center of the expected image.
- the at least one camera may include a main camera for capturing the expected image and a sub-camera for acquiring the three-dimensional modeling information to implement the virtual space.
- the UAV 201 can take off, land, and fly under the control of a remote control device (not shown) (e.g., the electronic device 101 of FIG. 1), or can perform automatic flight independently of the remote control device.
- the unmanned flight control device 201 may maintain the hovering state.
- the unmanned aerial vehicle 201 in the hovering state can return to the current position by using the built-in sensor even if it moves from the current position to another position due to an external factor such as wind.
- the unmanned flight control device 201 can be wirelessly connected to the remote control device and can move in correspondence with the movement direction, the moving speed and the movement coordinates set by the remote control device.
- the UAV 201 may include a housing including a frame, at least one motor, and a propeller corresponding to the number of the motors.
- the UAV 201 may be equipped with a camera 210 for capturing an image, and at least one sub-camera 220 may be mounted for performing three-dimensional modeling (e.g., 3D modeling).
- the camera 210 may be located at the front or the bottom of the UAV 201.
- 3D modeling is a field of computer graphics, and it can be a process of creating a mathematical model that can be reproduced in a virtual three-dimensional space.
- the UAV 201 may use the sub-camera 220 to perform a three-dimensional modeling function corresponding to nearby objects, and transmit an image implemented by the three-dimensional modeling function to the remote control device (e.g., the electronic device 101 of FIG. 1).
- the remote control device (electronic device 101 of FIG. 1) may output an image implemented by the three-dimensional modeling function to the display (display device 160 of FIG. 1).
- the unmanned flight device 201 can adjust the setting angles of the camera 210 and the sub-camera 220 using a gimbal (not shown), separately from movements of the unmanned aerial vehicle such as flight and translation.
- FIG. 3 is a diagram illustrating a situation in which an unmanned aerial vehicle is steered using an electronic device according to various embodiments.
- the unmanned aerial vehicle 201 is connected wirelessly with the electronic device 301 (e.g., a remote control device) and can be moved under the control of the electronic device 301.
- the electronic device 301 may be controlled by a user 310 (e.g., a first subject), and the user 310 may control the electronic device 301 to move the unmanned flight device 201 as desired.
- since the unmanned aerial vehicle 201 moves in three-dimensional space, up/down movement (e.g., throttle), back-and-forth tilt (e.g., pitch), left/right rotation (e.g., yaw), and side-to-side tilt (e.g., roll) must be considered together. A minimal command structure for these axes is sketched below.
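- The four axes named above could be represented by a command structure such as the following hypothetical sketch; the value range is an assumption.

```python
from dataclasses import dataclass

@dataclass
class ControlInput:
    """One control-stick sample; each axis is assumed to be normalized
    to -1.0 .. 1.0, with 0.0 meaning neutral."""
    throttle: float = 0.0  # up/down movement
    pitch: float = 0.0     # tilt back and forth
    yaw: float = 0.0       # left/right rotation
    roll: float = 0.0      # tilt side to side

# Example: climb gently while rotating to the left.
cmd = ControlInput(throttle=0.3, yaw=-0.2)
print(cmd)
```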
- the user 310 needs to manipulate the UAV 201 very precisely when the user 310 wishes to capture a desired composition and a desired subject. In general, maneuvering of the unmanned aerial vehicle 201 is difficult and requires skill.
- the user 310 may control the electronic device 301 to capture an image through the camera 210 mounted on the unmanned aerial vehicle 201.
- the unmanned flight device 201 can photograph a first subject (e.g., a user 310) and a second subject (e.g., a tree 320) through the camera 210 mounted thereon.
- the unmanned aerial vehicle 201 can transmit an image photographed using the camera 210 to the electronic device 301, and the electronic device 301 can display the transmitted image on its display device (e.g., the display device 160 of FIG. 1).
- FIG. 4 is a block diagram of the UAV 201, in accordance with various embodiments.
- the UAV 201 may include an application platform 410 and a flight platform 420.
- the unmanned aerial vehicle 201 may be controlled under the control of the application platform 410, or may be wirelessly connected to an electronic device (e.g., the electronic device 101 of FIG. 1).
- the unmanned flight device 201 can control the flight of the unmanned aerial flight device 201 under the control of the flight platform 420.
- the UAV 201 may include at least one camera (e.g., a main camera 210 and a sub-camera 220), a communication module 430, a memory 440, and a sensor 450.
- the components described above may be controlled by the application platform 410.
- the main camera 210 is mounted on the unmanned aerial flight device 201, and can photograph a surrounding image.
- At least one sub camera 220 is mounted on the UAV 201 and can calculate positional information (e.g., coordinate information) corresponding to a nearby subject.
- the unmanned aerial vehicle 201 can be wirelessly connected to the electronic device 101 via the communication module 430 and can exchange data with the electronic device 101.
- the memory 440 may store data related to three-dimensional modeling.
- the sensor 450 may include an acceleration sensor that measures the acceleration of the UAV 201, a magnetometer that detects the direction of magnetic north, a gyro sensor that detects the rotational angular velocity of the UAV 201, a barometer that detects the altitude of the UAV 201, and an optical flow sensor that maintains the hovering state.
- the UAV 201 uses the optical flow sensor to limit movement caused by external influences during flight (e.g., wind, snow, or rain) or by errors in internal control (e.g., motor control).
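- As an illustration of how an optical flow sensor could support hovering, a toy proportional correction is sketched below; the gain and units are assumptions, not from the disclosure.

```python
def hover_correction(drift_xy, gain=0.8):
    """The optical flow sensor reports how far the vehicle has drifted
    from its hover point (metres); the controller commands the opposite
    displacement, scaled by a gain."""
    dx, dy = drift_xy
    return (-gain * dx, -gain * dy)

# Wind pushed the vehicle 0.5 m east and 0.2 m north; command a move back.
print(hover_correction((0.5, 0.2)))  # approximately (-0.4, -0.16)
```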
- the application platform 410 may use the main camera 210 to capture an image and process the captured image.
- the application platform 410 may perform a 3D modeling function using at least one camera (e.g., main camera 210, sub camera 220).
- the sub camera 220 can acquire positional information (e.g., coordinate information) corresponding to nearby objects and implement a three-dimensional modeling image based on the obtained positional information.
- the application platform 410 can calculate the movement route of the UAV 201 based on the positional information on the surrounding objects.
- the unmanned flight device 201 may include an electronic speed controller (ESC) 460 and at least one motor 470 for flight.
- the ESC 460 and the at least one motor 470 may be controlled by the flight platform 420.
- although one ESC 460 is shown, the configuration is not limited thereto; a plurality of ESCs may be provided, corresponding to the number of motors.
- the ESC 460 can control the rotational speed of the motor 470 (e.g., a transmission) and can control the acceleration, deceleration, and reverse rotation of the motor 470.
- the ESC 460 may control at least one or more motors.
- At least one motor 470 may be driven (e.g., rotated, stopped, accelerated, decelerated, etc.) by the ESC 460.
- the at least one motor 470 may include a brushless DC (BLDC) motor.
- flight platform 420 may control at least one motor 470 via ESC 460 and control the flight of unmanned aerial device 201.
- the flight platform 420 can move the UAV 201 based on movement information about the UAV 201 received from the electronic device 101.
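- A minimal sketch of the ESC-to-motor relationship described above; the class shape, the RPM limit, and the linear throttle mapping are illustrative assumptions.

```python
class ESC:
    """Hypothetical electronic speed controller for one BLDC motor."""
    MAX_RPM = 12000  # illustrative limit, not from the disclosure

    def set_throttle(self, fraction: float) -> int:
        # Clamp the command to [0, 1] and map it linearly to a speed.
        fraction = max(0.0, min(1.0, fraction))
        return int(fraction * self.MAX_RPM)

# A quadcopter-style flight platform would hold one ESC per motor.
escs = [ESC() for _ in range(4)]
print([esc.set_throttle(0.55) for esc in escs])  # [6600, 6600, 6600, 6600]
```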
- An unmanned flight device according to various embodiments includes at least one camera, a communication module, and a processor electrically connected to the at least one camera and the communication module. The processor may obtain three-dimensional modeling information using the at least one camera, and transmit the obtained three-dimensional modeling information to an electronic device connected through the communication module.
- a processor of the UAV may receive movement information based at least on the 3D modeling information from the electronic device, and determine a position movement based on the received movement information.
- the three-dimensional modeling information may include data obtained by digitizing at least one of the number of subjects, the type of each subject, the size of each subject, angle information between each subject and the unmanned flight device, the altitude of the unmanned flight device, and information on the at least one camera.
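- The digitized fields listed above could be carried in a structure like the following hypothetical sketch; the field names and units are assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ThreeDModelingInfo:
    """Assumed digitized form of the modeling information."""
    subject_count: int
    subject_types: List[str]          # e.g., ["person", "tree"]
    subject_sizes_m: List[float]      # height per subject, metres
    subject_angles_deg: List[float]   # angle between each subject and vehicle
    vehicle_altitude_m: float
    camera_lens_direction_deg: float
    camera_angle_of_view_deg: float

info = ThreeDModelingInfo(2, ["person", "tree"], [1.7, 3.0],
                          [0.0, 35.0], 10.0, 180.0, 78.0)
print(info.subject_count, "subjects digitized")
```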
- FIG. 5 is a flowchart illustrating a process of controlling an unmanned aerial vehicle under the control of an electronic device according to various embodiments.
- the electronic device 101 and the UAV 201 may be connected wirelessly, and the electronic device 101 may control the flight and functions of the UAV 201.
- the electronic device 101 can move the position of the UAV 201 or control the driving of the camera mounted on the UAV 201.
- the UAV 201 may be in a state of performing a hovering function.
- the UAV 201 can maintain a hovering state by a flight platform (e.g., flight platform 420 of FIG. 4).
- the unmanned flight device 201 in the hovering state can return to the current position (e.g., the set specific position) using its built-in sensor, even if it moves to another position due to an external factor such as wind.
- the UAV 201 can perform a spatial scan for three-dimensional modeling based on a nearby object.
- the unmanned aerial vehicle 201 can perform three-dimensional modeling of objects placed in the vicinity using at least one camera (e.g., the main camera 210 of FIG. 4, the sub-camera 220 of FIG. 4).
- the three-dimensional modeling method realizes the photographing composition captured by the main camera 210 in a virtual space, and its precision can be adjusted according to the user's settings. In general, a scanning process of about one second may be required to roughly grasp the shape of a subject.
- the sub camera can calculate position information (e.g., coordinate information) corresponding to each of the subjects through the space scan.
- the UAV 201 can receive location information from at least one satellite via a communication module (e.g., communication module 430 in FIG. 4).
- the unmanned flight device 201 can receive information on its current position, current time, altitude, or moving speed, and calculate coordinate information corresponding to its current position.
- the UAV 201 may transmit coordinate information corresponding to at least one subject and coordinate information corresponding to the UAV 201 to the electronic device 101 through the communication module 430.
- at operation 507, the processor (e.g., the processor 120 of FIG. 1) of the electronic device 101 may perform three-dimensional modeling based on the coordinate information for the subject and the coordinate information for the UAV 201.
- the processor 120 may determine icons corresponding to at least one subject (e.g., a user, a surrounding structure, or a building), the terrain, and the unmanned aerial vehicle 201, and arrange them in the virtual space.
- the processor 120 may implement a subject placed in a virtual space based on coordinate information on which an actual subject is located.
- the processor 120 may display the virtual space implemented through the three-dimensional modeling on a display device (e.g., the display device 160 of FIG. 1).
- the processor 120 may set a center subject among at least one subject included in the virtual space. For example, when a first subject is set as the center subject, the unmanned flight device 201 can determine the photographing composition around the set center subject. When the position of the center subject moves, the unmanned flight device 201 can also fly in response to that movement.
- the processor 120 may receive a move command for the icon corresponding to the unmanned aerial vehicle 201 included in the virtual space, and may calculate an expected image according to the move command. For example, while the virtual space is displayed on the display device 160 of the electronic device 101, the processor 120 can sense a move command for the icon corresponding to the unmanned flight device 201; the move command may be, for example, a dragging input to the icon. According to various embodiments, the processor 120 may calculate an expected image according to the move command based on the coordinate information of the icons contained in the virtual space, and display the calculated expected image on the display device 160. The expected image may be an image based on two-dimensional modeling, and may be the image to be photographed by the main camera 210 after the movement; a sketch of this computation follows.
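- A sketch of how a drag point in the virtual space could be turned into the camera pose behind an expected image; the geometry and the default field of view are assumptions, not from the disclosure.

```python
import math

def expected_view(icon_pos, subject_pos, fov_deg=78.0):
    """From the dragged icon's position, derive the parameters a renderer
    would need to draw the expected image (a camera aimed at the subject).
    Positions are (x, y, z) tuples in metres (assumed)."""
    dx = subject_pos[0] - icon_pos[0]
    dy = subject_pos[1] - icon_pos[1]
    dz = subject_pos[2] - icon_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return {"position": icon_pos, "yaw_deg": yaw,
            "pitch_deg": pitch, "fov_deg": fov_deg}

# Example: icon dragged 5 m from a subject standing at the origin.
print(expected_view((3.0, 4.0, 2.0), (0.0, 0.0, 1.7)))
```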
- the processor 120 may transmit the coordinate information corresponding to the movement of the unmanned aerial device 201 to the unmanned aerial vehicle 201.
- the processor 120 can transmit the coordinate information of the position to which the unmanned flight device 201 will move, and information about the moving speed and moving path of the unmanned aerial vehicle 201, to the unmanned aerial vehicle 201.
- the processor 120 may transmit not only the information about the movement route of the UAV 201 but also a command for capturing an image using the mounted camera after moving to the corresponding position.
- the UAV 201 may move based on the received information about the movement path. For example, the unmanned aerial vehicle 201 may move to a position corresponding to the expected image in order to photograph an image corresponding to the expected image displayed on the electronic device 101, and can photograph an image using the mounted main camera 210 after the position movement.
- FIGS. 6A and 6B are flowcharts illustrating a method of controlling an unmanned aerial vehicle according to various embodiments.
- FIG. 6A is a flowchart illustrating a process in which an electronic device (e.g., the electronic device 101 of FIG. 1) controls an unmanned aerial vehicle (e.g., the unmanned aerial vehicle 201 of FIG. 2), and FIG. 6B is a flowchart illustrating a process in which the unmanned aerial vehicle 201 moves or captures an image under the control of the electronic device 101.
- the UAV 201 can perform a hovering function under the control of the electronic device 101, or in response to a hovering command with an external input button.
- the processor 120 may receive three-dimensional modeling information from the unmanned aerial vehicle 201.
- the three-dimensional modeling information may include coordinate information corresponding to at least one subject and coordinate information corresponding to the UAV 201.
- the unmanned aerial vehicle 201 can calculate coordinate information corresponding to a nearby subject based on at least one camera (e.g., the main camera 210 of FIG. 2, the sub-camera 220 of FIG. 2).
- the unmanned flight device 201 can receive coordinate information of the unmanned aerial vehicle 201 from an external satellite.
- the unmanned flight device 201 can transmit the coordinate information on the subject and the coordinate information on the UAV 201 to the electronic device 101.
- the processor 120 may implement a three-dimensional modeling-based virtual image based on the three-dimensional modeling information received from the unmanned aerial vehicle 201.
- the processor 120 may display the virtual space based on the 3D modeling implemented on the display device (e.g., the display device 160 of FIG. 1).
- the virtual space on the basis of the three-dimensional modeling may be an image that virtually implements the image photographed by the main camera 210 on the unmanned aerial vehicle 201.
- a subject and an object photographed by the main camera 210 may be displayed as specific icons and virtual items.
- the processor 120 may detect a user input for moving the unmanned aerial vehicle in a virtual space (virtual image) based on the three-dimensional modeling.
- the user input may include an input for dragging an unmanned aerial vehicle icon contained in the virtual space.
- the processor 120 may calculate an expected image of a point corresponding to the user input, assuming that the UAV 201 is moved by the detected user input; that is, the expected image that would be displayed on the display device 160 if the UAV 201 captured an image using the at least one camera at that point.
- the processor 120 may transmit movement information (e.g., coordinate information) of the UAV 201 corresponding to the calculated expected image to the UAV 201.
- the processor 120 may control the main camera 210 mounted on the UAV 201 to capture an image when the UAV 201 completes the movement to the position corresponding to the movement information.
- the UAV 201 may be in a state of performing a hovering function.
- the unmanned aerial vehicle 201 can be connected to the electronic device 101 wirelessly and the movement can be controlled under the control of the electronic device 101.
- the UAV 201 may acquire 3D modeling information using at least one camera (e.g., main camera 210, sub camera 220) mounted.
- the three-dimensional modeling information may include coordinate information for at least one object located in the vicinity.
- the three-dimensional modeling information may include coordinate information for the unmanned aerial vehicle 201 received from an external satellite.
- the UAV 201 may transmit the obtained 3D modeling information to the wirelessly connected electronic device 101.
- the UAV 201 can implement a 3D space modeling based virtual space on the basis of 3D modeling information, and transmit the virtual space to the electronic device 101.
- the UAV 201 may receive movement information from the electronic device 101.
- the movement information may include coordinate information, altitude information, information on the speed of movement, and the like.
- the UAV 201 may move in response to the received movement information.
- when the unmanned flight device 201 according to an embodiment has completed the movement to the position corresponding to the movement information, it can photograph an image using the mounted main camera 210.
- the electronic device 101 can take an image based on a composition that is substantially the same as the calculated expected image. According to various embodiments, the user can view the calculated expected image and take an image based on the desired composition.
- FIGS. 7A to 7C are views illustrating a process of performing a three-dimensional scanning function using the UAV according to various embodiments.
- FIG. 7A shows a process in which the UAV 201, which is in a hovering state, performs a three-dimensional scanning function in response to a nearby object.
- the unmanned flight control device 201 can take off to a set position.
- the set position can be set by the developer or set by the user.
- the unmanned aerial vehicle 201 in the hovering state can perform a three-dimensional scanning function using the main camera 210, the sub-camera 220, and sensors (e.g., the sensor 450 of FIG. 4). For example, when a user corresponding to the first subject 310 and a tree corresponding to the second subject 320 are located in the vicinity of the unmanned flight device 201, it can perform the three-dimensional scanning function corresponding to the first subject 310 and the second subject 320.
- the unmanned flight device 201 can digitize the number of subjects, the type of each subject, the size of each subject, the angle information between each subject and the unmanned flight device 201, the altitude of the unmanned flight device 201, and camera information (e.g., camera lens direction, angle of view), and can transmit the digitized information to the electronic device 101.
- the positional information (e.g., coordinate information) corresponding to each of the subjects (e.g., the first subject 310 and the second subject 320) can be digitized.
- the unmanned flight device 201 can digitize the altitude h1 of the UAV 201, the height h2 of the first subject 310, and the height h3 of the second subject 320, and can also digitize the distance D2 between the unmanned flight device 201 and the first subject 310 and the distance D3 between the unmanned flight device 201 and the second subject 320.
- the unmanned flight device 201 can digitize the first angle θ1 between the first subject 310 and the second subject 320 around the unmanned aerial vehicle 201.
- when the straight-line distance from the UAV 201 to the first subject 310 is set as a first side and the straight-line distance from the UAV 201 to the second subject 320 is set as a second side, the angle at which the first side and the second side intersect may be the first angle θ1; when the vertical distance from the UAV 201 to the ground is set as a third side, the angle formed by the first side and the third side may be determined as the second angle θ2, and the angle formed by the second side and the third side may be determined as the third angle θ3.
- the unmanned flight device 201 can digitize the first angle θ1, the second angle θ2, and the third angle θ3 described above.
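- The three angles can be computed from the coordinate information with a standard dot-product formula; the coordinates below are illustrative, not from the disclosure.

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3D vectors, via the dot product."""
    dot = sum(a * b for a, b in zip(u, v))
    return math.degrees(math.acos(dot / (math.hypot(*u) * math.hypot(*v))))

# Illustrative coordinates in metres: UAV at altitude h1 = 10.
uav = (0.0, 0.0, 10.0)
s1 = (4.0, 0.0, 1.7)      # first subject
s2 = (7.0, 5.0, 3.0)      # second subject
ground = (0.0, 0.0, 0.0)  # foot of the vertical third side

side1 = tuple(a - b for a, b in zip(s1, uav))      # first side
side2 = tuple(a - b for a, b in zip(s2, uav))      # second side
side3 = tuple(a - b for a, b in zip(ground, uav))  # third side

print("theta1 =", angle_between(side1, side2))  # between the two subjects
print("theta2 =", angle_between(side1, side3))  # first side vs vertical
print("theta3 =", angle_between(side2, side3))  # second side vs vertical
```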
- the unmanned flight device 201 can acquire the digitized data when performing the three-dimensional scanning function.
- the numerical data is information for performing the three-dimensional modeling function, and is hereinafter referred to as "three-dimensional modeling information".
- the unmanned flight device 201 can transmit the obtained three-dimensional modeling information to the electronic device 101.
- FIG. 7B shows a virtual space 700 that the electronic device 101 implements based on the received three-dimensional modeling information.
- the electronic device 101 performs the three-dimensional modeling function, thereby three-dimensionally modeling the surrounding space of the hovering unmanned aerial vehicle 201 and realizing a virtual space 700 similar to reality.
- the electronic device 101 can display the virtual space 700 as a three-dimensional image.
- the electronic device 101 may display, in the virtual space 700, an unmanned aerial vehicle icon 701 corresponding to the UAV 201, a first subject icon 710 corresponding to the first subject 310, and a second subject icon 720 corresponding to the second subject 320.
- the electronic device 101 can determine the position where the respective icons are arranged based on the three-dimensional modeling information in the virtual space 700.
- the electronic device 101 may display a virtual space 700 implemented similarly to the real world.
- FIG. 7C shows a user interface 750 displayed on the display device 160 (e.g., a display) of the electronic device 101.
- the electronic device 101 may divide the user interface 750 into a plurality of zones.
- the user interface 750 may include the virtual space 700 implemented by the electronic device 101, an expected image 730 showing what would be photographed by the main camera 210 mounted on the unmanned aerial vehicle 201, and a photographing button 740 for performing a photographing function.
- the electronic device 101 may display the virtual space 700 implemented based on the three-dimensional modeling function and the expected image 730 corresponding to a shot of the main camera 210, and the main camera 210 may capture an image in response to an input to the photographing button 740.
- FIGS. 8A and 8B are views showing the moving radius of the unmanned aerial vehicle around a center subject according to various embodiments.
- the electronic device 101 may set a subject located in the vicinity as the center subject 810. The main camera 210 mounted on the UAV 201 can photograph the center subject 810, and the electronic device 101 can control the flight of the unmanned aerial vehicle 201 so that photographing is performed while the size of the center subject 810 is maintained. The unmanned flight device 201 can photograph the center subject 810 while keeping its distance from the center subject 810 constant. The electronic device 101 can calculate the movement radius 801 over which the unmanned flight device 201 can move while keeping that distance constant; the movement radius can be expressed in the form of a hemisphere. According to various embodiments, the electronic device 101 can calculate the movement radius 801 over which the unmanned flight device 201 can travel, with the center subject 810 as the center.
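- A sketch of sampling the hemispherical movement radius around the center subject at a constant distance; the sampling density is an illustrative assumption.

```python
import math

def positions_on_hemisphere(center, radius, steps=8):
    """Points at a constant distance `radius` from the center subject, on
    the upper hemisphere, so the subject keeps the same apparent size.
    `center` is (x, y, z) in metres (assumed)."""
    cx, cy, cz = center
    points = []
    for i in range(steps):              # azimuth around the subject
        az = 2 * math.pi * i / steps
        for j in range(1, steps // 2):  # elevation above the ground
            el = (math.pi / 2) * j / (steps // 2)
            points.append((cx + radius * math.cos(el) * math.cos(az),
                           cy + radius * math.cos(el) * math.sin(az),
                           cz + radius * math.sin(el)))
    return points

print(len(positions_on_hemisphere((0.0, 0.0, 1.7), 5.0)), "candidate positions")
```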
- FIG. 8B shows a range in which the unmanned flight device 201 can move when the second object 320 is set as the center subject 820.
- the electronic device 101 can control the flight of the unmanned flight device 201 so as to maintain the size of the center subject 820.
- the unmanned flight control device 201 can photograph the center subject 820 while keeping the distance from the center subject 820 constant.
- the electronic device 101 can calculate and display the moving radius 802 that the unmanned flight device 201 can move while keeping the distance from the center subject 820 constant.
- the electronic device 101 can set the center subject 820, and can control the flight of the unmanned flight device 201 so that the unmanned flight device 201 maintains a constant distance from the center subject 820.
- FIGS. 9A and 9B are diagrams showing predicted images according to various angles of view of a main camera mounted on the UAV according to various embodiments.
- the electronic device 101 sets the first subject 310 as the center subject, and the main camera 210 mounted on the unmanned aerial vehicle 201 can photograph an image centered on the center subject.
- the main camera 210 may have a fixed photographable angle of view 910.
- the unmanned flight control device 201 can fly while keeping the distance from the central object constant so as to shoot while maintaining the same size of the center object.
- the unmanned flight device 201 can determine the moving radius 920 and can determine the shooting composition within the moving radius 920.
- the main camera 210 may photograph an image based on a photographing scheme in which the first subject 310 is located at the center.
- the main camera 210 may include a part of the second subject 320 located on the right side of the first subject 310 in the photographic composition.
- FIG. 9B shows an expected image photographed using the main camera 210 mounted on the unmanned aerial vehicle 201 shown in FIG. 9A.
- the electronic device 101 can control the unmanned aerial vehicle 201 and the main camera 210 so that the first subject 310 is positioned at the center and at least a portion of the second subject 320 is positioned on the right side of the first subject 310.
- FIGS. 10A and 10B are diagrams illustrating a process of changing an expected image corresponding to the movement of the UAV according to various embodiments.
- FIGS. 10A and 10B show images displayed on the display device 160 of the electronic device 101.
- the electronic device 101 may be wirelessly connected to the unmanned aerial vehicle that is hovering to control the unmanned aerial vehicle.
- the electronic device 101 can control a camera mounted on the unmanned aerial vehicle.
- <1001> shows the unmanned flight device icon 701 corresponding to the unmanned aerial vehicle, the first subject icon 710 corresponding to the first subject, and the second subject icon 720 corresponding to the second subject, displayed on the display device 160 of the electronic device 101.
- Each icon can be modeled in three dimensions and placed in a virtual space.
- the electronic device 101 can set a center subject and highlight the set center subject.
- the electronic device 101 may sense a dragging input to move the unmanned aerial vehicle icon 701.
- <1002> shows the layout structure when the virtual space is viewed from above.
- in the photographing composition, the first subject icon 710 set as the center subject may be displayed at the center, and an expected image may be displayed such that the second subject icon 720 is positioned behind it.
- the electronic device 101 senses a user input (e.g., a dragging input) to the unmanned aerial vehicle icon 701 and can move the unmanned aerial vehicle icon 701 in response to the user input. As the unmanned aerial vehicle icon 701 moves, the expected image can be changed.
- <1011> illustrates a process of detecting the user input in the virtual space and moving the unmanned aerial vehicle icon 701 in response to the detected user input.
- <1012> shows the layout structure when the virtual space is viewed from above, with the unmanned flight device icon 701 moving left and right in response to the user input.
- <1013> shows a process in which the photographing composition is changed as the unmanned aerial vehicle icon 701 moves.
- the electronic device 101 can three-dimensionally model and display the movement of the unmanned aerial vehicle virtually before moving the unmanned aerial vehicle.
- FIGS. 11A and 11B are diagrams illustrating operations performed in an electronic device and operations performed in an unmanned aerial vehicle, corresponding to the movement of the UAV according to various embodiments.
- FIG. 11A shows a process performed in the electronic device 101, and FIG. 11B shows a process performed in the unmanned aerial vehicle 201.
- referring to FIG. 11B, the UAV 201 can check its current position and acquire three-dimensional modeling information 1151 at the current position.
- the unmanned aerial vehicle 201 can receive at least one piece of three-dimensional modeling information 1103 from the electronic device 101.
- the unmanned flight device 201 can move based on the three-dimensional modeling information 1151 at the current location and the three-dimensional modeling information 1103, received from the electronic device 101, according to the movement of the unmanned aerial vehicle 201.
- FIG. 12 is a diagram illustrating a process of displaying an expected image corresponding to the position of the UAV according to various embodiments.
- the UAV 201 may be positioned at a first position 1201 or a second position 1202, and an expected image may be determined corresponding to the position of the UAV 201.
- when the UAV 201 is located at the first position 1201, a first photographing composition 1210 can be determined, and a first expected image 1211 corresponding to the determined first photographing composition 1210 may be displayed.
- when the UAV 201 is located at the second position 1202, a second photographing composition 1220 can be determined, and a second expected image 1221 corresponding to the determined second photographing composition 1220 may be displayed.
- the electronic device 101 can acquire 3D modeling information according to the movement of the UAV 201 from the UAV 201 in real time.
- the electronic device 101 can calculate a photographing composition based on the three-dimensional modeling information and display an expected image showing how a picture taken with that composition would look.
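- One plausible way to compute such an expected image is to project each modeled subject through a pinhole camera placed at the planned position, as sketched below; the yaw-only camera model and the fixed intrinsics (focal length f, principal point cx, cy) are assumptions, since the document does not specify a rendering method.

```python
import numpy as np

def project(point_w, cam_pos, cam_yaw, f=800.0, cx=640.0, cy=360.0):
    """Project a world point into the expected image of a camera at cam_pos
    heading along cam_yaw (camera axes: x forward, y left, z up).
    Returns pixel (u, v), or None when the point is behind the camera."""
    c, s = np.cos(-cam_yaw), np.sin(-cam_yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # world -> camera
    p = R @ (np.asarray(point_w, float) - np.asarray(cam_pos, float))
    if p[0] <= 0:                         # behind the image plane: not drawn
        return None
    return (cx - f * p[1] / p[0], cy - f * p[2] / p[0])

# e.g. a subject 5 m straight ahead of the camera lands at the image center:
print(project((5.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.0))  # -> (640.0, 360.0)
```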
- FIG. 13 is a diagram showing a range in which the unmanned aerial vehicle can be moved within a set background range according to various embodiments.
- the background 1301 to be photographed may be set, and the moving range 1300 of the UAV 201 may be determined so that the background 1301 is at least partially displayed.
- the electronic device 101 may display a movement range 1300 within which the unmanned aerial vehicle 201 can move while keeping the background 1301 to be photographed in view. For example, when the unmanned aerial vehicle 201 is at the first position, a first photographing composition 1310 may be determined within the background 1301, and the electronic device 101 may display an expected image 1311 corresponding to the first photographing composition 1310. When the unmanned aerial vehicle 201 is at the second position, a second photographing composition 1320 may be determined within the background 1301, and the electronic device 101 may display an expected image 1321 corresponding to the second photographing composition 1320.
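- The movement range 1300 could, for example, be approximated by sampling candidate positions and keeping those from which the center of the background 1301 still falls inside the camera's horizontal field of view while the camera is aimed at the subject; the half-FOV value and the aim-at-subject behaviour below are illustrative assumptions, not taken from the document.

```python
import math

def keeps_background(pos, subject, bg_center,
                     half_fov=math.radians(35)) -> bool:
    """True when a camera at pos, aimed at the subject, still has the
    background center inside its horizontal field of view."""
    aim = math.atan2(subject[1] - pos[1], subject[0] - pos[0])
    to_bg = math.atan2(bg_center[1] - pos[1], bg_center[0] - pos[0])
    diff = (to_bg - aim + math.pi) % (2.0 * math.pi) - math.pi  # wrap to [-pi, pi)
    return abs(diff) <= half_fov

def movement_range(candidates, subject, bg_center):
    """Subset of sampled positions from which the background stays framed."""
    return [p for p in candidates if keeps_background(p, subject, bg_center)]
```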
- a method according to various embodiments may include wirelessly connecting an electronic device and an unmanned aerial vehicle, receiving from the unmanned aerial vehicle three-dimensional modeling information acquired using at least one camera mounted on the unmanned aerial vehicle, displaying a virtual space implemented based on the received three-dimensional modeling information, detecting a user input for controlling the unmanned aerial vehicle in the virtual space, and displaying an expected image of a point corresponding to the detected user input.
- the method according to various embodiments may further include transmitting motion information corresponding to the expected image to the unmanned aerial vehicle in response to a command to capture the expected image, and controlling movement of the unmanned aerial vehicle based on the motion information.
- the virtual space may include at least one of a virtual space image or an expected image; the virtual space image may include a subject icon corresponding to at least one subject and an unmanned aerial vehicle icon corresponding to the unmanned aerial vehicle, and the expected image may correspond to an image to be photographed using the at least one camera at the point where the unmanned aerial vehicle icon is located.
- the virtual space image may indicate the movable range of the UAV while the distance between the at least one subject and the UAV is kept constant.
- the method may further include detecting a dragging input on the unmanned aerial vehicle icon included in the virtual space, and acquiring location information of the unmanned aerial vehicle at predetermined time intervals in response to the dragging input.
- the three-dimensional modeling information may be obtained based on the at least one camera, and may include position information corresponding to at least one subject and position information corresponding to the unmanned aerial vehicle.
- the method according to various embodiments may further comprise setting a center subject among the at least one subject and controlling the unmanned aerial vehicle so that the center subject is displayed at the center of the expected image (see the sketch after this summary).
- the at least one camera may include a main camera for capturing the expected image and a sub-camera for acquiring the three-dimensional modeling information to implement the virtual space.
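- A simple way to realize the behaviour of keeping the center subject centred is a proportional controller on the subject's pixel error in the image from the main camera, as sketched below; the gains, sign conventions, and command fields are assumptions made for illustration.

```python
def centering_command(subject_uv, image_size=(1280, 720),
                      k_yaw=0.002, k_climb=0.002) -> dict:
    """Proportional commands that nudge the vehicle until the center
    subject sits at the middle of the main camera's image."""
    err_u = subject_uv[0] - image_size[0] / 2.0  # +: subject right of center
    err_v = subject_uv[1] - image_size[1] / 2.0  # +: subject below center
    return {"yaw_rate": k_yaw * err_u,           # assumed +yaw turns rightward
            "climb_rate": -k_climb * err_v}      # climb when subject sits high in frame

# e.g. a subject detected right of and above center -> yaw right and climb:
print(centering_command((900, 300)))
```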
- the electronic device can be various types of devices.
- the electronic device can include, for example, at least one of a portable communication device (e.g., a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
- when a first component is referred to as being "(functionally or communicatively) connected" or "coupled" to another (second) component, the first component may be connected to the second component directly or through another component (e.g., a third component).
- the term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- a module may be an integrally constructed component or a minimum unit or part thereof that performs one or more functions.
- the module may be configured as an application-specific integrated circuit (ASIC).
- various embodiments of the present document may be implemented as software (e.g., program 140) including instructions stored on a machine-readable storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., a computer).
- the machine may be a device capable of calling stored instructions from the storage medium and operating according to the called instructions, and may include an electronic device (e.g., electronic device 101) according to the disclosed embodiments.
- when the instructions are executed by a processor (e.g., processor 120), the processor may perform the function corresponding to the instructions, either directly or using other components under the control of the processor.
- the instructions may include code generated by a compiler or code executable by an interpreter.
- a device-readable storage medium may be provided in the form of a non-transitory storage medium.
- 'non-transitory' means that the storage medium does not include a signal and is tangible, but does not distinguish whether data is stored semi-permanently or temporarily on the storage medium.
- the method according to various embodiments disclosed herein may be provided in a computer program product.
- a computer program product can be traded between a seller and a buyer as a product.
- a computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)) or distributed online through an application store (e.g., PlayStore™).
- at least a portion of the computer program product may be temporarily stored, or temporarily created, on a storage medium such as a manufacturer's server, a server of an application store, or a memory of a relay server.
- Each of the components (e.g., modules or programs) may be comprised of a single entity or a plurality of entities; some of the subcomponents described above may be omitted, or other subcomponents may be further included in various embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Software Systems (AREA)
- Mechanical Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- User Interface Of Digital Computer (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
According to various embodiments, the present invention relates to an electronic device comprising a communication module, a display device, a memory, and a processor electrically connected to the communication module, the display device, and the memory, wherein the processor can: receive, via the communication module, three-dimensional modeling information acquired using at least one camera mounted on an unmanned aerial vehicle; use the display device to display a virtual space implemented on the basis of the received three-dimensional modeling information; detect a user input for controlling the unmanned aerial vehicle in the virtual space; and use the display device to display an expected image of a point corresponding to the detected user input. In addition, another embodiment is also possible.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020170148964A KR20190053018A (ko) | 2017-11-09 | 2017-11-09 | 카메라를 포함하는 무인 비행 장치를 조종하는 방법 및 전자장치 |
| KR10-2017-0148964 | 2017-11-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019093692A1 true WO2019093692A1 (fr) | 2019-05-16 |
Family
ID=66438016
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2018/012625 Ceased WO2019093692A1 (fr) | 2017-11-09 | 2018-10-24 | Procédé et dispositif électronique de commande de véhicule aérien sans pilote comprenant une caméra |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20190053018A (fr) |
| WO (1) | WO2019093692A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102472809B1 (ko) | 2022-05-16 | 2022-12-01 | 고려대학교 산학협력단 | 복수의 무인이동체를 이용한 감시 시스템 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101625634B1 (ko) * | 2016-02-23 | 2016-05-31 | 공간정보기술 주식회사 | 드론의 스테레오 카메라가 촬영한 이미지를 이용한 3차원 모델링 시스템 |
| KR101628750B1 (ko) * | 2015-07-29 | 2016-06-09 | 주식회사 에이베스트 | 3d 항공 촬영을 이용한 지형물의 안전 예측 방법 |
| KR20170067373A (ko) * | 2015-12-08 | 2017-06-16 | 가톨릭대학교 산학협력단 | 드론 촬영 이미지를 기반으로 3d 오브젝트를 자동으로 추출하는 시스템 및 방법 |
| KR20170081488A (ko) * | 2016-01-04 | 2017-07-12 | 삼성전자주식회사 | 무인 촬영 장치를 이용한 이미지 촬영 방법 및 전자 장치 |
| KR20170093364A (ko) * | 2016-02-05 | 2017-08-16 | 한화테크윈 주식회사 | 무인 비행 시스템 |
- 2017-11-09 KR KR1020170148964A patent/KR20190053018A/ko not_active Withdrawn
- 2018-10-24 WO PCT/KR2018/012625 patent/WO2019093692A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR20190053018A (ko) | 2019-05-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN107168352B (zh) | 目标追踪系统及方法 | |
| WO2018110848A1 (fr) | Procédé de fonctionnement de véhicule aérien sans pilote et dispositif electronique pour sa prise en charge | |
| WO2020185029A1 (fr) | Dispositif électronique et procédé d'affichage des informations de partage sur la base de la réalité augmentée | |
| WO2016065623A1 (fr) | Systèmes et procédés de surveillance doté de repère visuel | |
| JP6583840B1 (ja) | 検査システム | |
| WO2016015232A1 (fr) | Systèmes et procédés de stabilisation de charge utile | |
| WO2015167080A1 (fr) | Procédé et appareil de commande de véhicule aérien sans pilote | |
| WO2016106715A1 (fr) | Traitement sélectif de données de capteur | |
| US10721378B2 (en) | Image management system and unmanned flying body | |
| WO2022039404A1 (fr) | Appareil de caméra stéréo ayant un large champ de vision et procédé de traitement d'image de profondeur l'utilisant | |
| WO2021054782A1 (fr) | Véhicule aérien sans pilote et procédé permettant d'effectuer un vol de diagnostic avant l'exécution d'un vol | |
| WO2019198868A1 (fr) | Procédé de reconnaissance mutuelle entre un véhicule aérien sans pilote et un terminal sans fil | |
| WO2016076463A1 (fr) | Dispositif et procédé de commande de robot volant | |
| WO2019124728A1 (fr) | Appareil et procédé d'identification d'objet | |
| CN107065894A (zh) | 无人飞行器、飞行高度控制装置、方法以及程序 | |
| WO2022080869A1 (fr) | Procédé de mise à jour d'une carte tridimensionnelle au moyen d'une image et dispositif électronique prenant en charge ledit procédé | |
| KR20200020295A (ko) | 로봇과 인터랙션하는 증강현실 서비스 제공 장치 및 방법 | |
| JP2019211486A (ja) | 検査システム | |
| WO2021149938A1 (fr) | Dispositif électronique et procédé de commande de robot | |
| CN113677412B (zh) | 信息处理装置、信息处理方法和程序 | |
| CN110799801A (zh) | 基于无人机的测距方法、装置及无人机 | |
| WO2019054790A1 (fr) | Procédé de traitement de contenu, et dispositif électronique pour sa prise en charge | |
| WO2020145653A1 (fr) | Dispositif électronique et procédé pour recommander un emplacement de capture d'images | |
| WO2020171315A1 (fr) | Système d'atterrissage de véhicule aérien sans pilote | |
| WO2022173164A1 (fr) | Procédé et dispositif électronique d'affichage d'un objet de réalité augmentée |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18876962; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18876962; Country of ref document: EP; Kind code of ref document: A1 |