WO2022055117A1 - Electronic device and control method therefor - Google Patents
Electronic device and control method therefor
- Publication number
- WO2022055117A1 (PCT/KR2021/009249)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- space
- information
- image
- controlling
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/006—Controls for manipulators by means of a wireless system for controlling one or several manipulators
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/06—Control stands, e.g. consoles, switchboards
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
Definitions
- the present disclosure relates to an electronic device and a control method thereof, and more particularly, to an electronic device for controlling a robot and a control method thereof.
- users can perform desired tasks by using robots at construction sites or work sites that are dangerous for humans to enter.
- the present disclosure is intended to solve the above problems, and provides an electronic device and a control method thereof that provide a UI for controlling a plurality of robots so that a user can receive services without interruption even when a robot cannot move.
- a method of controlling an electronic device includes displaying a UI for controlling a first robot located in a first space; transmitting, to the first robot, a control command for controlling the first robot based on a user manipulation command; identifying, if it is difficult for the first robot to move to a second space different from the first space, a second robot located in the second space; and displaying a UI for controlling the identified second robot.
- the electronic device includes a communication interface, a display, and a processor configured to control the display to display a UI for controlling a first robot located in a first space, control the communication interface to transmit a control command for controlling the first robot to the first robot based on a user manipulation command, identify, if it is difficult for the first robot to move to a second space different from the first space, a second robot located in the second space, and control the display to display a UI for controlling the identified second robot.
- FIG. 1 is a view for explaining a system according to an embodiment of the present disclosure
- FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure
- FIG. 3 is a block diagram for explaining the configuration of a robot according to an embodiment of the present disclosure.
- FIG. 4 is a sequence diagram illustrating operations among an electronic device, robots, and a server according to an embodiment of the present disclosure;
- FIG. 5A is a view for explaining a robot existing in a space in which an obstacle exists;
- FIG. 5B is a view for explaining an electronic device that displays an image received from a robot;
- FIG. 5C is a view for explaining an electronic device that displays an image including an obstacle;
- FIG. 5D is a view for explaining an electronic device for displaying another robot existing in a space different from that in which the robot is present;
- FIG. 5E is a view for explaining an electronic device that displays an image received from another robot existing in the other space of FIG. 5D;
- FIG. 6A is a view for explaining an electronic device that receives and displays an image including an obstacle from a robot;
- FIG. 6B is a view for explaining an electronic device that identifies an obstacle in an image including the obstacle and displays an image of another space;
- FIG. 7 is a view for explaining a server that collects and stores spatial information according to an embodiment of the present disclosure
- FIG. 8 is a view for explaining a space collecting device according to an embodiment of the present disclosure.
- FIG. 9 is a flowchart illustrating a method of controlling an electronic device according to an embodiment of the present disclosure.
- expressions such as "have," "may have," "includes," or "may include" refer to the presence of a corresponding characteristic (e.g., a numerical value, function, operation, or component such as a part) and do not exclude the presence of additional features.
- a first component may be referred to as a second component, and similarly, the second component may also be renamed as the first component.
- as used herein, terms such as "module", "unit", "part", etc. refer to a component that performs at least one function or operation, and such a component may be implemented in hardware or software or as a combination of hardware and software.
- a plurality of "modules", "units", "parts", etc. may be integrated into at least one module or chip and implemented as at least one processor, except when each needs to be implemented in individual specific hardware.
- when a component (e.g., a first component) is "coupled with/to (operatively or communicatively)" or "connected to" another component (e.g., a second component), the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component).
- the term user may refer to a person who uses an electronic device or to a device (e.g., an artificial intelligence electronic device) that uses the electronic device.
- FIG. 1 is a diagram for explaining a system according to an embodiment of the present disclosure.
- a system 1000 includes an electronic device 100, a first robot 200-1, a second robot 200-2, and a server 300. In FIG. 1, only two robots, the first robot 200-1 and the second robot 200-2, are shown, but the system 1000 may include N robots.
- the electronic device 100 may communicate with the first robot 200 - 1 , the second robot 200 - 2 and the server 300 .
- although the electronic device 100 is illustrated as a smartphone in FIG. 1, the electronic device 100 may include a tablet PC, a mobile phone, a TV, an electronic blackboard, a monitor, a laptop PC, a camera, or a wearable device (e.g., watches, glasses, and head-mounted devices (HMDs)).
- however, the electronic device 100 is not necessarily limited thereto, and any device that includes a display and can communicate with an external device may be the electronic device 100 of the present disclosure.
- the electronic device 100 may control the first robot 200 - 1 or the second robot 200 - 2 . Specifically, the electronic device 100 may transmit a control command to the first robot 200 - 1 or the second robot 200 - 2 . To this end, the electronic device 100 may display a user interface (UI) for controlling the first robot 200 - 1 or the second robot 200 - 2 .
- the electronic device 100 may control the first robot 200-1 or the second robot 200-2 to capture an image of a specific space while moving in the specific space. Then, the electronic device 100 may receive the captured image from the first robot 200-1 or the second robot 200-2 and display it. Accordingly, even if the electronic device 100 is not located in the space in which the first robot 200-1 or the second robot 200-2 is located, the electronic device 100 may display, through the first robot 200-1 or the second robot 200-2, an image of the specific space in which the corresponding robot is located.
- the first robot 200-1 and the second robot 200-2 may exist in the same space or in different spaces.
- the second robot 200 - 2 may exist in a space where the first robot 200 - 1 cannot move.
- the first robot 200 - 1 or the second robot 200 - 2 may operate according to a control command received from the electronic device 100 .
- the first robot 200 - 1 or the second robot 200 - 2 may move in a specific space or take an image of a specific space according to a control command.
- in addition, the first robot 200-1 or the second robot 200-2 may collect a sound generated in a specific space and transmit it to the electronic device 100, or may receive and output a voice signal uttered by a user of the electronic device 100.
- the first robot 200 - 1 or the second robot 200 - 2 may have various shapes.
- for example, the first robot 200-1 or the second robot 200-2 may be an articulated (walking) robot similar to a human shape, a wheel/caterpillar robot, a flying robot (e.g., a drone), or a swimming robot that operates underwater.
- the present disclosure is not limited thereto, and any device capable of receiving a user's control command of the electronic device 100 and operating according to the received control command may be a robot according to the present disclosure.
- the electronic device 100 may receive information about the first robot 200 - 1 or the second robot 200 - 2 from the server 300 .
- the server 300 is a device capable of processing various requests received from the electronic device 100 , and may receive various information from the electronic device 100 or transmit various information to the electronic device 100 .
- for example, the electronic device 100 may request information about a robot located in a specific space from the server 300 and, in response to the request, receive location information of the first robot 200-1 or the second robot 200-2 from the server 300.
- the electronic device 100 may receive an image of a space in which the first robot 200 - 1 or the second robot 200 - 2 is located from the server 300 .
- that is, before receiving an image from the first robot 200-1 or the second robot 200-2, the electronic device 100 may receive, from the server 300, an image of the space in which the first robot 200-1 or the second robot 200-2 exists.
- the server 300 may receive location information of the first robot 200-1 or the second robot 200-2 from the first robot 200-1 or the second robot 200-2. Specifically, the server 300 may periodically receive the position information of the first robot 200-1 or the second robot 200-2 from the corresponding robot. Alternatively, the server 300 may receive image information of the space in which the first robot 200-1 or the second robot 200-2 is located from the corresponding robot.
- FIG. 2 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
- the electronic device 100 may include a communication interface 110 , a display 120 , and a processor 130 .
- the communication interface 110 is a component for the electronic device 100 to communicate with an external electronic device (not shown) such as the robot 200 or the server 300 .
- the electronic device 100 may transmit a control command to the robot 200 through the communication interface 110 and receive an image captured by the robot 200 from the robot 200 .
- the electronic device 100 may request, through the communication interface 110, information on a specific space or information on a robot located in the specific space from the server 300, and may receive the requested information from the server 300.
- the communication interface 110 may include various communication modules such as a wired communication module (not shown), a short-range wireless communication module (not shown), and a wireless communication module (not shown).
- the wired communication module is a module for performing communication with an external device (not shown) according to a wired communication method such as wired Ethernet.
- the short-range wireless communication module is a module for performing communication with an external device (not shown) located in a short distance according to a short-range wireless communication method such as Bluetooth (Bluetooth, BT), BLE (Bluetooth Low Energy), ZigBee method, and the like.
- the wireless communication module is a module that is connected to an external network according to a wireless communication protocol, such as Wi-Fi or IEEE, to communicate with an external device (not shown).
- in addition, the wireless communication module may further include a mobile communication module for performing communication by accessing a mobile communication network based on various mobile communication standards such as 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), LTE (Long Term Evolution), LTE-A (LTE Advanced), and 5G networks.
- the display 120 may provide various content screens that can be provided through the electronic device 100 .
- the display 120 may display a UI for controlling the robot 200 or may display a UI including an image received from the robot 200 .
- the display 120 may display the augmented reality image generated by the electronic device 100 .
- the display 120 may be implemented in various types of displays, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a plasma display panel (PDP), a wall, a micro LED, and the like.
- a driving circuit, a backlight unit, etc. that may be implemented in the form of an a-si TFT, a low temperature poly silicon (LTPS) TFT, or an organic TFT (OTFT) may also be included in the display 120 .
- the display 120 may be implemented as a touch screen combined with a touch sensor, a flexible display, a three-dimensional display, or the like.
- the processor 130 may be electrically connected to a memory (not shown) to control overall operations and functions of the electronic device 100 .
- the processor 130 may control hardware or software components connected to the processor 130 by driving an operating system or an application program, and may perform various data processing and operations.
- the processor 130 may load and process commands or data received from at least one of the other components into the volatile memory, and store various data in the non-volatile memory.
- the processor 130 may be implemented as a general-purpose processor (e.g., a central processing unit (CPU)) or an application processor (AP).
- for example, the processor 130 may be implemented as a digital signal processor (DSP), a microprocessor, or a time controller (TCON) for processing a digital signal. However, the processor 130 is not limited thereto, and may include one or more of a central processing unit (CPU), a micro controller unit (MCU), a micro processing unit (MPU), an application processor (AP), a graphics-processing unit (GPU), a communication processor (CP), or an ARM processor, or may be implemented as a system on chip (SoC) or large scale integration (LSI) with a built-in processing algorithm, or in the form of a field programmable gate array (FPGA).
- the processor 130 may control the first robot 200-1 located in the first space. Specifically, the processor 130 may generate a control command for controlling the robot 200 and control the communication interface 110 to transmit it to the robot 200.
- the processor 130 may control the display 120 to display a UI for controlling the first robot.
- the processor 130 may control the first robot 200-1 to transmit the image captured by the first robot 200-1 to the electronic device 100, and may control the display 120 to display a UI including the image received from the first robot 200-1.
- the processor 130 may identify the second robot located in the second space.
- the reason why it is difficult for the first robot 200 - 1 to move from the first space to the second space may vary according to embodiments. For example, this may correspond to a case in which the first robot 200 - 1 encounters an obstacle that cannot be passed while moving from the first space to the second space.
- alternatively, it may be difficult for the first robot 200-1 to move from the first space to the second space depending on the state of the first robot 200-1 (battery, failure, etc.) or the communication state for transmitting a control command to the first robot 200-1.
- the processor 130 may request information about the robot located in the second space from the server 300 .
- when the processor 130 receives information about the robot located in the second space from the server 300, the processor 130 may connect to the second robot 200-2 located in the second space and control the second robot 200-2.
- the processor 130 may control the display 120 to display a UI for controlling the second robot.
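- as an illustration of this failover flow, the following Python sketch shows how a controller on the electronic device side might switch from the first robot to a robot already located in the second space; the class names and the server's `robots_in_space` API are assumptions made for this example, not part of the disclosed implementation.

```python
# A minimal sketch of the failover flow, assuming a hypothetical server
# client exposing robots_in_space(); not the patent's actual implementation.
from dataclasses import dataclass

@dataclass
class RobotInfo:
    robot_id: str
    space_id: str
    battery: float        # remaining battery, 0.0-1.0
    image_url: str        # representative image of the robot

class RobotControlUI:
    def __init__(self, server):
        self.server = server          # hypothetical server client
        self.active_robot = None

    def connect(self, robot_info: RobotInfo):
        """Request control authority and show the control UI (S403, S406)."""
        self.active_robot = robot_info
        print(f"Displaying control UI for {robot_info.robot_id}")

    def on_move_blocked(self, target_space: str):
        """Called when the active robot reports it cannot reach the target
        space (S409); switch to a robot already there (S410-S418)."""
        candidates = self.server.robots_in_space(target_space)  # assumed API
        if not candidates:
            print("No robot available in the target space")
            return
        self.connect(candidates[0])   # selection policy is left to the server
```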
- the components illustrated in the electronic device 100 of FIG. 2 may be added, changed, or deleted according to the performance and/or type of the electronic device 100 .
- the positions of the components may be changed corresponding to the performance or structure of the electronic device 100 .
- FIG. 3 is a block diagram for explaining the configuration of the robot 200 according to an embodiment of the present disclosure.
- the robot 200 includes a communication interface 210 , a driving device 220 , a sensor 230 , a memory 240 , a camera 250 , a microphone 260 , a speaker 270 and A processor 280 may be included.
- the communication interface 210 is a component for the robot 200 to communicate with the electronic device 100 or the server 300 .
- the robot 200 may transmit an image captured by the robot 200 to the electronic device 100 or the server 300 through the communication interface 210 .
- the robot 200 may transmit location information of the robot 200 to the server 300 through the communication interface 210 .
- the robot 200 may receive a control command from the electronic device 100 through the communication interface 210 .
- the driving device 220 may move the robot 200 .
- the driving device 220 may be connected to one or two or more wheels, and may include a driving unit such as a motor capable of rotating the wheels.
- the driving device 220 may perform a driving operation such as moving, stopping, or changing a direction of the robot 200 according to a control signal of the processor 280 .
- the sensor 230 may detect an obstacle around the robot 200 . Specifically, the sensor 230 may detect a position of an obstacle around the robot 200 and a distance from the obstacle using an ultrasonic sensor, an infrared sensor, an RF sensor, a camera, or the like. In addition, the sensor 230 may further include a collision sensor that detects an obstacle through collision with the obstacle.
- the memory 240 may store various programs and data necessary for the operation of the robot 200 .
- the memory 240 may be implemented as a non-volatile memory, a volatile memory, a flash-memory, a hard disk drive (HDD), or a solid state drive (SSD).
- the camera 250 may capture an image of a space in which the robot 200 is located.
- the camera 250 may include one or more image sensors, lenses, and image signal processors.
- the camera 250 is generally positioned on the front of the robot 200 , but is not limited thereto, and may be positioned on the rear or side surfaces of the robot 200 .
- the microphone 260 is configured to receive sound from the outside of the robot 200 and generate a signal corresponding to the input sound.
- the processor 280 may receive the sound of the space in which the robot 200 is located through the microphone 260 and generate a signal corresponding thereto.
- the speaker 270 is configured to output various types of audio data on which various processing operations such as decoding, amplification, and noise filtering have been performed by the audio processor, as well as various notification sounds or voice messages.
- the processor 280 may be electrically connected to a memory (not shown) to control overall operations and functions of the robot 200 .
- the processor 280 may control hardware or software components connected to the processor 280 by driving an operating system or an application program, and may perform various data processing and operations.
- the processor 280 may load and process commands or data received from at least one of the other components into the volatile memory, and store various data in the non-volatile memory.
- the processor 280 may be implemented as a general-purpose processor (e.g., a central processing unit (CPU)) or an application processor (AP).
- the processor 280 may control the operation and function of each component of the robot 200 so that the robot 200 may operate according to a control command received through the communication interface 210 .
- the processor 280 may control the driving device 220 to move the robot 200 according to a control command, or may control the camera 250 to image a space in which the robot 200 is located.
- FIG. 4 is a sequence diagram illustrating operations among an electronic device, robots, and a server according to an embodiment of the present disclosure.
- the first robot 200-1 and the second robot 200-2 may transmit location information to the server 300 (S401, S402). Specifically, the first robot 200-1 and the second robot 200-2 may identify their positions using position detection sensors, such as an acceleration sensor and a gyro sensor, included in the first robot 200-1 and the second robot 200-2, and may periodically transmit the identified position information to the server 300. In addition, the server 300 may store the positions of the first robot 200-1 and the second robot 200-2.
- the first robot 200 - 1 and the second robot 200 - 2 may transmit information about the first robot 200 - 1 and the second robot 200 - 2 to the server 300 .
- for example, the first robot 200-1 and the second robot 200-2 may transmit, to the server 300, the types, image information, identification information, location information, battery information, and information on movable areas of the first robot 200-1 and the second robot 200-2.
- the server 300 may store various pieces of information received from the first robot 200 - 1 and the second robot 200 - 2 .
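- a minimal sketch of the server-side bookkeeping implied by steps S401 and S402 is shown below; robots periodically push status records and the server indexes them by space. The field names are illustrative assumptions.

```python
# Hedged sketch of the server-side registry, assuming robots push status
# records periodically; field names are assumptions for illustration.
import time

class RobotRegistry:
    def __init__(self):
        self._records = {}   # robot_id -> latest status dict

    def report(self, robot_id, space_id, position, battery):
        """Store the most recent status a robot transmitted."""
        self._records[robot_id] = {
            "space_id": space_id,
            "position": position,        # e.g., (x, y) in map coordinates
            "battery": battery,
            "timestamp": time.time(),    # when the report arrived
        }

    def robots_in_space(self, space_id):
        """Return status records of robots currently in a given space."""
        return [
            {"robot_id": rid, **rec}
            for rid, rec in self._records.items()
            if rec["space_id"] == space_id
        ]
```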
- the processor 130 included in the electronic device 100 may connect to the first robot 200 - 1 located in the first space through the communication interface 110 ( S403 ).
- specifically, the processor 130 may control the communication interface 110 to transmit, to the server 300, a signal requesting information about the robot located in the first space.
- upon receiving the signal requesting information on the robot from the electronic device 100, the server 300 may identify the robot existing in the first space, identify that the first robot 200-1 exists in the first space based on the signal received from the first robot 200-1, and transmit information about the first robot 200-1 to the electronic device 100.
- the information of the first robot may include at least one of an image of the first robot, a type, identification information, location information, battery information, and information on a movable area.
- the processor 130 may request control authority from the first robot 200 - 1 . Specifically, the processor 130 may request control authority from the first robot 200 - 1 by transmitting identification information, address information, and user identification information of the electronic device 100 . In addition, the first robot 200 - 1 may grant control authority to the electronic device 100 based on the information of the electronic device 100 received from the electronic device 100 . Accordingly, the electronic device 100 may be connected to the first robot 200 - 1 .
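- the control-authority handshake described above can be sketched as follows; the single-controller rule and the allow-list check are assumed policies, as the disclosure only states that authority is granted based on the received device information.

```python
# Sketch of the control-authority handshake (S403), under the assumption
# that the robot enforces one controller at a time and an optional
# user allow-list; neither policy is specified by the patent.
class Robot:
    def __init__(self, robot_id, allowed_users=None):
        self.robot_id = robot_id
        self.allowed_users = allowed_users or set()
        self.controller = None    # device currently holding control authority

    def request_control(self, device_id, address, user_id):
        """Grant control authority if the requesting user is acceptable
        and no other device currently holds control."""
        if self.controller is not None:
            return False                       # already controlled
        if self.allowed_users and user_id not in self.allowed_users:
            return False                       # unknown user
        self.controller = (device_id, address)
        return True                            # approval message sent
```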
- when the first robot 200-1 is connected to the electronic device 100, the first robot 200-1 may capture an image of the first space in which the first robot 200-1 is located (S404) and transmit the captured image to the electronic device 100 (S405).
- the processor 130 may remotely control the first robot 200 - 1 based on the image received from the first robot 200 - 1 .
- the processor 130 may control the display 120 to display a UI for controlling the first robot 200 - 1 located in the first space ( S406 ).
- the UI for controlling the first robot 200 - 1 may include a UI including an image captured by the first robot 200 - 1 .
- the processor 130 may generate an augmented reality image based on the image captured by the first robot 200-1, and control the display 120 to display a UI including the generated augmented reality image.
- the processor 130 may receive a user input for controlling the first robot 200-1 through a UI for controlling the first robot 200-1. In addition, the processor 130 may transmit a control signal for controlling the first robot to the first robot 200 - 1 based on the user input ( S407 ).
- the first robot 200 - 1 may operate based on a control signal received from the electronic device 100 . Specifically, the first robot 200 - 1 may move in the first space or photograph an object located in the first space according to the control signal and transmit it to the electronic device 100 . Also, the first robot 200 - 1 may collect sound generated in the first space according to the control signal and transmit it to the electronic device 100 . Alternatively, the first robot 200 - 1 may receive a user voice uttered by the user of the electronic device 100 and output the received voice.
- meanwhile, movement of the first robot 200-1 according to the control signal transmitted by the electronic device 100 may be impossible. For example, when the first robot 200-1 encounters an obstacle that is difficult to pass through, when the battery of the first robot 200-1 is less than or equal to a preset value, when the first robot 200-1 malfunctions, when the communication state for receiving the control command for the first robot 200-1 is poor, or according to a movement area limit set in the first robot 200-1, the first robot may not be able to move to the second space (S408).
- here, the second space represents a space to which the first robot 200-1 cannot move for such a reason.
- when the first robot 200-1 is unable to move from the first space to the second space, it may transmit a message indicating that movement to the second space is impossible to the electronic device 100 (S409).
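- the immobility conditions listed above can be gathered into a single check, sketched below; the thresholds and state fields are illustrative assumptions, and a positive result would trigger the notification of step S409.

```python
# Sketch enumerating the immobility conditions (obstacle, low battery,
# malfunction, poor communication, movement-area limit). Thresholds and
# state fields are illustrative assumptions.
BATTERY_THRESHOLD = 0.15     # assumed "preset value"

def movement_blocked_reason(robot_state, target_space):
    """Return the first reason the robot cannot reach target_space,
    or None if movement appears possible (used to trigger S409)."""
    if robot_state["obstacle_impassable"]:
        return "obstacle"
    if robot_state["battery"] <= BATTERY_THRESHOLD:
        return "low_battery"
    if robot_state["malfunction"]:
        return "malfunction"
    if robot_state["link_quality"] < 0.2:        # assumed cutoff
        return "poor_communication"
    if target_space not in robot_state["allowed_spaces"]:
        return "movement_area_limit"
    return None

# Usage: if a reason is found, notify the electronic device (S409).
reason = movement_blocked_reason(
    {"obstacle_impassable": True, "battery": 0.8, "malfunction": False,
     "link_quality": 0.9, "allowed_spaces": {"space_1", "space_2"}},
    "space_2",
)
if reason:
    print(f"movement to second space impossible: {reason}")
```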
- the processor 130 may identify the second robot located in the second space. Specifically, the processor 130 may request information about the robot existing in the second space from the server 300 (S410).
- the server 300 may identify the robot existing in the second space (S411). Specifically, the server 300 may identify that the second robot 200-2 exists in the second space based on the position information of the second robot 200-2 received from the second robot 200-2 in step S402.
- the server 300 may transmit information of the second robot to the electronic device 100 .
- when a plurality of second robots 200-2 are located in the second space, the server 300 may select one of the plurality of second robots 200-2 and provide information on the selected second robot 200-2 to the electronic device 100.
- for example, the server 300 may select the second robot 200-2 closest to the first robot 200-1 from among the plurality of second robots 200-2 located in the second space and transmit information of the selected second robot 200-2 to the electronic device 100.
- alternatively, the server 300 may select the second robot 200-2 having the largest remaining battery capacity among the plurality of second robots 200-2 located in the second space and transmit information on the selected second robot to the electronic device 100.
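- the two selection policies can be expressed compactly as below; the Euclidean distance on map coordinates is an assumed metric, since the disclosure does not specify how "closest" is measured.

```python
# Sketch of the two selection policies: nearest to the first robot, or
# largest remaining battery. Euclidean distance is an assumed metric.
import math

def pick_nearest(candidates, first_robot_pos):
    """Select the second robot closest to the first robot."""
    return min(
        candidates,
        key=lambda r: math.dist(r["position"], first_robot_pos),
    )

def pick_max_battery(candidates):
    """Select the second robot with the largest remaining battery."""
    return max(candidates, key=lambda r: r["battery"])
```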
- the information about the second robot 200-2 may include at least one of image information of the second robot 200-2, the type of the second robot 200-2, identification information, location information, battery information, and information about a movable area.
- the processor 130 may control the display 120 to display the received information on the second robot 200 - 2 ( S413 ).
- specifically, the processor 130 may control the display 120 to display the image of the second robot 200-2 together with at least one of the identification information, location information, battery information, or information about the movable area of the second robot 200-2.
- the processor 130 may receive a user input for connection with the second robot (S414). Specifically, the processor 130 may receive a user input for selecting information about the second robot 200 - 2 displayed on the display 120 . For example, the processor 130 may receive a user input for selecting the image of the second robot displayed on the display 120 .
- the processor 130 may connect to the second robot 200-2 (S415). Specifically, the processor 130 may transmit at least one of identification information, address information, and user identification information of the electronic device 100 to request control authority from the second robot 200-2. Then, the second robot 200-2 may grant control authority to the electronic device 100 by transmitting a control authority approval message to the electronic device 100 based on the information received from the electronic device 100. Accordingly, the electronic device 100 may be connected to the second robot 200-2 to control the second robot 200-2.
- meanwhile, upon receiving the control authority approval message from the second robot 200-2, the processor 130 may control the communication interface 110 to transmit, to the first robot 200-1, a control command to move the first robot 200-1 to a preset position.
- the preset position may be a station of the first robot 200 - 1 or a point where the first space is connected to a space other than the second space.
- when the second robot 200-2 is connected to the electronic device 100, the second robot 200-2 may capture an image of the second space (S416) and transmit the captured image to the electronic device 100 (S417).
- the processor 130 may remotely control the second robot 200 - 2 based on the image received from the second robot 200 - 2 .
- the processor 130 may control the display to display a UI for controlling the second robot 200 - 2 existing in the second space ( S418 ).
- the UI for controlling the second robot 200 - 2 may include a UI including an image captured by the second robot 200 - 2 .
- the processor 130 may generate an augmented reality image based on the image captured by the second robot 200-2 and control the display 120 to display a UI including the generated augmented reality image.
- as described above, the electronic device 100 may acquire an image captured by the first robot 200-1 while controlling the first robot 200-1 located in the first space, and, when it is difficult for the first robot 200-1 to move to the second space, may acquire an image captured by the second robot 200-2 while controlling the second robot 200-2.
- FIGS. 5A to 5E are diagrams for explaining an electronic device that receives images captured by the first robot 200-1 and the second robot 200-2 from the first robot 200-1 and the second robot 200-2 and displays the received images.
- FIG. 5A is a diagram for explaining a robot existing in a space in which obstacles exist, for example, a diagram illustrating a robot existing in an art gallery or a museum in which obstacles such as stairs exist.
- an obstacle is an object that restricts the movement of the robot, and may take various forms such as stairs, escalators, elevators, doors, and fences.
- the space of FIG. 5A may be a variety of spaces as well as an art gallery or a museum.
- hereinafter, an area in which the first robot 200-1 is located (e.g., an area in which pictures A and B exist) is described as the first space, and an area to which the first robot 200-1 cannot move due to an obstacle (e.g., an area in which pictures C and D exist) is described as the second space.
- the first robot 200-1 may receive a control signal from the electronic device 100 and capture an image while moving in the first space according to the received control signal. For example, the first robot 200-1 may photograph picture A or picture B based on a control signal received from the electronic device 100 and transmit an image of the photographed picture to the electronic device 100.
- the processor 130 may receive the image captured by the first robot 200-1 from the first robot 200-1 and control the display 120 to display a UI including the received image.
- for example, the processor 130 may receive the image of picture A captured by the first robot 200-1 and control the display 120 to display a UI including the image of the received picture A.
- in addition, the processor 130 may receive an image captured while the first robot 200-1 moves from the first space to the second space, and control the display 120 to display the received image on the screen.
- the processor 130 may identify the obstacle in the image captured while the first robot 200-1 moves from the first space to the second space. Specifically, the processor 130 may identify an obstacle in an image displayed on the screen using an object recognition program or an artificial intelligence model trained to recognize an object included in the image.
- the processor 130 may determine that it is difficult for the first robot 200 - 1 to move to the second space. This will be described in detail with reference to FIGS. 6A and 6B .
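- one way to realize such a trained object-recognition model is an off-the-shelf detector, as in the hedged sketch below; the obstacle class names are assumptions, and a stock COCO-trained model would need fine-tuning to recognize classes such as stairs or fences.

```python
# Hedged sketch of obstacle identification with a torchvision detector;
# OBSTACLE_LABELS and the label_names mapping are assumptions, since a
# pretrained COCO model does not cover all obstacle classes named here.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

OBSTACLE_LABELS = {"stairs", "door", "fence"}   # assumed classes of interest

def detect_obstacles(image_tensor, label_names, score_threshold=0.6):
    """Return (label, box) pairs for detected objects treated as obstacles.
    image_tensor: float tensor of shape (3, H, W) scaled to [0, 1]."""
    with torch.no_grad():
        output = model([image_tensor])[0]
    results = []
    for label, box, score in zip(output["labels"], output["boxes"], output["scores"]):
        name = label_names[int(label)]
        if score >= score_threshold and name in OBSTACLE_LABELS:
            results.append((name, box.tolist()))
    return results
```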
- the processor 130 may receive information about the obstacle directly from the first robot 200 - 1 .
- that is, the processor 130 may receive information about the obstacle identified by the first robot 200-1 while moving from the first space to the second space, and confirm, based on the received information, that it is difficult for the first robot 200-1 to move from the first space to the second space.
- specifically, the first robot 200-1 may identify an obstacle located in front of the first robot 200-1 through various sensors included in the first robot 200-1, such as an infrared sensor, an optical sensor, and a camera, and may calculate whether the identified obstacle can be bypassed or overcome. Then, the first robot 200-1 may transmit, to the electronic device 100, information about the obstacle including information on the identified obstacle and whether the obstacle can be crossed or passed based on the calculation result.
- the processor 130 may identify the second robot 200-2 located in the second space.
- specifically, the processor 130 may request information on the robot existing in the second space from the server 300, receive information on the second robot 200-2 existing in the second space from the server 300, and display the received information of the second robot 200-2.
- in this case, the processor 130 may generate a virtual image of the second robot 200-2 based on the information of the second robot 200-2 received from the server 300, and generate an augmented reality image based on the generated virtual image.
- specifically, the processor 130 may generate the augmented reality image by rendering the virtual image of the second robot 200-2 received from the server 300 on the image including the obstacle captured by the first robot 200-1, and may control the display 120 to display the generated augmented reality image. Accordingly, the user of the electronic device 100 can visually recognize that the second robot 200-2 exists in the second space.
- the processor 130 may receive a user input for selecting information of the second robot 200 - 2 from the augmented reality image displayed on the display 120 .
- the processor 130 may receive a user input for selecting a virtual image for the second robot 200 - 2 as shown in FIG. 5D .
- the processor 130 may transmit a signal requesting control authority to the second robot 200 - 2 .
- upon receiving control authority for the second robot 200-2, the processor 130 is connected to the second robot 200-2 and can control the second robot 200-2.
- in addition, the processor 130 may control the communication interface 110 to transmit, to the second robot 200-2, a control command to move the second robot 200-2 to an area adjacent to the first robot 200-1.
- here, the 'adjacent area' is an area of the second space to which the first robot 200-1 cannot move, and represents an area where the obstacle located in front of the first robot 200-1 meets the second space. Accordingly, the second robot 200-2 may move to the area where the second space meets the obstacle.
- the processor 130 may receive an image captured by the second robot 200-2 from the second robot 200-2 and control the display 120 to display a UI including the received image.
- specifically, the processor 130 may control the second robot 200-2 to capture an image of the second space and transmit the captured image.
- then, the processor 130 may control the display 120 to display the UI including the image (e.g., the image of picture C) received from the second robot 200-2.
- accordingly, the electronic device 100 can provide the user with an image of a specific space without being restricted by obstacles by using a plurality of robots existing in the specific space, and the user can receive continuous service without interruption.
- FIG. 6 is a diagram for explaining an electronic device that identifies an obstacle in an image captured by a robot and displays a virtual image.
- specifically, FIG. 6A is a diagram for explaining an electronic device that receives and displays an image including an obstacle from a robot, and FIG. 6B is a diagram for explaining an electronic device that identifies the obstacle in the image including the obstacle and displays an image of another space.
- the first robot 200 - 1 may capture an image while moving in the first space, and may transmit the captured image to the electronic device 100 .
- the processor 130 may receive an image captured by the first robot 200 - 1 from the first robot 200 - 1 , and control the display 120 to display the received image.
- the processor 130 may identify an obstacle in the image captured by the first robot 200-1. Specifically, the processor 130 may identify an obstacle included in the image captured by the first robot 200-1 using an object recognition program stored in a memory (not shown) or an artificial intelligence model trained to recognize an object included in an image.
- the processor 130 may determine that the first robot 200 - 1 cannot cross the obstacle.
- here, the preset area 610 is an area for determining whether the robot 200 can overcome an obstacle by roughly identifying the size of the obstacle located in front of the robot 200, and is an area that includes the central area of the display screen and occupies a certain percentage or more of the total screen area.
- specifically, when an obstacle included in the image captured by the first robot 200-1 is displayed in the preset area 610 of the screen and the area excluding the obstacle from the preset area 610 (the hatched area) is less than or equal to a predefined value, the processor 130 may confirm that the first robot 200-1 cannot move beyond the obstacle.
- for example, when the processor 130 identifies a door in the image captured by the first robot 200-1, the door is displayed in the preset area 610 of the screen, and the area excluding the door from the preset area 610 is less than or equal to a predefined value, it may be determined that the first robot 200-1 cannot move beyond the obstacle.
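- the preset-area rule can be sketched as follows, assuming the obstacle has already been segmented into a boolean mask; the 60% central region and the free-area threshold are illustrative values, not ones given in the disclosure.

```python
# Sketch of the preset-area rule: if the obstacle covers so much of the
# central region 610 that the remaining (hatched) area falls below a
# threshold, the robot is judged unable to pass. Region size and
# threshold are illustrative assumptions.
import numpy as np

def blocked_by_obstacle(obstacle_mask: np.ndarray,
                        region_fraction: float = 0.6,
                        free_area_threshold: float = 0.2) -> bool:
    """obstacle_mask: boolean (H, W) array, True where the obstacle is."""
    h, w = obstacle_mask.shape
    # Central preset area 610, spanning region_fraction of each dimension.
    top, bottom = int(h * (1 - region_fraction) / 2), int(h * (1 + region_fraction) / 2)
    left, right = int(w * (1 - region_fraction) / 2), int(w * (1 + region_fraction) / 2)
    region = obstacle_mask[top:bottom, left:right]
    free_ratio = 1.0 - region.mean()     # fraction of area 610 not covered
    return free_ratio <= free_area_threshold
```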
- the processor 130 may request information about the second space from the server 300 .
- specifically, the processor 130 may transmit the identification information of the first robot 200-1 to the server 300 and request information about the second space adjacent to the location of the first robot 200-1 based on the identification information of the first robot 200-1.
- the information about the second space may include image information of the second space or image information of an object included in the second space.
- the processor 130 may request information about the robot located in the second space from the server 300 .
- as described above, the server 300 may store position information of the first robot 200-1 and the second robot 200-2. Accordingly, upon receiving the identification information of the first robot 200-1 from the electronic device 100, the server 300 may identify the location information of the first robot based on the identification information of the first robot 200-1.
- the server 300 may acquire information about the second space adjacent to the first robot 200 - 1 based on the location information of the first robot 200 - 1 .
- the server 300 may store spatial information for a plurality of spaces.
- the server 300 may receive spatial information from a space collecting device located in each of a plurality of spaces, and store the received spatial information.
- the operation of the server 300 will be described in detail with reference to FIGS. 7 and 8 .
- the server 300 may transmit the acquired information on the second space and information on the robot located in the second space to the electronic device 100 .
- specifically, the server 300 may transmit, to the electronic device 100, at least one of image information of the second space, image information of the second robot 200-2 located in the second space, identification information, location information, battery information, and information on the movable area.
- meanwhile, the processor 130 may request information about a third space connected to the second space from the server 300. Specifically, the processor 130 may receive location information of the third space and information on the robot located in the third space. In addition, the processor 130 may check whether the third robot located in the third space can move to the second space, and display a UI for controlling the third robot.
- when there is a robot located in the second space, the processor 130 may control the display 120 to generate and display an augmented reality image based on the information on the second space received from the server 300 and the information on the robot located in the second space.
- specifically, the processor 130 may generate an augmented reality image by rendering the image of the second space received from the server 300 and the image of the second robot 200-2 on the image received from the first robot 200-1. In this case, the processor 130 may control the display 120 to render the image of the second space and the image of the second robot 200-2 at the location of the obstacle displayed in the preset area 610 in the image received from the first robot 200-1.
- the processor 130 may control the display 120 to provide a graphic effect corresponding to the obstacle while rendering the image of the second space and the image of the second robot 200 - 2 .
- for example, the processor 130 may control the display 120 to provide a graphic effect of opening the door while rendering the image of the second space and the image of the second robot 200-2, as shown in FIG. 6B.
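- a minimal compositing sketch for this augmented-reality step is shown below; it only pastes the second-space image over the obstacle region, while a real implementation would add perspective warping and the door-opening effect. Pillow is an assumed choice.

```python
# Minimal overlay sketch, assuming Pillow: paste the second-space image
# over the obstacle location in the first robot's frame. Perspective
# correction and graphic effects are omitted.
from PIL import Image

def overlay_second_space(first_frame: Image.Image,
                         second_space_img: Image.Image,
                         obstacle_box: tuple) -> Image.Image:
    """obstacle_box: (left, top, right, bottom) of the obstacle in the frame."""
    left, top, right, bottom = obstacle_box
    patch = second_space_img.resize((right - left, bottom - top))
    composed = first_frame.copy()
    composed.paste(patch, (left, top))    # replace the obstacle region
    return composed
```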
- accordingly, the processor 130 may control the display 120 to display a UI including the image of the second space and the image of the second robot existing in the second space.
- FIG. 7 is a diagram for describing a server that collects and stores spatial information according to an embodiment of the present disclosure.
- a spatial information collecting device 500 may be located in each space of the present disclosure.
- the spatial information collection device 500 is a device that generates an image of the space by imaging the space in which the spatial information collection device 500 is placed, and refers to various imaging devices including a camera and CCTV.
- as shown in FIG. 8, there may be a space collecting device 500-1 for imaging the first space and a space collecting device 500-2 for imaging the second space, divided based on the area where the obstacle is located.
- in FIG. 8, the space collecting device 500-1 for the first space and the space collecting device 500-2 for the second space are each shown as one, but a plurality of space collecting devices may be located in each space.
- the spatial information collecting apparatus 500 may collect spatial information about the space in which the spatial information collecting apparatus 500 is placed (S710). Specifically, the spatial information collection apparatus 500 may generate an image including robots or various objects existing in the space while photographing the space in which the spatial information collection apparatus 500 is located.
- the first space collecting apparatus 500 - 1 may collect spatial information on the first space by photographing the first space and robots or various objects located in the first space.
- the second space collecting apparatus 500 - 2 may collect spatial information about the second space by photographing the second space and robots or various objects located in the second space.
- the spatial information collecting device 500 may transmit the collected spatial information to the server 300 (S720). Specifically, the spatial information collecting device 500 may be connected to the server 300 and periodically transmit the collected spatial information to the server 300.
- the server 300 may reconstruct and store the spatial information received from the spatial information collecting apparatus 500 through image processing (S730). Specifically, the server 300 may reconstruct and store the spatial information for each space through various image processing operations, such as performing Euclidean geometric transformation by enlarging, reducing, or rotating an image included in the spatial information received from the spatial information collecting device 500, performing color correction or color transformation of the image, or generating one image for the first space by combining a plurality of images received from the first space collecting device 500-1 (or the second space collecting device 500-2).
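- a hedged sketch of this reconstruction step (S730) using OpenCV is shown below; the specific parameters (target size, rotation angle, color-correction gains) are illustrative assumptions.

```python
# Sketch of server-side reconstruction (S730): geometric normalization,
# simple color correction, and stitching same-space images side by side.
# Parameter values are illustrative assumptions.
import cv2

def reconstruct_space_images(images, size=(640, 480), angle_deg=0.0):
    processed = []
    for img in images:
        img = cv2.resize(img, size)                       # enlarge/reduce
        m = cv2.getRotationMatrix2D((size[0] / 2, size[1] / 2), angle_deg, 1.0)
        img = cv2.warpAffine(img, m, size)                # rotate
        img = cv2.convertScaleAbs(img, alpha=1.1, beta=5) # simple color correction
        processed.append(img)
    return cv2.hconcat(processed)                         # combine into one image
```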
- the server 300 may transmit the reconstructed spatial information to the spatial information collecting apparatus 500 (S740).
- the spatial information collection apparatus 500 may re-collect spatial information based on the reconstructed spatial information received from the server 300 .
- a plurality of spatial information may be connected (S750).
- the manager of the server 300 may connect a plurality of spatial information based on geographic information for each space.
- specifically, the server 300 may connect each of a plurality of pieces of spatial information based on the input geographic information. For example, if the geographic information received by the server 300 indicates that the first space and the second space are connected and that the second space and the third space are connected, the server 300 may map the spatial information on the first space and the spatial information on the second space as a pair, and map the spatial information on the second space and the spatial information on the third space as a pair.
- the server 300 may store a plurality of spatial information connected to each other in the server 300 (S760).
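- the pairwise mapping of connected spaces can be represented as an adjacency structure, sketched below; the dictionary-of-sets representation is an assumption for illustration.

```python
# Sketch of mapping connected spaces as pairs from geographic information
# (S750/S760); the adjacency representation is an assumed choice.
from collections import defaultdict

def build_space_graph(connections):
    """connections: iterable of (space_a, space_b) pairs taken from the
    administrator's geographic information."""
    graph = defaultdict(set)
    for a, b in connections:
        graph[a].add(b)
        graph[b].add(a)       # connections are symmetric
    return graph

# e.g., first<->second and second<->third, as in the example above
graph = build_space_graph([("space_1", "space_2"), ("space_2", "space_3")])
assert "space_1" in graph["space_2"] and "space_3" in graph["space_2"]
```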
- thereafter, when the server 300 receives the identification information of the first robot 200-1 from the electronic device 100, the server 300 may confirm the position information of the first robot 200-1 based on the identification information of the first robot 200-1, and identify information on the second space adjacent to the first robot 200-1 based on the confirmed position information of the first robot 200-1.
- FIG. 9 is a flowchart illustrating a method of controlling an electronic device according to an embodiment of the present disclosure.
- the electronic device 100 may be connected to the first robot 200 - 1 located in the first space. Specifically, the electronic device 100 may transmit a control authority request message to the first robot 200 - 1 and receive a control authority from the first robot 200 - 1 .
- a UI for controlling the first robot may be displayed ( S910 ).
- in this case, when connected to the first robot 200-1, the electronic device 100 may receive an image captured by the first robot 200-1 and display a UI including the received image.
- in addition, the electronic device 100 may receive an image captured while the first robot 200-1 moves from the first space to the second space, and display a UI including the received image.
- a control command for controlling the first robot 200 - 1 may be transmitted to the first robot 200 - 1 based on the user manipulation command. Accordingly, the first robot 200 - 1 may image the first space while moving in the first space according to the control command.
- then, it may be checked whether the first robot 200-1 can move to a second space different from the first space (S920). Specifically, when an obstacle is identified in an image captured while the first robot 200-1 moves from the first space to the second space and the identified obstacle is located in a preset area of the screen, it may be determined that it is difficult for the first robot 200-1 to move from the first space to the second space.
- alternatively, information about an obstacle identified while the first robot 200-1 moves from the first space to the second space may be received from the first robot 200-1, and it may be confirmed, based on the received information, whether the first robot 200-1 can move from the first space to the second space. To this end, the first robot 200-1 may identify obstacles through various sensors, such as an infrared sensor, an optical sensor, and a camera, while moving according to a control command of the electronic device 100. In addition, the first robot 200-1 may transmit information about the identified obstacle to the electronic device 100.
- when it is difficult for the first robot 200-1 to move to the second space, the second robot 200-2 located in the second space may be identified (S930).
- specifically, when information about an obstacle identified while the first robot 200-1 moves from the first space to the second space is received from the first robot 200-1 and it is confirmed, based on the received information, that it is difficult for the first robot 200-1 to move from the first space to the second space, it may be checked whether a robot exists in the second space.
- information on the robot located in the second space may be requested from the server 300 .
- information on the second robot 200 - 2 located in the second space may be received from the server 300 .
- the information of the second robot 200 - 2 may include at least one of an image, a type, identification information, location information, battery information, and information about a movable area of the second robot 200 - 2 .
- meanwhile, when there is no robot located in the second space, it can be checked whether the third robot 200-3 located in a third space connected to the second space can move to the second space.
- a UI for controlling the third robot 200 - 3 may be displayed.
- the received information of the second robot 200 - 2 may be displayed.
- image information of the second robot 200 - 2 may be displayed.
- in addition, an image captured by the second robot 200-2 may be displayed.
- a control authority may be requested from the second robot 200 - 2 .
- it may be connected to the second robot 200 - 2 and transmit a control command to the second robot 200 - 2 .
- then, a UI for controlling the identified second robot 200-2 may be displayed (S940).
- a control command for causing the second robot 200 - 2 to move to an area adjacent to the first robot 200 - 1 may be transmitted to the second robot 200 - 2 .
- when the second robot 200-2 moves to the area adjacent to the first robot 200-1, an image captured by the second robot 200-2 may be received from the second robot 200-2, and a UI including the image received from the second robot 200-2 may be displayed.
- meanwhile, a control command to move the first robot 200-1 to a preset position, such as a station of the first robot 200-1, may be transmitted to the first robot 200-1.
- the various operations described above as being performed through at least one of the electronic device 100 or the server 300 may be performed through one or more electronic devices, in the form of a method of controlling an electronic device or a method of controlling or operating a system including the electronic device.
- the embodiments described in the present disclosure may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, or electrical units for performing other functions.
- in some cases, the embodiments described herein may be implemented by the processor itself. According to a software implementation, embodiments such as the procedures and functions described in this specification may be implemented as separate software modules, each of which may perform one or more of the functions and operations described herein.
- computer instructions for performing a processing operation in a user device or an administrator device may be stored in a non-transitory computer-readable medium.
- when the computer instructions stored in the non-transitory computer-readable medium are executed by the processor of a specific device, the specific device performs the processing operations of the user device and/or the administrator device according to the various embodiments described above.
- the non-transitory readable medium refers to a medium that stores data semi-permanently and can be read by a device, rather than a medium that stores data for a short moment, such as a register, a cache, or a memory.
- specific examples of the non-transitory readable medium include a CD, a DVD, a hard disk, a Blu-ray disc, a USB, a memory card, a ROM, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Manipulator (AREA)
Abstract
The present invention relates to a method for controlling an electronic device. The method for controlling the electronic device according to the present invention comprises the steps of: displaying a user interface for controlling a first robot located in a first space; transmitting, to the first robot, a control command for controlling the first robot based on a user manipulation command; when it is difficult for the first robot to move to a second space different from the first space, identifying a second robot located in the second space; and displaying a user interface for controlling the identified second robot.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020200117915A KR20220035767A (ko) | 2020-09-14 | 2020-09-14 | Electronic device and control method thereof |
| KR10-2020-0117915 | 2020-09-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022055117A1 (fr) | 2022-03-17 |
Family
ID=80631894
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2021/009249 Ceased WO2022055117A1 (fr) | 2020-09-14 | 2021-07-19 | Dispositif électronique et son procédé de commande |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20220035767A (fr) |
| WO (1) | WO2022055117A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12472632B2 (en) * | 2022-06-28 | 2025-11-18 | Toyota Jidosha Kabushiki Kaisha | Operation system, operation method, and storage medium |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102713579B1 (ko) * | 2024-01-30 | 2024-10-07 | (주)오토베이션 | Method and server for controlling the operation and functions of an autonomous driving robot |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0884328A (ja) * | 1994-09-13 | 1996-03-26 | Nippon Telegr & Teleph Corp <Ntt> | Telexistence-type videophone device |
| JPH08336067A (ja) * | 1995-06-09 | 1996-12-17 | Ohbayashi Corp | Control method for a multi-robot system |
| KR20120126772A (ko) * | 2011-05-12 | 2012-11-21 | LG Electronics Inc. | Cleaning device, and cooperative cleaning method using a plurality of robot cleaners |
| KR20150014237A (ko) * | 2013-07-29 | 2015-02-06 | Samsung Electronics Co., Ltd. | Automatic cleaning system, cleaning robot, and control method thereof |
| KR20180039977A (ko) * | 2016-10-11 | 2018-04-19 | LG Electronics Inc. | Airport assistance robot and operation method thereof |
- 2020-09-14: KR KR1020200117915A filed (published as KR20220035767A; not active, ceased)
- 2021-07-19: WO PCT/KR2021/009249 filed (published as WO2022055117A1; not active, ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| KR20220035767A (ko) | 2022-03-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017119664A1 (fr) | 2017-07-13 | Display apparatus and control methods thereof |
| WO2018062658A1 (fr) | 2018-04-05 | Display apparatus and control method thereof |
| WO2018008991A1 (fr) | 2018-01-11 | Display device and image processing method |
| WO2019059562A1 (fr) | 2019-03-28 | Electronic device comprising a plurality of cameras using a rolling shutter mode |
| WO2017030262A1 (fr) | 2017-02-23 | Camera and control method therefor |
| WO2020096192A1 (fr) | 2020-05-14 | Electronic device and corresponding control method |
| WO2017126741A1 (fr) | 2017-07-27 | Head-mounted display and method for controlling same |
| WO2021118187A1 (fr) | 2021-06-17 | Foldable electronic device having a rotatable camera, and image capturing method therefor |
| WO2017065535A1 (fr) | 2017-04-20 | Electronic device and control method thereof |
| WO2018093075A1 (fr) | 2018-05-24 | Electronic device and associated control method |
| WO2022055117A1 (fr) | 2022-03-17 | Electronic device and control method thereof |
| WO2019112114A1 (fr) | 2019-06-13 | Glasses-type terminal and method for using same |
| WO2022035054A1 (fr) | 2022-02-17 | Robot and control method therefor |
| WO2019112308A1 (fr) | 2019-06-13 | Electronic device, user terminal apparatus, and associated control method |
| WO2017164545A1 (fr) | 2017-09-28 | Display device and method for controlling a display device |
| WO2021145473A1 (fr) | 2021-07-22 | Mobile terminal and control method therefor |
| WO2016122153A1 (fr) | 2016-08-04 | Display apparatus and control method thereof |
| WO2014073939A1 (fr) | 2014-05-15 | Method and apparatus for capturing and displaying an image |
| WO2021225333A1 (fr) | 2021-11-11 | Electronic device for providing an augmented reality service, and operating method therefor |
| WO2021025266A1 (fr) | 2021-02-11 | Electronic apparatus and control method thereof |
| WO2016047824A1 (fr) | 2016-03-31 | Image information projection device, and projection device control method |
| WO2020171572A1 (fr) | 2020-08-27 | Electronic apparatus and control method therefor |
| WO2021177594A1 (fr) | 2021-09-10 | Method and device for driving assistance by means of an augmented reality head-up display |
| WO2017065555A1 (fr) | 2017-04-20 | Image projection device, image correction method therefor, and non-transitory computer-readable recording medium |
| WO2018088804A1 (fr) | 2018-05-17 | Electronic apparatus and control method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21866984; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21866984; Country of ref document: EP; Kind code of ref document: A1 |