US12459127B2 - Methods and arrangements for controlling a robot device over a wireless network - Google Patents
Methods and arrangements for controlling a robot device over a wireless network
- Publication number
- US12459127B2 (application US16/980,482)
- Authority
- US
- United States
- Prior art keywords
- stream
- video
- robot device
- data
- control commands
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0022—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Definitions
- This disclosure relates to control of a robot device. More particularly, it relates to methods and arrangements for controlling the robot device over a radio communication network.
- the cloud provides three possible levels at which a framework could be implemented.
- the lowest level is Infrastructure as a Service (IaaS), where bare operating systems are provided on machines, which can be virtualized, in the cloud.
- the second level is Platform as a Service (PaaS).
- PaaS Platform as a Service
- SaaS Software as a Service
- the third level, Software as a Service (SaaS), is the highest level of structure and the one for which there is demand these days.
- a current focus of cloud based robotics is to speed up the processing of input data collected from many sensors with big data computation.
- Another approach is to collect various knowledge bases in centralized locations e.g., possible grasping poses of various three-dimensional (3D) objects.
- Controlling robot devices in the cloud over a wireless communication network is currently out of the scope of the robotics industry, at least partly due to a lack of wireless links that are trustworthy and have low delays, and partly due to strategic decisions.
- Patent document U.S. Pat. No. 8,798,791 B2 teaches a robot control system and a robot control method. This document relates to moving a robot control in the cloud to a point that is closest to the robot that the user wishes to control.
- a robot control would be deployed in the cloud and the robot would be connected to the controller via regular transport protocols, for example, User datagram protocol (UDP)/Transmission control protocol (TCP).
- UDP User datagram protocol
- TCP Transmission control protocol
- Messages carrying robot status and robot command information typically consume a relatively low network bandwidth, for instance, ca. 1 Mega-bit per second (Mbps).
- Said channel may well use a wireless connection due to the ease of deployment and/or due to the fact that the robotic device may be on a mobile platform or even on a drone.
- wireless channel options may have a certain cost to operate, for example long term evolution (LTE) versus Wi-Fi, in return for their robustness.
- LTE long term evolution
- the present disclosure provides a method of controlling a robot device within a two-way video session between the robot device and a control device over a mobile radio communication network.
- the method is implemented in the control device.
- the method comprises extracting sensor data of the robot device, from an up-stream video stream of said two-way video session.
- the method also comprises providing the sensor data within the control device.
- the method comprises obtaining control commands for controlling the robot device, based on the provided sensor data.
- the method comprises embedding the control commands in a down-stream video stream of said two-way video session.
- the disclosure provides a method of controlling a robot device within a two-way video session between the robot device and a control device over a mobile radio communication network.
- the method is implemented in the robot device.
- the method comprises extracting control commands of the control device from a down-stream video stream of said two-way video session.
- this method comprises applying the control commands within the robot device.
- the method also comprises obtaining sensor data from sensors of the robot device based on the control commands.
- the method comprises embedding the sensor data in an up-stream video stream of said two-way video session.
- the disclosure provides a control device that is configured to control a robot device and that is adapted to participate in a two-way video session with the robot device over a mobile radio communication network.
- the control device comprises a processor circuit, and a memory that has instructions executable by the processor circuit.
- When executing said instructions, the processor circuit is configured to extract sensor data of the robot device from an up-stream video stream of said two-way video session.
- the processor circuit is also configured to provide the sensor data within the control device.
- the processor circuit is configured to obtain control commands for controlling the robot device, based on the provided sensor data.
- the processor circuit is configured to embed the control commands in a down-stream video stream of said two-way video session.
- the disclosure provides a robot device that is configured to be controlled by a control device within a two-way video session with the control device over a mobile radio communication network.
- the robot device comprises a processor circuit and a memory that has instructions executable by the processor circuit.
- When executing said instructions, said processor circuit is configured to extract control commands from a down-stream video stream of said two-way video session. Also, when executing said instructions, said processor circuit is configured to apply the control commands within the robot device. Also, when executing said instructions, said processor circuit is configured to obtain sensor data from sensors of the robot device, based on the control commands. In addition, when executing said instructions, said processor circuit is configured to embed the sensor data in an up-stream video stream of said two-way video session.
- the disclosure provides a computer program that is adapted for controlling a robot device within a two-way video session between the robot device and a control device over a mobile radio communication network.
- the computer program comprises instructions which, when executed on at least one processor, cause the at least one processor to carry out the claimed methods.
- the disclosure provides a computer-readable storage medium, having thereon said computer program.
- the present disclosure increases reliability of control of a robot device by using differentiation of control commands.
- the up-stream video stream can be used to transmit video information on the current status of a remote robot device. It is a further advantage that the down-stream video stream can be used to transmit a graphical model of the predicted movements of the robot device.
- FIG. 1 schematically illustrates a generic architecture that is related to embodiments of the present disclosure
- FIG. 2 schematically presents an embodiment of an architecture adapted for controlling a robot device, according to the present disclosure
- FIGS. 3 and 4 present flow charts of actions of methods according to embodiments of the present disclosure
- FIGS. 6 and 7 schematically present a control device and a robot device, respectively, according to embodiments of the present disclosure.
- 4G, 5G or NR mobile communication systems offer a QoS that is adequate for robot control over the cloud.
- the QoS provided using such mobile communication systems fulfils robot control QoS requirements in terms of latency and robustness.
- video over long term evolution (ViLTE) is advantageously used.
- efficient video coding such as H.264, is preferably used.
- robot control over the cloud may utilize calculation of positions or velocities of the robot device, via which a robot device trajectory may be determined.
- FIG. 1 schematically illustrates a generic architecture that is related to embodiments of the present disclosure.
- the generic architecture comprises a robot device 102 , and a control device 106 in two-way video session with each other over a radio communication network 110 .
- the robot device 102 comprises execution logic 104
- the control device comprises execution logic 108 .
- the radio communication network 110 may comprise a fourth generation (4G) mobile communication network, a fifth generation (5G) mobile communication network or a new radio (NR) mobile communication network.
- 4G fourth generation
- 5G fifth generation
- NR new radio
- FIG. 2 schematically presents an embodiment of an architecture adapted for controlling a robot device 202 by a control device 206 within a video session over a radio communication network 210 , according to the present disclosure.
- This architecture also comprises an execution logic 204 located down-stream of the radio communication network 210 .
- the architecture also comprises an execution logic 208 up-stream of said radio communication network 210 .
- Controlling of the robot device 202 utilizes a control-loop of the control device 206 . Based on sensor data provided from the robot device 202 to the control device 206 , control commands are determined. These control commands are then provided to the robot device 202 , based on which sensor data of the robot device 202 is provided and delivered to the control device 206 .
- control commands and the sensor data used in the control-loop may depend on the type of robot control.
- the control commands may contain velocity values for servo motors of the robot device.
- the control-loop may then use joint position values, as sensor data, provided by sensors.
- the control commands may contain force values, and sensor data in the form of values from torque sensors may then be used in the control-loop of the controlling.
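As a hedged illustration of the control-loop just described, the sketch below turns sensed joint positions into velocity commands for the servo motors; all names, the proportional control law, and the gain/step values are assumptions for illustration, not taken from the disclosure.

```python
# Hedged sketch of the velocity-command / joint-position control-loop
# described above. Names and the proportional law are assumptions.

def velocity_commands(target_positions, sensed_positions, gain=1.0):
    """Control commands: one velocity value per servo motor, driving
    each joint toward its target position (proportional control)."""
    return [gain * (t - s) for t, s in zip(target_positions, sensed_positions)]

def step_joints(sensed_positions, velocities, dt=0.1):
    """Toy robot-side model: integrate velocities into new joint
    positions, playing the role of the joint-position sensors."""
    return [p + v * dt for p, v in zip(sensed_positions, velocities)]

# Repeated control-loop iterations: sensor data in, control commands out.
target = [1.0, 0.5]
sensed = [0.0, 0.0]
for _ in range(100):
    commands = velocity_commands(target, sensed)  # down-stream direction
    sensed = step_joints(sensed, commands)        # up-stream direction
```

After enough iterations the sensed joint positions converge toward the targets, which is the closed-loop behaviour that the sensor-data/control-command exchange is meant to sustain.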
- the present disclosure utilizes a video streaming architecture in a two-way video session between the robot device 202 and a control device 206 .
- 4G, 5G or NR mobile communication systems offer a QoS that is adequate for robot control over the cloud.
- the QoS provided using such mobile communication systems fulfils robot control QoS requirements in terms of latency and robustness.
- ViLTE service may be advantageously used.
- efficient video coding such as H.264, may moreover be preferably used.
- the two-way video session between the robot device and the control device comprises an up-stream video stream, from the robot device 202 to the control device 206 , and a down-stream video stream, from the control device 206 to the robot device 202 .
- both the up-stream video stream and the down-stream video stream comprise video frames and information messages interleaved with one another.
- Since the present disclosure uses a two-way video session between the robot device 202 and the control device 206 over the radio communication network 210 , the two-way video session needs to be set up prior to controlling the robot device 202 by the control device 206 .
- setting-up a two-way video session is however well-known, and will thus not be explained further herein. Having accomplished the set-up of the video session, a down-stream video stream as well as an up-stream video stream is thus created.
- control commands are extracted from the down-stream video stream. These control commands are thus extracted from the information messages interleaved with video frames in the down-stream video stream.
- the control commands are provided to the robot device 202 to instruct the robot device whether and/or how to move. Based on these control commands, sensors at the robot device typically detect sensor data. In action S 218 , this sensor data is embedded in information messages interleaved between video frames in the up-stream video stream.
- This up-stream video stream of the two-way video session is advantageously delivered by a video service, such as video over LTE (ViLTE) over a 4G, 5G or NR radio communication network 210 to the control device 206 .
- sensor data is extracted from information messages interleaved with video frames in the up-stream video stream.
- the extracted sensor data is then provided to the control device 206 , to give input in terms of, for instance, position or velocity information of the robot device, based on which the control device 206 is adapted to determine new or updated control commands for the robot device 202 .
- Such updated control commands are then embedded in information messages interleaved with video frames in the down-stream video stream.
- This down-stream video stream comprising the control commands is then delivered over the radio communication network 210 to the robot device 202 .
- the down-stream video stream now comprises new and/or updated control commands to be extracted from information messages interleaved with video frames. These control commands are then extracted in a further iteration of action S 212 .
- a control-loop is thus established between the robot device 202 and the control device 206 .
- the video session is set up primarily to establish a communication channel between the robot device 202 and the control device 206 , which communication channel has a quality of service that is well suited for the control of a robot device over a radio communication network in terms of latency and robustness.
- Such a control will exchange control commands and sensor data.
- video data as such, as comprised in the video frames, as comprised in both the up-stream video stream and the down-stream video stream, may even be unused.
- video data may be advantageously transmitted in both up-streams and down-streams in the video session, to enhance the robustness of the control of the robot device 202 .
- the down-stream video stream is decoded in action S 214 , whereby video data is obtained from the video frames interleaved with information messages, comprising control commands, in the down-stream video stream.
- the thus obtained video data is then provided to a human machine interface (HMI).
- HMI human machine interface
- This video data may comprise graphical presentations of detected positions or velocities, against corresponding positions or velocities as instructed by the control commands.
- up-stream video data may also be used.
- a camera at or near the robot device 202 may be adapted to monitor positions and/or movements of the robot device, and to provide video data thereby created to, for instance, the execution logic 204 .
- this video data is encoded into video frames interleaved with information messages in an up-stream video stream.
- sensor data is embedded in the information messages interleaved with the video frames in the up-stream video stream. These video frames hence comprise the video data encoded into the video frames in action S 216 .
- the up-stream video stream comprising video frames with the created video data, and information messages comprising sensor data will arrive at the execution logic 208 , after being delivered over the radio communication network 210 .
- sensor data is extracted from the information messages being interleaved with video frames in the up-stream video stream.
- the video frames also comprise the created video data as originated from the camera at or near the robot device 202 .
- the up-stream video stream is decoded, whereby video data from the camera, is obtained from the video frames of the up-stream video stream.
- the thus obtained video-data may then be provided to a graphical user interface (GUI), by the use of which the video data is graphically presented to a user.
- GUI graphical user interface
- video data may be provided.
- this video data is encoded into video frames interleaved with information messages in a down-stream video stream.
- control commands from the control device may be determined.
- control commands may now be embedded into information messages interleaved with video frames, which video frames now comprise the video data encoded in action S 224 .
- FIG. 2 schematically indicates layers which may be used for the communication of up-stream and down-stream video streams. These layers comprise real-time transport protocol (RTP), user datagram protocol (UDP), and internet protocol (IP).
- RTP real-time transport protocol
- UDP user datagram protocol
- IP internet protocol
- the RTP is a network protocol for delivering audio and video over IP networks.
- the UDP is a communication protocol which in general is primarily used for establishing low-latency and loss-tolerating connections.
- the 4G/5G/NR layer denotes the communication standards being used.
- a video session using a 4G/5G/NR communication standard will have a quality of service (QoS) that is suitable for controlling a robot device from a control device over a radio communication network 210 .
- QoS quality of service
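For illustration, the protocol stack above prepends a 12-byte RTP fixed header (RFC 3550) to each packet of the video streams. The sketch below packs such a header; the function name and the dynamic payload type 96 (commonly used for H.264) are assumptions, not taken from the disclosure.

```python
# Illustrative sketch of the RTP fixed header (RFC 3550) used by the
# RTP/UDP/IP stack described above. Names and payload type are assumed.
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int, payload_type: int = 96) -> bytes:
    """Pack an RTP fixed header: version 2, no padding, no extension,
    no CSRC list, marker bit cleared."""
    byte0 = 2 << 6                      # V=2, P=0, X=0, CC=0
    byte1 = payload_type & 0x7F         # M=0, PT=96 (dynamic, often H.264)
    return struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF, timestamp, ssrc)

header = rtp_header(seq=1, timestamp=90000, ssrc=0x1234ABCD)
```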
- FIG. 3 presents a flow-chart of actions within a method of controlling a robot device 102 , 202 , 70 within a two-way video session between the robot device and a control device 106 , 206 , 60 over a mobile radio communication network 210 .
- the method is implemented in the control device 106 , 206 , 60 .
- the method comprises:
- the down-stream video stream within the method may comprise down-stream video frames 52 and down-stream information messages 50 interleaved with one another, and wherein embedding the control commands may comprise embedding the control commands in the down-stream information messages.
- the method may comprise providing first video data and encoding S 224 said first video data whereby the down-stream video stream is created.
- the up-stream video stream within the method may comprise up-stream video frames 52 and up-stream information messages 50 interleaved with one another, and wherein extracting sensor data from the up-stream video stream may comprise extracting sensor data from the up-stream information messages.
- the method may further comprise decoding S 222 the up-stream video stream whereby up-stream video data is obtained, and providing the up-stream video data to a user interface at the control device.
- said up-stream video data as comprised in the up-stream video frames 52 and the sensor data as comprised in the up-stream information messages 50 may be delivered in one and the same video session.
- the mobile radio communication network 210 may comprise a 4G, 5G, or new radio (NR) communication network.
- NR new radio
- the video session may use a video service over the mobile radio communication network.
- the video service may comprise video over long term evolution (ViLTE).
- FIG. 4 presents a flow-chart of actions within a method of controlling a robot device 102 , 202 , 70 within a two-way video session between the robot device and a control device 106 , 206 , 60 over a mobile radio communication network 210 .
- the method is implemented in the robot device. The method comprises:
- the up-stream video stream within the method may comprise up-stream video frames 52 and up-stream information messages 50 interleaved with one another, and wherein embedding S 218 , 408 the sensor data may comprise embedding the sensor data in the up-stream information messages.
- the method may further comprise obtaining up-stream video data, and encoding S 216 said up-stream video data, whereby the up-stream video stream is created.
- said up-stream video data as comprised in the up-stream video frames 52 and the sensor data as comprised in the up-stream information messages 50 may be delivered in one and the same video session.
- the down-stream video stream may comprise down-stream video frames 52 and down-stream information messages 50 interleaved with one another, and wherein extracting S 212 , 402 control commands from the down-stream video stream may comprise extracting control commands from the down-stream information messages.
- the method may further comprise decoding S 214 the down-stream video stream whereby down-stream video data is obtained and providing down-stream video data to a human-machine interface (HMI) at the robot device.
- HMI human-machine interface
- the down-stream video data may comprise video data provided by the control device.
- the mobile radio communication network 210 may comprise a 4G, 5G, or new radio (NR) communication network.
- NR new radio
- the video session may use a video service over the mobile radio communication network.
- the video service may comprise video over long term evolution (ViLTE).
- FIG. 5 presents a schematic architecture of a video stream, related to embodiments of this disclosure.
- This schematic architecture is applicable to both the up-stream video stream as well as to the down-stream video stream.
- the schematic architecture comprises video frames interleaved with information messages.
- the video frames are typically time-interleaved with the information messages.
- the video stream may comprise an H.264 video stream. It is noted that an H.264 video stream is packetized into network abstraction layer (NAL) units.
- NAL network abstraction layer
- Sensor data from the robot device and control commands from the control device may be embedded into a video stream, as NAL units containing a “User data supplemental enhancement information (SEI) message”.
- SEI supplemental enhancement information
- NAL units may comprise different data types, for instance picture parameters, sequence parameters, and video frame data.
- Control commands for the robot device as well as sensor data may be inserted into NAL unit type 6 .
- the NAL unit type 6 comprises SEI messages, and the special SEI message type “User data” may be used to carry user defined information.
- a single SEI message may be transmitted interleaved between video frames.
- a single SEI message may comprise multiple control commands and sensor data.
- Each “User data” message is typically identified by a universally unique identifier (UUID). Different UUIDs may be assigned to different communication channels, control targets or sensors.
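A hedged sketch of the embedding mechanism described above: a control command or sensor reading packed as a "user data unregistered" SEI message (payloadType 5) inside a NAL unit of type 6, tagged with a UUID. All function names, the sample UUID, and the payload string are assumptions; start-code emulation prevention (0x03 insertion) is omitted for brevity, so this is not a complete encoder.

```python
# Hedged sketch: embedding a "User data" SEI message, identified by a
# UUID, into an H.264 NAL unit of type 6, and extracting it again.
# Names and the sample UUID are assumptions for illustration.
import uuid

CONTROL_CHANNEL = uuid.UUID("00000000-0000-0000-0000-0000000000c1")  # hypothetical

def sei_user_data_nal(channel: uuid.UUID, payload: bytes) -> bytes:
    body = channel.bytes + payload       # 16-byte UUID, then the user data
    out = bytearray([0x06, 5])           # NAL header (type 6 = SEI), payloadType 5
    size = len(body)
    while size >= 255:                   # 0xFF-escaped payloadSize coding
        out.append(255)
        size -= 255
    out.append(size)
    out += body
    out.append(0x80)                     # rbsp_trailing_bits
    return bytes(out)

def parse_sei_user_data(nal: bytes):
    assert nal[0] & 0x1F == 6 and nal[1] == 5   # SEI NAL, user_data_unregistered
    i, size = 2, 0
    while nal[i] == 255:                 # undo the 0xFF-escaped size coding
        size += 255
        i += 1
    size += nal[i]
    body = nal[i + 1:i + 1 + size]
    return uuid.UUID(bytes=body[:16]), body[16:]

nal = sei_user_data_nal(CONTROL_CHANNEL, b"VEL 0.10 -0.25")
channel, command = parse_sei_user_data(nal)
```

Because each message carries its channel UUID, a receiver can dispatch commands and sensor readings to different targets, matching the per-UUID channel assignment described above.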
- the radio communication network can transparently serve the video session using an existing video service, for instance ViLTE, and the corresponding quality of experience (QoE)/QoS mechanisms.
- the robot device 202 may additionally use these QoE/QoS mechanisms, for instance different handling based on NAL unit type, available in the underlying video service, to optimize its operation.
- FIG. 6 schematically presents a control device 60 according to embodiments of the present disclosure.
- the control device 60 is configured to control a robot device 102 , 202 , 70 and adapted to participate in a two-way video session with the robot device over a mobile radio communication network 210 .
- the control device 106 , 206 , 60 comprises a processor circuit 62 , and a memory 64 that has instructions executable by the processor circuit 62 .
- When executing said instructions, the processor circuit 62 is configured to extract sensor data of the robot device 102 , 202 , 70 from an up-stream video stream of said two-way video session.
- the processor circuit 62 is configured to provide the sensor data within the control device.
- FIG. 7 schematically presents a robot device 70 according to embodiments of the present disclosure.
- the robot device 70 is configured to be controlled by a control device 106 , 206 , 60 within a two-way video session with the control device 106 , 206 , 60 over a mobile radio communication network 210 .
- the robot device comprises a processor circuit 72 , and a memory 74 that has instructions executable by the processor circuit 72 .
- said processor circuit 72 is configured to extract control commands from a down-stream video stream of said two-way video session.
- said processor circuit 72 is configured to apply the control commands within the robot device 70 .
- When executing said instructions, said processor circuit 72 is configured to obtain sensor data from sensors of the robot device 70 , based on the control commands. In addition, when executing said instructions, said processor circuit 72 is configured to embed the sensor data in an up-stream video stream of said two-way video session.
- the present disclosure also comprises a computer program for emulating wired connectivity between two or more robot devices and a controller.
- the computer program comprises instructions which, when executed on at least one processor, cause the at least one processor to carry out any one of the actions as mentioned above and/or presented in FIG. 3 or 4 .
- the present disclosure also comprises a computer program that is adapted for controlling a robot device 102 , 202 , 70 within a two-way video session between the robot device 102 , 202 , 70 and a control device 106 , 206 , 60 over a mobile radio communication network 210 .
- the computer program comprises instructions which, when executed on at least one processor, cause the at least one processor to carry out any one of the actions as mentioned above and/or presented in any FIG. 3 or 4 .
- the present disclosure also comprises a computer-readable storage medium having thereon the computer program as above.
- the present disclosure increases reliability of control of a robot device by using differentiation of control commands.
- the up-stream video stream can be used to transmit video information on the current status of a remote robot device. It is a further advantage that the down-stream video stream can be used to transmit a graphical model of the predicted movements of the robot device.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Manipulator (AREA)
- Selective Calling Equipment (AREA)
Abstract
Description
-
- Action 302: Extracting sensor data of the robot device 102, 202, 70, from an up-stream video stream of said two-way video session.
- Action 304: Providing the sensor data within the control device 106, 206, 60.
- Action 306: Obtaining control commands for controlling the robot device 102, 202, 70, based on the provided sensor data.
- Action 308: Embedding the control commands in a down-stream video stream of said two-way video session.
-
- Action 402: Extracting control commands of the control device 106, 206, 60, from a down-stream video stream of said two-way video session.
- Action 404: Applying the control commands within the robot device 102, 202, 70.
- Action 406: Obtaining sensor data from sensors of the robot device 102, 202, 70, based on the control commands.
- Action 408: Embedding the sensor data in an up-stream video stream of said two-way video session.
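Actions 402 through 408 above can be tied together in a sketch of one robot-side iteration; the message tuples, the toy motion model, and all names are hypothetical, not taken from the disclosure.

```python
# Hypothetical glue tying actions 402-408 together on the robot side.
# Message format, motion model, and names are assumptions.

def robot_iteration(downstream_messages, robot, dt=0.1):
    # Action 402: extract control commands from the down-stream messages.
    commands = [m for m in downstream_messages if m[0] == "cmd"]
    # Action 404: apply the commands (toy model: integrate velocity).
    for _, velocity in commands:
        robot["position"] += velocity * dt
    # Action 406: obtain sensor data based on the applied commands.
    sensor_data = ("pos", robot["position"])
    # Action 408: return data to be embedded in the up-stream video stream.
    return [sensor_data]

robot = {"position": 0.0}
upstream_messages = robot_iteration([("cmd", 1.0), ("cmd", -0.5)], robot)
```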
-
- 3D three-dimensional
- 4G 4th generation mobile communication
- 5G 5th generation mobile communication
- GUI graphical user interface
- HMI human-machine interface
- Hz Hertz
- IaaS infrastructure as a service
- IP Internet protocol
- LTE long term evolution
- Mbps Mega bit per second
- NAL network abstraction layer
- NR new radio
- QoE quality of experience
- QoS quality of service
- PaaS platform as a service
- RAaaS robotics and automation as a service
- RTP real-time transport protocol
- SaaS software as a service
- SEI supplemental enhancement information
- TCP transmission control protocol
- UDP user datagram protocol
- ViLTE video over LTE
Claims (18)
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2018/066280 WO2019242846A1 (en) | 2018-06-19 | 2018-06-19 | Methods and arrangements for controlling a robot device over a wireless network |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210023712A1 US20210023712A1 (en) | 2021-01-28 |
| US12459127B2 true US12459127B2 (en) | 2025-11-04 |
Family
ID=62748962
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/980,482 Active 2041-02-25 US12459127B2 (en) | 2018-06-19 | 2018-06-19 | Methods and arrangements for controlling a robot device over a wireless network |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US12459127B2 (en) |
| EP (1) | EP3811700A1 (en) |
| WO (1) | WO2019242846A1 (en) |
Citations (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080015427A1 (en) * | 2006-06-30 | 2008-01-17 | Nathan Kastelein | System and network for remote medical procedures |
| US7330875B1 (en) | 1999-06-15 | 2008-02-12 | Microsoft Corporation | System and method for recording a presentation for on-demand viewing over a computer network |
| US20090135919A1 (en) * | 2007-11-23 | 2009-05-28 | Samsung Electronics Co., Ltd. | Method and an apparatus for embedding data in a media stream |
| US20120063508A1 (en) * | 2010-09-15 | 2012-03-15 | Shinobu Hattori | Transmitting apparatus, transmitting method, receiving apparatus, receiving method, program, and broadcasting system |
| WO2012167927A1 (en) * | 2011-06-09 | 2012-12-13 | Panasonic Corporation | Coding of control data for adaptive loop filters |
| US20140016707A1 (en) * | 2012-07-10 | 2014-01-16 | Qualcomm Incorporated | Coding sei nal units for video coding |
| US20140071976A1 (en) | 2012-09-13 | 2014-03-13 | Qualcomm Incorporated | Apparatus and method for enabling communication on a supplemental channel in a gsm wireless communication network |
| US8718822B1 (en) | 2011-05-06 | 2014-05-06 | Ryan Hickman | Overlaying sensor data in a user interface |
| US8798791B2 (en) | 2010-12-28 | 2014-08-05 | Hon Hai Precision Industry Co., Ltd. | Robot control system and method |
| US20140253601A1 (en) * | 2013-03-11 | 2014-09-11 | Samsung Electronics Co., Ltd. | Display power reduction using sei information |
| US9031692B2 (en) | 2010-08-24 | 2015-05-12 | Shenzhen Institutes of Advanced Technology Chinese Academy of Science | Cloud robot system and method of integrating the same |
| US20150338204A1 (en) | 2014-05-22 | 2015-11-26 | Brain Corporation | Apparatus and methods for distance estimation using multiple image sensors |
| CN105116785A (en) * | 2015-06-26 | 2015-12-02 | 北京航空航天大学 | Multi-platform remote robot general control system |
| US9479732B1 (en) * | 2015-11-10 | 2016-10-25 | Irobot Corporation | Immersive video teleconferencing robot |
| CN106444458A (en) * | 2016-11-11 | 2017-02-22 | 华南智能机器人创新研究院 | Method and system for remotely controlling industrial robot |
| US10029370B2 (en) * | 2012-12-21 | 2018-07-24 | Crosswing Inc. | Control system for mobile robot |
| US10321096B2 (en) * | 2016-10-05 | 2019-06-11 | Avaya Inc. | Embedding content of interest in video conferencing |
| US20190354774A1 (en) * | 2018-05-16 | 2019-11-21 | 360Ai Solutions Llc | Method and System for Detecting a Threat or Other Suspicious Activity in the Vicinity of a Person or Vehicle |
| US10532463B2 (en) * | 2003-12-09 | 2020-01-14 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
| US10574830B2 (en) * | 2017-06-05 | 2020-02-25 | Qualcomm Incorporated | Methods for increasing VoIP network coverage |
| US11202079B2 (en) * | 2018-02-05 | 2021-12-14 | Tencent America LLC | Method and apparatus for video decoding of an affine model in an intra block copy mode |
| US11245939B2 (en) * | 2015-06-26 | 2022-02-08 | Samsung Electronics Co., Ltd. | Generating and transmitting metadata for virtual reality |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8406926B1 (en) * | 2011-05-06 | 2013-03-26 | Google Inc. | Methods and systems for robotic analysis of environmental conditions and response thereto |
| US8374421B1 (en) * | 2011-10-18 | 2013-02-12 | Google Inc. | Methods and systems for extracting still frames from a compressed video |
| US20140015914A1 (en) * | 2012-07-12 | 2014-01-16 | Claire Delaunay | Remote robotic presence |
2018
- 2018-06-19 US US16/980,482 patent/US12459127B2/en active Active
- 2018-06-19 EP EP18734157.3A patent/EP3811700A1/en not_active Withdrawn
- 2018-06-19 WO PCT/EP2018/066280 patent/WO2019242846A1/en not_active Ceased
Patent Citations (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7330875B1 (en) | 1999-06-15 | 2008-02-12 | Microsoft Corporation | System and method for recording a presentation for on-demand viewing over a computer network |
| US10532463B2 (en) * | 2003-12-09 | 2020-01-14 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
| US20080015427A1 (en) * | 2006-06-30 | 2008-01-17 | Nathan Kastelein | System and network for remote medical procedures |
| US20090135919A1 (en) * | 2007-11-23 | 2009-05-28 | Samsung Electronics Co., Ltd. | Method and an apparatus for embedding data in a media stream |
| US9031692B2 (en) | 2010-08-24 | 2015-05-12 | Shenzhen Institutes of Advanced Technology Chinese Academy of Science | Cloud robot system and method of integrating the same |
| US20120063508A1 (en) * | 2010-09-15 | 2012-03-15 | Shinobu Hattori | Transmitting apparatus, transmitting method, receiving apparatus, receiving method, program, and broadcasting system |
| US8798791B2 (en) | 2010-12-28 | 2014-08-05 | Hon Hai Precision Industry Co., Ltd. | Robot control system and method |
| US8718822B1 (en) | 2011-05-06 | 2014-05-06 | Ryan Hickman | Overlaying sensor data in a user interface |
| WO2012167927A1 (en) * | 2011-06-09 | 2012-12-13 | Panasonic Corporation | Coding of control data for adaptive loop filters |
| US20140016707A1 (en) * | 2012-07-10 | 2014-01-16 | Qualcomm Incorporated | Coding sei nal units for video coding |
| US20140071976A1 (en) | 2012-09-13 | 2014-03-13 | Qualcomm Incorporated | Apparatus and method for enabling communication on a supplemental channel in a gsm wireless communication network |
| US10029370B2 (en) * | 2012-12-21 | 2018-07-24 | Crosswing Inc. | Control system for mobile robot |
| US20140253601A1 (en) * | 2013-03-11 | 2014-09-11 | Samsung Electronics Co., Ltd. | Display power reduction using sei information |
| US20150338204A1 (en) | 2014-05-22 | 2015-11-26 | Brain Corporation | Apparatus and methods for distance estimation using multiple image sensors |
| CN105116785A (en) * | 2015-06-26 | 2015-12-02 | 北京航空航天大学 | Multi-platform remote robot general control system |
| US11245939B2 (en) * | 2015-06-26 | 2022-02-08 | Samsung Electronics Co., Ltd. | Generating and transmitting metadata for virtual reality |
| US9479732B1 (en) * | 2015-11-10 | 2016-10-25 | Irobot Corporation | Immersive video teleconferencing robot |
| US10321096B2 (en) * | 2016-10-05 | 2019-06-11 | Avaya Inc. | Embedding content of interest in video conferencing |
| CN106444458A (en) * | 2016-11-11 | 2017-02-22 | 华南智能机器人创新研究院 | Method and system for remotely controlling industrial robot |
| US10574830B2 (en) * | 2017-06-05 | 2020-02-25 | Qualcomm Incorporated | Methods for increasing VoIP network coverage |
| US11202079B2 (en) * | 2018-02-05 | 2021-12-14 | Tencent America LLC | Method and apparatus for video decoding of an affine model in an intra block copy mode |
| US20190354774A1 (en) * | 2018-05-16 | 2019-11-21 | 360Ai Solutions Llc | Method and System for Detecting a Threat or Other Suspicious Activity in the Vicinity of a Person or Vehicle |
Non-Patent Citations (6)
| Title |
|---|
| "Bouabdallah, Ahmed; Wehbe, Houssein; Stevant, Bruno; Uplink Transfer of Live Video Synchronized with Multiple Contextual Data; Oct. 2012; Media Synchronization Workshop 2012, Oct. 2012, Berlin, Germany" (Year: 2012). * |
| "Hannuksela, Miska; Yan, Ye; Huang, Xuehui; Li, Houqiang; Overview of the Multiview High Efficiency Video Coding (MV-HEVC) Standard; 2015; Nokia Technologies, University of Science and Technology of China" (Year: 2015). * |
| European Office Action for European Patent Application No. 18734157.3 mailed May 9, 2022, 5 pages. |
| Information Technology—"Coding of Audio-Visual Objects"—Part 10: Advanced Video Coding, International Standard, ISO/IEC 14496-10, Second edition, Oct. 1, 2004, 280 pages. |
| PCT International Search Report and Written Opinion mailed Aug. 10, 2018 for International Application PCT/EP2018/066280, 10 pages. |
| Wang et al., "RTP Payload Format for H.264 Video", Internet Engineering Task Force, Request for Comments: 6184, May 2011, 101 pages. |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019242846A1 (en) | 2019-12-26 |
| US20210023712A1 (en) | 2021-01-28 |
| EP3811700A1 (en) | 2021-04-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR101821124B1 (en) | Method and apparatus for playing media stream on web-browser | |
| CN110572433B (en) | A video scheduling method, system and device | |
| CN115334273A (en) | A protocol conversion audio and video communication method and system | |
| US20130312048A1 (en) | Array of transcoder instances with internet protocol (ip) processing capabilities | |
| US10666351B2 (en) | Methods and systems for live video broadcasting from a remote location based on an overlay of audio | |
| CN106713519A (en) | Network communication method and system based on software-defined networking | |
| US12459127B2 (en) | Methods and arrangements for controlling a robot device over a wireless network | |
| Lim et al. | Cloud based implementation of ROS through VPN | |
| US20250088665A1 (en) | Point cloud data transmission device, point cloud data transmission method, point cloud data reception device, and point cloud data reception method | |
| EP4256774B1 (en) | Migration of remote data processing between servers | |
| CN114205185B (en) | Proxy method and device for control message | |
| CN109361671A (en) | A streaming media transmission architecture method based on SIP protocol | |
| CN111338747B (en) | Data communication method, device, terminal equipment and storage medium | |
| CN104601351B (en) | Network device capability configuration method, network device and system | |
| Bhattacharyya et al. | Improving live-streaming experience for delay-sensitive iot applications: A restful approach | |
| JP6404915B2 (en) | Automatic data compression | |
| Xiaohui et al. | The design and implementation of real-time Internet-based telerobotics | |
| Barone et al. | Seeing Through the Robot’s Eyes: Adaptive Point Cloud Streaming for Immersive Teleoperation | |
| CN120710998B (en) | Unmanned vehicle video backhaul control method and system based on SIP communication protocol | |
| Cervera | Distributed visual servoing: A cross-platform agent-based implementation | |
| Barone et al. | Seeing Through the Robot's Eyes: Adaptive Point Cloud Streaming | |
| Robaina et al. | GOTE: An Edge Computing Architecture for Mobile Gaming. | |
| Dandanov et al. | Communication Framework for Tele-rehabilitation Systems with QoS Guarantee | |
| CN118695004A (en) | Video distribution system, instruction interaction system, method, storage medium and device | |
| Lassó et al. | Microrobot teleoperation through WWW |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: TELEFONAKTIEBOLAGET LM ERICSSON (PUBL), SWEDEN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BADER, ATTILA;FORMANEK, BENCE;RACZ, SANDOR;AND OTHERS;SIGNING DATES FROM 20180716 TO 20180718;REEL/FRAME:053757/0342 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |