SE1100740A1 - Use of 3D sensors for teaching multi-arm coordination - Google Patents
Use of 3D sensors for teaching multi-arm coordination
- Publication number
- SE1100740A1
- Authority
- SE
- Sweden
- Prior art keywords
- movement
- robot
- group
- hand
- tool
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1669—Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1682—Dual arm manipulator; Coordination of several manipulators
Abstract
The invention relates to a method, a device (10) and a computer program product for determining the synchronised tool movement of a group of robots. According to the invention, a sensor (12) detects the synchronised movement of a group of hands, where the group comprises a first and a second hand, a converting unit (14) converts the detected hand movement into synchronised robot tool movement, and a robot tool movement providing unit (16) provides, for each robot, the robot tool movement as at least a section of a robot tool movement path along which this robot is to move a tool. Fig. 1
Description
The 3D sensors are in this embodiment two Nintendo Wii Remote sensors 12A and 12B that use IR positioning.
When a human 28 holds these in his hands, hand movements can be detected and used for teaching a group of robots comprising a first and a second robot 26 and 27.
The simultaneous motion of the two hands can be tracked by the sensors 12A and 12B and sent to the robots 26 and 27, which imitate the movement.
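This two-hand imitation can be sketched as follows; a minimal illustration, where the function name and the uniform scale factor are assumptions for the example, not part of the patent:

```python
# Minimal sketch of dual-hand imitation: one simultaneous sample of both
# detected hand positions yields one tool target per robot, so the two
# arms always receive targets for the same instant.
# The uniform scale factor is an assumed workspace mapping.

def hands_to_tool_targets(hand_a, hand_b, scale=1.0):
    """Map one synchronized (x, y, z) sample of each hand to a pair of
    robot tool targets."""
    target_a = tuple(scale * c for c in hand_a)
    target_b = tuple(scale * c for c in hand_b)
    return target_a, target_b
```

Sampling both hands in the same cycle, rather than independently, is what keeps the two target streams synchronized.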
- Depending on the sensor type, the tool center points (tcp) or even the full arm configurations of two robot arms can be detected at the same time.
- The detected tcp positions/arm configurations can be used to jog the position and orientation of the two arms at the same time. This makes it possible to easily teach complex dual-arm tasks.
- By means of the position sensor(s), the paths of two robot tcps and/or arm configurations can be recorded in a synchronized manner.
- When executing the recorded paths from a robot program, they will reproduce the paths including the inter-arm synchronization.
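The synchronized recording idea can be sketched like this, where the sample layout and function names are illustrative assumptions: each sample carries one shared timestamp for both tcps, so replaying the list reproduces the inter-arm timing:

```python
# Sketch of synchronized path recording: every sample stores one shared
# timestamp together with both tcp poses, so the inter-arm timing survives
# recording and replay. The sample layout is an assumption for illustration.

def record_sample(path, t, tcp_a, tcp_b):
    """Append one synchronized sample (same timestamp for both arms)."""
    path.append({"t": t, "tcp_a": tcp_a, "tcp_b": tcp_b})
    return path

def replay(path):
    """Return (t, tcp_a, tcp_b) tuples in time order for execution."""
    return [(s["t"], s["tcp_a"], s["tcp_b"])
            for s in sorted(path, key=lambda s: s["t"])]
```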
An alternative type of 3D position sensor that can be used to detect the position or movement of the two hands of the operator is the MS Kinect, which uses structured light projection. Fig. 2 shows a second embodiment of the device using this type of sensor.
The device 10 here comprises one sensor 12, which may employ Microsoft Kinect together with appropriate software in order to track each body joint of an observed human being 28 including the two hands.
Here the sensor 12 is connected to an analysing unit 20, which in turn is connected to the converting unit 14. The converting unit 14 is, as in the first embodiment, connected to the robot tool movement providing unit 16. There is also a command unit 18 connected to the analysing unit 20 and to the robot tool movement providing unit 16. There is again the robot control unit 22 connected to the robot tool movement providing unit 16. There is finally an optional voice detecting unit 24 connected to the analysing unit 20.
The joint positions can be used to guide a robot tool center point (tcp) of a pair of robots, and also apply the correct arm configuration of the robots 26 and 27 in order to reach around obstacles etc. The robots can imitate the operator arm motions in real time.
Both movement and re-orientation can be taught simultaneously, making it possible to record full paths in real time.
- The device can produce jogging data, so that the robots can imitate the movement of the operator in real time.
- A path can be defined by continuously recording positions along a reference path gestured by a lead-through programmer.
- Optional: control commands, such as the instruction to record the current arm pose (modpos), could be gestured using a predefined dedicated gesture, such as a "hand wave".
- There may be several dedicated gestures: rewind, forward, delete recorded segment, etc.
If all body parts are occupied instructing a pose/path, the record command could be signaled by a voice command to the voice detecting unit 24, such as the one built into the Microsoft Kinect.
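The command layer described above can be sketched as a simple lookup; the gesture names and table contents are illustrative assumptions, only the "hand wave"/record-pose idea comes from the text:

```python
# Sketch of the command layer: a detected gesture, or a voice command when
# both hands are busy, is looked up in a command table; anything not in the
# table is treated as motion to be converted. Gesture names other than the
# hand wave are assumed for illustration.

COMMANDS = {
    "hand_wave": "record_pose",   # modpos-style "record current arm pose"
    "rewind": "rewind",
    "forward": "forward",
    "swipe_down": "delete_segment",
}

def interpret(gesture=None, voice=None):
    """Return ('command', name) for a recognized input, otherwise
    ('convert', None), meaning: convert the movement to tool movement."""
    for key in (gesture, voice):
        if key in COMMANDS:
            return "command", COMMANDS[key]
    return "convert", None
```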
The operation of the device according to the second embodiment is, with reference to fig. 3, the following.
The detector 12 detects the movement of body joints.
More particularly, it detects the movement of a group of hands comprising a first and a second hand in a series of synchronised motions, step 30. This detection may comprise detection of the positions of the group of hands. The analysing unit 20 then analyses the detected hand movements, step 32, and determines if any of the hand movements comprise a command, step 34. In case a detected hand movement is a command, the analysing unit 20 instructs the command unit 18 to perform the command; in this case no conversion to robot tool movement is made. Otherwise it instructs the converting unit 14 to perform the conversion.
If one detected hand movement was a command, step 34, then the command unit 18 performs a command corresponding to the detected hand movement, step 36, while if none of the detected hand movements was a command, then the converting unit 14 converts the detected hand movements to synchronised robot tool movement, where the robot tool movement is provided for the group of robots, step 38. The robot tool movements here have the same synchronisation in time in relation to each other as the hand movements have to each other.
The converting may here also comprise converting the positions of the group of hands to corresponding positions of the robot tools, where the corresponding robot tool positions have the same position in relation to each other as the corresponding hand positions. The robot tool movement for each robot in the group is then provided as at least a section of a tool movement path along which the corresponding robot is to move a tool, step 40. The robot tool movement may furthermore be recorded by the robot tool movement providing unit 16.
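This position conversion can be sketched as one common transform applied to both hands, which is exactly what preserves the relative offset between the two tool positions; the work-frame origin and scale below are assumptions for the example:

```python
# Sketch of the position conversion: applying one shared offset and scale
# to both hand positions guarantees that the two tool positions keep the
# same relation to each other as the two hands had.
# The work-frame origin and scale are illustrative assumptions.

def convert_pair(hand_a, hand_b, origin=(0.0, 0.0, 0.0), scale=1.0):
    """Map both hand positions with one shared transform."""
    def to_work_frame(p):
        return tuple(origin[i] + scale * p[i] for i in range(3))
    return to_work_frame(hand_a), to_work_frame(hand_b)
```

Because the same transform is used for both hands, the inter-hand offset maps to the inter-tool offset up to the common scale.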
The detected hand movements may thus each form the basis of a section of a tool movement path or a whole tool movement path.
The robot 26 can be controlled on-line, i.e. as the robot tool movement is recorded. However, it is also possible to store sections and then later form them into a path that is used for controlling the robot. The on-line control may be set through one of the commands or via the robot control panel before operation is started. Therefore, once the robot tool movement has been stored, an investigation is made as to whether the robot is to be controlled on-line or not, step 42, and if it is to be controlled on-line, the robot control unit 22 controls the robot 26 to move the tool along the part of the path currently being recorded, step 44.
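The store-then-maybe-execute flow of steps 40 to 44 can be sketched as follows; the callback-based interface and the names are illustrative assumptions:

```python
# Sketch of the on-line check: each recorded path section is stored
# (step 40); if on-line control is enabled (step 42), the control unit is
# asked to move the tool along that section immediately (step 44).

def handle_section(section, stored, online, control):
    stored.append(section)   # step 40: record the path section
    if online:               # step 42: on-line control selected?
        control(section)     # step 44: move along the current part
    return stored
```

In off-line use, the stored sections can later be joined into one path and executed as a whole.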
In both cases, with or without on-line control, the detector 12 thereafter again detects hand movement, step 30, and the analysing unit 20 analyses the hand movement, step 32, which is also done after a command has been performed, step 36.
The analysing unit 20 may here also determine if the hand movement is to be converted into robot movement based on an analysis of a voice command detected by the voice detecting unit 24.
The invention has a number of advantages. Two arms can be manipulated at the same time, and it takes less time to teach arm movement paths in multi-arm applications, which saves programming time. The quality of the achieved programming result can also be improved.
It is also possible to detect the movement and positions of some body joints of a human, such as those of a pair of arms. Such joints may be the hand, wrist, elbow and shoulder. These may then be converted into the movements and positions of corresponding robot joints.
While the invention has been described in connection with what are presently considered to be the most practical and preferred embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary is intended to cover various modifications and equivalent arrangements.
Therefore the present invention is only to be limited by the following claims.
Claims (13)
1. A method for determining the synchronised movement of tools of a group of robots comprising the steps of: - detecting (30) the movement of a group of hands in a series of synchronised motions, where the group comprises a first and a second hand, - converting (38) the detected hand movement to synchronised robot tool movement, said robot tool movement being provided for the group of robots, and - providing (40), for each robot, the robot tool movement as at least a section of a tool movement path along which this robot (26) is to move a tool.
2. The method according to claim 1, wherein the step of detecting the movement comprises detecting the positions of the group of hands and the step of converting comprises converting the positions of the group of hands to corresponding positions of the robot tools, where the corresponding robot tool positions have the same position in relation to each other as the corresponding hand positions.
3. The method according to any previous claim, wherein the step of providing comprises recording the robot tool movements.
4. The method according to claim 1 or 2, wherein the step of providing comprises controlling (44) the group of robots (26) to move their tools along at least said sections of the corresponding tool movement paths.
5. The method according to any previous claim, wherein some hand movements are control commands and further comprising the steps of analysing (32) the detected hand movements, determining (34) if the detected hand movements comprise a command, and performing (36) a command corresponding to a hand movement in case a hand movement is determined to be a command and otherwise performing the step of converting (38) the detected hand movement to robot tool movement.
6. The method according to any previous claim, further comprising detecting a voice command and performing the step of converting the hand movement into robot tool movement based on an analysis of the voice command.
7. A device (10) for determining the movement of tools of a group of robots, said group comprising a first and a second robot, the device comprising: at least one sensor (12) detecting the movement of a group of hands in a series of synchronised motions, where the group comprises a first and a second hand, a converting unit (14) configured to convert the detected hand movement to synchronised robot tool movement, said robot tool movement being provided for the group of robots, and a robot tool movement providing unit (16) configured to provide, for each robot, the robot tool movement as at least a section of a robot movement path along which this robot is to move a tool.
8. The device for determining the movement of tools according to claim 7, wherein the detecting of movement of hands comprises detecting the positions of the group of hands and the converting comprises converting the positions of the group of hands to corresponding positions of the robot tools, where the corresponding robot tool positions have the same position in relation to each other as the corresponding hand positions.
9. The device according to claim 7 or 8, wherein the robot tool movement providing unit is configured to record the robot tool movements.
10. The device for determining the movement of tools according to any of claims 7 - 9, further comprising a robot control unit (22) configured to control the group of robots to move their tools along at least said sections of the corresponding robot movement paths.
11. The device for determining the movement of tools according to any of claims 7 - 10, wherein some hand movements are control commands and further comprising a command unit (18) configured to perform commands and an analysing unit (20) configured to analyse the detected hand movements, determine if the detected hand movements comprise a command and order the command unit (18) to perform a command corresponding to the hand movement in case the hand movement is determined to be a command and otherwise order the converting unit (14) to convert the detected hand movements to robot tool movements.
12. The device for determining the movement of tools according to claim 11, further comprising a voice detecting unit (24) configured to detect a voice command and supply it to the analysing unit in order to determine if detected hand movement is to be converted to robot tool movement also based on an analysis of the voice command.
13. A computer program product for determining the synchronised movement of tools of a group of robots, the computer program product comprising a data carrier (46) with computer program code (48) which, when run in a device (10) for determining the movement of tools of a group of robots, causes the device to: - obtain movement data of a group of hands in a series of synchronised motions, where the group comprises a first and a second hand, - convert the detected hand movement to synchronised robot tool movement, said robot tool movement being provided for the group of robots, and - provide, for each robot, the robot tool movement as at least a section of a tool movement path along which this robot (26, 27) is to move a tool.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SE1100740A SE1100740A1 (sv) | 2011-10-06 | 2011-10-06 | Use of 3D sensors for teaching multi-arm coordination |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SE1100740A SE1100740A1 (sv) | 2011-10-06 | 2011-10-06 | Use of 3D sensors for teaching multi-arm coordination |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| SE1100740A1 (sv) | 2011-10-10 |
Family
ID=44840582
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| SE1100740A SE1100740A1 (sv) | Use of 3D sensors for teaching multi-arm coordination | 2011-10-06 | 2011-10-06 |
Country Status (1)
| Country | Link |
|---|---|
| SE (1) | SE1100740A1 (sv) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111055287A (zh) * | 2020-01-13 | 2020-04-24 | 广州启帆工业机器人有限公司 | Method, system, device and storage medium for dual-robot cooperative synchronization |
| CN111055287B (zh) * | 2020-01-13 | 2021-06-08 | 广州机械科学研究院有限公司 | Method, system, device and storage medium for dual-robot cooperative synchronization |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6314134B2 (ja) | User interface for robot training | |
| CN110561450B (zh) | Motion-capture-based offline example learning system and method for robot assembly | |
| EP3272473B1 (en) | Teaching device and method for generating control information | |
| CN110561430B (zh) | Robot assembly trajectory optimization method and device for offline example learning | |
| Fritsche et al. | First-person tele-operation of a humanoid robot | |
| JP2011110620A (ja) | Method for controlling robot motion and robot system | |
| WO2011065035A1 (ja) | Method for creating robot teaching data and robot teaching system | |
| JP2018024082A (ja) | Multi-axis motion control device, robot arm system, method for controlling movement of a robot arm system, and method for controlling movement of a multi-axis motion drive device | |
| US20210200311A1 (en) | Proxy controller suit with optional dual range kinematics | |
| CN113119077A (zh) | Handheld teaching device and teaching method for an industrial robot | |
| JP2018015863A (ja) | Robot system, teaching data generation system and teaching data generation method | |
| JPWO2009096408A1 (ja) | Articulated structure teaching device | |
| CN104827474A (zh) | Intelligent programming method and auxiliary device for a virtual teaching robot that learns from humans | |
| CN205068294U (zh) | Robot human-machine interaction device | |
| Lopez et al. | Taichi algorithm: Human-like arm data generation applied on non-anthropomorphic robotic manipulators for demonstration | |
| Lee et al. | Reinforcement learning-based virtual fixtures for teleoperation of hydraulic construction machine | |
| US20230278211A1 (en) | Robot Teaching Device and Work Teaching Method | |
| Meier et al. | Synchronized multimodal recording of a table setting dataset | |
| Grasshoff et al. | 7dof hand and arm tracking for teleoperation of anthropomorphic robots | |
| CN120552080B (zh) | Human-machine interaction system and method for robot training and data collection | |
| CN119871445B (zh) | Teleoperation control system for a humanoid robot | |
| CN119927906B (zh) | Interaction method and device for coordinated upper-limb control of a humanoid robot | |
| SE1100740A1 (sv) | Use of 3D sensors for teaching multi-arm coordination | |
| Guan et al. | On semi-autonomous robotic telemanipulation employing electromyography based motion decoding and potential fields | |
| US20250214237A1 (en) | Robot Teaching Method and Device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| NAV | Patent application has lapsed |