
WO2011107278A1 - Method and device for controlling a robot - Google Patents

Method and device for controlling a robot

Info

Publication number
WO2011107278A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
governor
actuators
sensors
signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2011/001045
Other languages
German (de)
English (en)
Inventor
Marcel Reese
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2011107278A1 publication Critical patent/WO2011107278A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J3/00Manipulators of leader-follower type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
    • B25J3/04Manipulators of leader-follower type, i.e. both controlling unit and controlled unit perform corresponding spatial movements involving servo mechanisms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0006Exoskeletons, i.e. resembling a human figure
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present invention relates to the control of a governor by a user.
  • a corresponding user can be a human or another living being.
  • the governor to be controlled may be a real machine, a virtual machine, or some other virtual creature.
  • a preferred machine is a real or virtual robot.
  • This invention relates to the field of motion simulation.
  • Simulators are known, for example, as flight simulators for pilot training or for use in gaming halls. Different movements are applied to a human body while displaying a simulated environment.
  • Exoskeletons are also known. They are similar to orthoses that support individual limbs and may be suitable, for example, for a hand or a knee joint.
  • Such an exoskeleton is described for example in DE 10 2007 035 401 A1.
  • Exoskeletons can support or enhance movements of the wearer by actively driving joints on the exoskeleton with servomotors. These are driven by sensor signals that are a measure of the wearer's movements.
  • Exoskeletons, which are a type of robotic suit, are known from the Raytheon Company (www.raytheon.com, last accessed June 2010). These are essentially a wearable robot that enhances the strength, endurance, and agility of the user. They include a combination of sensors, actuators, and controllers, allowing the user to carry a human on his back or lift heavy loads hundreds of times without getting tired. At the same time, the suit is agile enough that its wearer can play football, climb a staircase, and the like.
  • the movement of the robot is in the said article by means of a
  • The object of the invention is to allow the control of a governor, such as a machine or a virtual being, in which the governor follows essential movements that are specified by a user.
  • For this purpose, signals from user sensors are detected and evaluated; these signals are a measure of positions, postures, and/or movements of individual body parts of the user and/or of the forces or torques acting on them.
  • Such sensors may, for example, be in direct or indirect contact with the user's body and be designed as strain gauges, electrical sensors for muscle impulses, or the like.
  • Sensors are also possible which are located at some distance from the user and detect his positions and/or movements optically, acoustically, capacitively, or the like.
  • The user sensor signals are used to control governor actuators that accordingly bring the governor into a predetermined position or move it.
  • A governor can in particular be a real machine, a virtual machine, or another virtual being; a preferred machine is a real or virtual robot. Such a position may be, for example, sitting, lying, or the like. A movement can be walking, running, etc.
  • Appropriate actuators may include electric motors, but also hydraulic or pneumatic elements.
  • The user sensor signals can also be detected, processed, and used to control user actuators. This serves in particular, when an exoskeleton is used, to reduce the weight and dynamic forces exerted by the exoskeleton on the user and perceived by him.
  • Governor sensors detect the positions and movements of the governor and/or the forces, torques, or deformations acting on it from outside or caused by itself.
  • These sensors can basically be designed and arranged similarly to the user sensors. Their signals are detected and processed in such a way that they serve to control user actuators that influence the user or individual parts of his body accordingly. Such an influence can, on the one hand, cause a posture or movement by applying forces and torques. It is also possible to exert forces, torques, or deformations on the user that are comparable to the haptic feedback felt when touching an object.
  • The governor sensor signals can also be recorded, processed, and used to control governor actuators. This serves in particular to ensure that the user does not feel the weight and the dynamic forces of the governor.
  • The user sensors detect the postures, positions, movements, forces, and/or torques of at least one body part, such as a leg, an arm, a hand, a foot, or the like, and a corresponding part of the governor, i.e. a leg, arm, hand, foot, or the like, is moved accordingly.
  • the controlling body part and the controlled part may be similar, such as arm-arm, leg-leg and the like.
  • However, it is also conceivable that they are dissimilar, so that, for example, a user's hand controls a governor's leg.
  • the use of an exoskeleton is particularly advantageous if suitable actuators and / or sensors are contained therein. It is particularly advantageous if the user and the governor wear or use identical exoskeletons.
  • A suitable motion simulator is provided in one embodiment of the invention. This may, for example, be designed as (a) a gimbal in conjunction with a translation unit, (b) a Stewart motion platform, (c) a multi-axis industrial robot, or (d) a spherical motion generator.
  • If the governor is designed as a real robot or other real machine, various teleoperations are possible.
  • the user can control a robot, for example, in a life-threatening environment, such as in a radioactive space, in a combat mission or the like.
  • The task of such a system is therefore to broaden the applicability of real robots to areas and problems in which they cannot act autonomously or semi-autonomously, or can do so only to a limited extent. This includes, for example, locomotion of a robot in desert sand, swamps, forests, complex situations in buildings, interaction with sensitive organisms, etc.
  • Complex movements such as
  • If the robot transmits its state, such as movements, forces, torques, etc., to the user via said sensors and actuators, the user can adequately control the robot through his own body movements, possibly assisted by an exoskeleton.
  • Information can be obtained by the user on the basis of optical, acoustic or other impressions, which are recorded by means of suitable sensors in the area of the robot and appropriately processed and forwarded to the user.
  • the master robot is preferably designed as an exoskeleton for this purpose.
  • the number of degrees of freedom of the robots may generally be different.
  • The more realistically the user experiences the moving state of the slave and the slave's environment at the master, the greater the "transparency" of the teleoperative connection; this includes the type of control and communication involved. Bilateral teleoperation also allows multiple users to interact haptically with one another, or multiple masters and a shared slave to interact with the environment. In particular, the slave and the slave's environment can be simulated, so that tasks can be trained virtually. It is also important to isolate the user from the immediate (mass and dynamic) properties of the master; the body weight of the user, or of parts of the user, should also be borne by the robots. Both the (partial) isolation of the user from master and slave properties and the (partial) removal of his body weight by the robot are established by so-called gravity compensation (a minimal sketch follows below).
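  • The following is a purely illustrative sketch of such gravity compensation and is not part of the patent disclosure: a planar two-link arm model in which the gravity torques are computed and added to the commanded torque, so that the wearer does not have to carry the structure's own weight. The link masses, lengths, and function names are assumptions chosen only for clarity.

```python
import numpy as np

def gravity_torques(q, m=(2.0, 1.5), l=(0.4, 0.3), g=9.81):
    """Gravity torques of a planar two-link arm (illustrative model only).

    q: joint angles [rad], link 1 measured from the horizontal, link 2 relative to link 1.
    m: link masses [kg]; l: link lengths [m]; centres of mass assumed at the link midpoints.
    """
    q1, q2 = q
    m1, m2 = m
    l1, l2 = l
    # Joint 1 carries both links; joint 2 carries only link 2.
    tau1 = (m1 * l1 / 2 + m2 * l1) * g * np.cos(q1) + m2 * l2 / 2 * g * np.cos(q1 + q2)
    tau2 = m2 * l2 / 2 * g * np.cos(q1 + q2)
    return np.array([tau1, tau2])

def compensated_command(tau_task, q):
    """Add gravity-cancelling torques so the wearer does not feel the arm's own weight."""
    return tau_task + gravity_torques(q)

# Example: holding still (zero task torque) at 30/45 degrees still requires these torques.
print(compensated_command(np.zeros(2), np.radians([30.0, 45.0])))
```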
  • Teleoperation methods commonly also use scaling of distances and forces. This is especially important for non-identical master and slave robots. It means that the user can experience larger or smaller forces and different distances (including angular distances) at the master than at the slave.
  • Position-force control means that the actual position of the master is transmitted to the slave, which uses it as a setpoint to regulate its own position accordingly.
  • The forces and/or torques measured at the slave are then transmitted to the master, where they are used as desired forces and applied to the user by actuators.
  • These methods can be realized with actuator sensors alone (angle, angular velocity, torque) or with additional force and torque sensors; a minimal control-loop sketch follows below.
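  • A minimal sketch of the position-force scheme just described, reduced to a single joint with idealized dynamics; the class name, gains, and scaling factors are illustrative assumptions and not part of the disclosure.

```python
class PositionForceTeleop:
    """Single-joint position-force teleoperation sketch (illustrative only).

    Master position -> scaled position setpoint for the slave;
    slave contact force -> scaled force setpoint applied to the user at the master.
    """

    def __init__(self, kp=50.0, kd=5.0, pos_scale=2.0, force_scale=0.5):
        self.kp, self.kd = kp, kd          # slave position-controller gains
        self.pos_scale = pos_scale         # master-to-slave distance scaling
        self.force_scale = force_scale     # slave-to-master force scaling

    def slave_torque(self, q_master, q_slave, dq_slave):
        """PD position control driving the slave towards the scaled master position."""
        q_des = self.pos_scale * q_master
        return self.kp * (q_des - q_slave) - self.kd * dq_slave

    def master_force(self, f_contact_slave):
        """Force measured at the slave, scaled and fed back to the user via the master actuators."""
        return self.force_scale * f_contact_slave

# One control step: master at 0.2 rad, slave lagging at 0.3 rad, 5 N contact force at the slave.
loop = PositionForceTeleop()
print(loop.slave_torque(q_master=0.2, q_slave=0.3, dq_slave=0.0))
print(loop.master_force(5.0))
```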
  • The invention becomes easy to apply once the user controls reality, or virtual reality, through the governor according to its laws. That means he has to get used to it.
  • a task of the user would then be to regulate, for example, the balance of the governor.
  • The governor will usually have a different geometry and different body dynamics than the user. If, however, the body dynamics of the governor are transferred to the user through the use of a motion simulator, the user notices, for example by a correspondingly accelerated forward movement of the head, that the governor is beginning to fall over. He must compensate for this with appropriate measures, such as shifting the governor's weight using his own legs, which control the legs of the governor, until the acceleration stops.
  • the user can never fall over himself in the exoskeleton in the movement simulator, as long as his governor does not fall over.
  • dynamic and static forces of the governor can be passed on to the user.
  • Any linear or non-linear transformation between the parameter spaces of the governor and the user can be made. This can be used to speed up habituation or to improve adaptation (an illustrative mapping sketch follows below).
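  • As a hedged illustration of such a transformation, the sketch below maps user joint angles to governor joint angles through a linear map followed by a smooth non-linear saturation; the scaling matrix and the joint limit are assumptions for illustration only, not the mapping actually claimed.

```python
import numpy as np

def user_to_governor(q_user, A=None, limit=np.pi / 2):
    """Map user joint angles to governor joint angles (illustrative transformation only).

    A linear map A (e.g. re-ordering or scaling joints) is followed by a smooth
    tanh saturation that keeps the governor inside an assumed joint limit.
    """
    q_user = np.asarray(q_user, dtype=float)
    if A is None:
        A = 1.5 * np.eye(len(q_user))   # assumed scaling: the governor moves 1.5x further
    return limit * np.tanh(A @ q_user / limit)

print(user_to_governor([0.1, 0.4, 1.2]))
```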
  • the regulation of the exoskeleton and the motion simulator can be effected by applying proven concepts and techniques of machine and robot kinematics as well as teleoperation.
  • the simulation of virtual worlds based on physical models has found widespread use in computer games, scientific simulation and film.
  • the same hardware and software can be used for efficient simulation of the necessary control signals and also for image calculation.
  • The governor can also be designed as a virtual being, such as a virtual robot or virtual creature in a video game or the like, and can serve the user to practice certain movements as a function of different environments, which can also be represented virtually.
  • Another embodiment of the invention allows the user to experience predetermined influences. These can be movement sequences or other haptic impressions.
  • The user can thus experience a kind of haptic film, which can be supplemented accordingly by means of suitable output means for optical, acoustic, and other signals.
  • In the interaction between the user and the robot, time delays may occur, which can be caused in particular by the response times of sensors and actuators, the computing speed of control units, and the transit times of the various signals.
  • Time delays can be largely compensated if time derivatives of the quantities measured by the sensors are evaluated.
  • To deal with contact situations, sensors such as ultrasound transducers are provided, which quickly detect the distance to an object in the manner of a parking aid. By using this data for contact prediction, the impression of a contact and realistic feedback can be generated for the user in good time (a minimal sketch of such prediction follows below).
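  • A minimal sketch of such delay compensation and contact prediction, assuming a simple first-order extrapolation and an ultrasonic distance reading; the thresholds and helper names are illustrative assumptions.

```python
def extrapolate(value, derivative, delay):
    """First-order prediction of a sensed quantity across a known transmission delay."""
    return value + derivative * delay

def contact_expected(distance, approach_speed, delay, horizon=0.05):
    """Predict whether contact will occur within `horizon` seconds despite the delay.

    distance       : current distance to the object [m] (e.g. from an ultrasonic transducer)
    approach_speed : closing speed towards the object [m/s]
    delay          : total signal and processing delay [s]
    """
    if approach_speed <= 0.0:
        return False
    time_to_contact = distance / approach_speed
    return time_to_contact <= delay + horizon

# Joint angle extrapolated across a 40 ms delay; contact predicted 2 cm away at 0.5 m/s.
print(extrapolate(0.30, 0.5, 0.04))
print(contact_expected(distance=0.02, approach_speed=0.5, delay=0.04))
```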
  • Movement in partial or complete weightlessness is simulated.
  • This situation is very similar to weightlessness in space, but differs in the strong damping of any movement by the fluid.
  • The actuators that move the user's body or individual parts of it are driven accordingly in order to support the movements.
  • Underwater motion with scaled damping, motion in space without damping, or the like can thus be simulated.
  • the transparency of the teleoperation with real or virtual machines, real or virtual robots or virtual beings in situations in space or under water or the like can thus be improved.
  • a further embodiment of the invention provides that the user in the master (user side) influences another user in the slave (governor side). Both users are preferably humans.
  • the master is designed here as an exoskeleton with rototranslator and the slave as a freely movable exoskeleton.
  • the master and slave are interconnected by a bilateral haptic connection of all or some body parts and other signals, such as acoustic or the like.
  • The degree of influence of the master by the slave, and vice versa, is controlled by the control unit as specified by the master and/or slave side. This makes it possible to accompany a user of a freely movable exoskeleton on the slave side from a distance and to support, or take over in whole or in part, the control of the exoskeleton.
  • This embodiment is particularly interesting for training purposes in sports exercises or other tasks with high physical share.
  • A further embodiment of the invention provides that several users interact with the same governor in their own user units via a common control unit. All users thus experience the state of the governor and can generally influence it cooperatively, but to different degrees. The users can generally influence different functions and body parts of the governor. The users can thus share the tasks of controlling the governor, cope with them together, or even participate in an observing and advisory role. This embodiment is particularly interesting for training purposes in sports exercises or other tasks with a high physical share. Further features and advantages of the invention are explained below with reference to preferred embodiments. The figures show:
  • Fig. 1 A system for controlling a robot by means of an exoskeleton
  • Fig. 2 is a side view of a user in the exoskeleton
  • FIG. 1 symbolically illustrates a system that allows a user 10 to control a robot 110.
  • the user 10 is preferably a human with a corresponding body comprising a trunk 12 and arms 14, legs 16, feet 17 and a head 18.
  • To the user's body, a user exoskeleton 20 is attached, which is drawn hatched. It is located in particular on the arms 14, on the trunk 12, on the back 13 (see FIG. 2), on the legs 16, and on the feet 17, which in the preferred embodiment do not touch the ground but rest only on the user exoskeleton 20.
  • This is designed such that it partially or completely encloses the body parts mentioned in such a way that it can detect and also influence movements of the body.
  • The user wears a helmet 22, which in the preferred embodiment is part of the exoskeleton 20 and in which a visual display unit 24 and an acoustic reproduction unit 26 are included.
  • Display unit 24 generates different displays for the right and left eyes, such as LCD displays, projection devices, or the like, and thus can provide a stereo effect.
  • The acoustic reproduction unit 26 preferably comprises two headphones or loudspeakers, which may provide the user 10 with surround sound. Furthermore, a microphone 28 is provided.
  • The robot 110 is here designed as a humanoid robot consisting essentially of a robot exoskeleton with corresponding body parts, such as robot trunk 112, robot arms 114, robot legs 116, robot feet 117, and robot head 118.
  • In this preferred embodiment, the robot exoskeleton is largely mechanically identical to the user exoskeleton 20.
  • The robot exoskeleton is connected to a work space which is subdivided into various chambers, which can roughly be designated as the trunk work space 130, the arm work space 132, the leg work space 134, and the head work space 136.
  • In these work spaces, operating materials, aggregates, tools, and means for control, power supply, and the like can be accommodated.
  • The robot head 118 includes a camera system 122, which preferably consists of two individual cameras mounted at the locations where the eyes are located in a human.
  • Furthermore, a loudspeaker 124 is provided, which can emit acoustic signals into the environment of the robot 110.
  • In the ear area there is a microphone system 126 that can pick up sounds from the environment. It is particularly advantageous if the microphone system is designed as a dummy-head stereo microphone.
  • The nose area contains olfactory sensors 128 that can pick up olfactory signals from the environment. These sensors 128 can work chemically, optically, and/or the like.
  • the user 10 and the robot 110 are connected to one another via an electronic control, which is indicated only symbolically in FIGS. 1, 2 and in particular contains an electronic control unit 30.
  • Via a first sensor line 32, the control unit 30 receives signals from user sensors NS. These include, in particular, the microphone 28 and user body sensors (not shown), which can pick up movements, spatial positions, forces, and/or torques of the user 10 or his body parts.
  • In addition to the trunk 12, back 13, arms 14, legs 16, feet 17, and head 18, these body parts also include other parts of the body, in particular the hands, the fingers, and the like.
  • These user body sensors may be at least partially contained in the exoskeleton 20.
  • The control unit 30 also receives signals via a second sensor line 33 from sensors provided on the robot 110 side. These are indicated in Fig. 1 with RS and include in particular the camera system 122, the microphone system 126, the odor sensors 128, and robot body sensors (not shown) that record movements of the robot 110 and can be arranged and configured similarly to the user body sensors described above.
  • the electronic control unit 30 outputs various signals.
  • These include, first, all signals that drive user output units ND on the part of the user 10 via a first reproduction line 34, such as the optical display unit 24 or the acoustic reproduction unit 26.
  • Furthermore, signals that can control user actuators NA are output via a first control line 36. These may be at least partially part of the exoskeleton 20 and may be designed so that they can exert pressure and deformations on different parts of the user's body. They may also be arranged and designed so that movements of user body parts can be initiated or inhibited. Servomotors, hydraulic or pneumatic elements, or the like are conceivable, for example.
  • The use of hydraulic and pneumatic elements offers the possibility of tempering the liquids or gases used and thereby additionally giving the user temperature feedback, possibly also via other means. Since in the preferred embodiment the helmet 22 is part of the exoskeleton 20, the posture of the head 18 can also be detected and influenced.
  • the electronic control unit 30 outputs various signals. These include the signals output via a second reproduction line 35 for the robot output units RD, to which in particular the loudspeaker system 124 belongs. Further optical and / or acoustic output units are conceivable.
  • control signals for robot actuators RA are output via a second control line 37. This includes in particular positioning elements within the robot 110, which can cause or inhibit movements of robot body parts.
  • Fig. 2 shows symbolically a side view of the user 10 and other elements of the system. It can clearly be seen that in the preferred embodiment the user's body parts are surrounded by the exoskeleton 20, in particular at the back and on the outer sides.
  • the electronic control unit 30 receives various sensor signals from the exoskeleton 20 and also outputs various control signals to associated actuators, so that the user body can be influenced accordingly.
  • A user camera 38 and a user motion sensor 39 are connected to the control unit 30. These record optical and/or acoustic signals from the environment of the user 10. In the preferred embodiment, the camera 38 is arranged and configured so that it can capture the facial expressions of the user.
  • The exemplary embodiment serves to enable a substantial part of the body of user 10 to be used to control the robot 110, by arranging sensors directly or indirectly on the user's body which can detect as many movements or other reactions as possible. This is supplemented by further sensors, such as camera 38 and motion sensor 39, which are arranged in the area of the user 10 and also detect his movements. In addition, further signals from the area of the user 10 can be detected, such as by means of the microphone 28, which can be reproduced in a corresponding manner on the part of the robot 110.
  • The embodiment serves not only for the control of the robot 110 by the user 10; conversely, signals on the part of the robot 110 are also supplied to the user in a suitable manner.
  • In particular, this is effected via actuators, which may be part of the exoskeleton 20.
  • This is supplemented by acoustic, optical, olfactory and / or gustatory signals, which are detected by suitable sensors in the area of the robot 110 and fed to the user 10 via corresponding output means.
  • The arrangement of FIGS. 1 and 2 alone cannot give the user 10 an optimal impression of the position, speed, and acceleration of the robot 110.
  • For this purpose, a motion simulator is necessary.
  • Fig. 3 shows a preferred example of a suitable motion simulator.
  • This is a rototranslator 40, which contains a first part 40a and a second part 40b.
  • The first part 40a is a translation unit that allows three degrees of freedom along the axes x, y, and z due to its Cartesian structure.
  • the second part 40b is a rotation unit, which also allows three degrees of freedom and is designed here as a gimbal.
  • In the preferred embodiment, to avoid the so-called gimbal-lock effect, it has four axes 42, 44, 46, and 48, on which rotatable elements 43, 45, 47, and 49 are mounted.
  • The elements 43, 45, and 47 have a ring-like shape, and the fourth element 49 is almost semi-annular.
  • On the underside there is an attachment point 50, on which the user 10 can be arranged. This can be done, for example, by attaching the exoskeleton 20 or a user capsule at point 50, or in a similar manner.
  • Fig. 4 shows a preferred embodiment of such attachment.
  • The exoskeleton 20, to which the user 10 is firmly connected, is fastened at the back via a fastening element 52 and a fastening arm 54 to the point 50.
  • The fastening element 52 preferably includes a high-frequency rotation-translation unit, which is suitably controlled by the control unit 30 in order to move the user 10 quickly; in this way, for example, shaking as in a car ride on a bumpy track can be simulated.
  • the rototranslator 40 is likewise controlled by the control unit 30.
  • a multiplicity of actuators (not illustrated here) are provided which allow movements in all possible degrees of freedom both in the translation unit 40a and in the rotation unit 40b.
  • If the pose of the robot 110 can no longer be transmitted one-to-one, the actuators of the rototranslator 40 are controlled by the control unit 30 such that a motion-cueing process is realized. This makes it possible to transmit the pose of the robot 110 to the user 10 without leaving the working space of the rototranslator 40, while nevertheless generating a realistic overall impression for the user 10 (a simple washout-filter sketch of this idea follows below).
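  • Motion cueing is commonly realized with washout filters that reproduce the onset of accelerations while slowly returning the simulator to the centre of its limited workspace. The high-pass filter below is only a minimal sketch of that general idea, not the scheme claimed here; the cut-off frequency and sampling time are assumptions.

```python
import numpy as np

def washout_highpass(accel_robot, dt=0.001, cutoff_hz=0.5):
    """First-order high-pass 'washout' of the robot acceleration signal.

    Fast transients are passed to the rototranslator so the user feels the onset of
    motion; low-frequency content is washed out so the platform can drift back
    towards the centre of its workspace.
    """
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    out = np.zeros_like(accel_robot)
    for k in range(1, len(accel_robot)):
        out[k] = alpha * (out[k - 1] + accel_robot[k] - accel_robot[k - 1])
    return out

# A sustained 1 m/s^2 robot acceleration is felt briefly at its onset, then washed out.
accel = np.concatenate([np.zeros(100), np.ones(2900)])
felt = washout_highpass(accel)
print(felt[99], felt[100], felt[2899])
```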
  • In the field of view of the user 10, a second display unit 56 is arranged, which can be used in addition to or instead of the first display unit 24.
  • The display unit 56 can replace the first display unit 24 in particular in those cases in which the facial expression of the user 10 is to be detected as completely as possible by means of the camera 38 (Fig. 2).
  • The display unit 56 may be curved, spherical, or flat and configured such that reproduction of 2- and/or 3-dimensional images is possible.
  • Loudspeakers (not shown here) for surround sound can also be provided.
  • the user 10 and the robot 110 are at a local distance from each other, such as in different rooms, different buildings, or the like. It is particularly advantageous if at least some of the lines 32 - 37 are designed wirelessly. For this, different technologies are suitable, such as transmission via
  • The lines 32-37, which are shown in the figures as simple lines, can be designed in many ways. It is important that signals from sensors and signals to actuators and display units are transmitted in a suitable manner via suitable channels. It is also possible that the control unit 30, shown in the figures as a central unit, is designed in a decentralized manner, so that, for example, parts of the control can take place directly on the master, on the slave, on individual joints, and/or anywhere else. For this purpose, lines within the control system can be wireless and/or wired.
  • For the most accurate control of the robot 110, a variety of body sensors is provided on the user 10 and in his surroundings. These sensors can be designed in various ways. For example, measurements of angles and/or torques between individual body parts and of their time derivatives are possible.
  • the body sensors may be located at various locations, such as directly at the user 10, within the exoskeleton 20, and / or in between.
  • A plurality of actuators is also provided there. These can act on manipulated variables such as angles, positions, forces, or torques and their time derivatives. It is particularly advantageous if at least one of the actuators is controlled by means of a closed-loop control process.
  • For this purpose, corresponding sensor signals of the line 32 are evaluated as actual values in the control unit 30, whereupon the control of the associated actuators is adapted to a predetermined setpoint (a minimal sketch of such a loop follows below).
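  • A hedged sketch of such a closed-loop control process: the control unit compares the actual value received via the sensor line with the predetermined setpoint and adjusts the actuator command accordingly. The PD structure and the gains are assumptions for illustration only.

```python
class ActuatorController:
    """Simple PD setpoint controller for a single actuator (illustrative only)."""

    def __init__(self, kp=80.0, kd=8.0):
        self.kp, self.kd = kp, kd
        self.prev_error = None

    def update(self, setpoint, actual, dt):
        """Compute the actuator command from the setpoint and the measured actual value."""
        error = setpoint - actual
        d_error = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * d_error

# One control step: desired joint angle 0.5 rad, measured 0.42 rad, 1 ms sample time.
ctrl = ActuatorController()
print(ctrl.update(setpoint=0.5, actual=0.42, dt=0.001))
```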
  • The control of the rototranslator 40 can likewise be based on a closed-loop control process, in which actual-value signals from corresponding sensors are received in the control unit 30 and evaluated appropriately, so that a desired spatial position and orientation can be set.
  • In addition, time derivatives of such values can be determined.
  • A plurality of sensors is likewise arranged on the body of the robot 110 and in its vicinity. This can be done in a similar manner as for the user 10. These sensors also describe the current spatial position and orientation and, where appropriate, their time derivatives.
  • the control unit 30 controls the actuators and display units based on the received sensor signals such that a bilateral haptic teleoperability is made possible.
  • the actuators on the user side and the robot can be controlled such that in addition also a
  • time delays may be caused in particular by the response times of sensors and actuators, computing speed of the control unit 30 and transit times of the various signals.
  • Time delays can be largely compensated if time derivatives of the quantities measured by the sensors are evaluated and values are estimated in the future.
  • Since such an estimation can lead to problems, especially when the robot 110 comes into contact with an object in its environment, a development of the embodiment provides ultrasonic transducers which can quickly detect the distance to the object in the manner of a parking aid.
  • the described embodiments allow the user 10 and the robot 110 to assume the same postures as standing, sitting, lying, headstand, as well as performing joint movements such as walking, running, jumping, somersaulting, etc.
  • the user 10 gets one haptic feedback. Because if he moves the robot 110 and this with any of his
  • The preferred embodiment also has the advantage of achieving complete, unlimited mobility of the governor in six degrees of freedom (6-DOF mobility) and of providing the user at any time with complete, realistic force feedback.
  • the user 10 may control the robot 110, for example, in a life-threatening environment such as a radioactive space, a combat mission or the like.
  • the task of such a system is thus to extend the applicability of real robots to areas and problems in which they can not act autonomously or semi-autonomously. This includes, for example, locomotion of the robot 110 in desert sand, swamps, forests, complex situations in buildings, interaction with sensitive animals, etc.
  • the user exoskeleton 20 and the robot exoskeleton may be different or identical.
  • The slave may also be a humanoid or other robot of general body shape.
  • A memory unit is present which stores predetermined positions and/or movements and/or forces and/or other quantities to be reproduced, either calculated, for example, on the basis of existing 3-dimensional movement sequences, or already carried out by the user 10 or by another user.
  • the embodiments of the invention may also be used to allow multiple users to communicate and interact over a greater distance.
  • one or more additional users are located in the area of the robot 110, each of which likewise carries an exoskeleton or the like and thereby controls a second robot which is located in the area of the user 10.
  • therapeutic measures, gymnastic exercises or the like can be trained.
  • The controlled robots, regardless of the location and number of users, can interact in the same room and communicate with the users.
  • Both or more users each use an exoskeleton or the like, and each controls a governor in a common virtual reality. This allows haptic, acoustic, visual, etc. interaction.
  • the exoskeleton 20 may be attached.
  • the whole can be mounted on a translation unit to obtain a total of six degrees of freedom.
  • A tilting and pivoting bar, mounted on the floor or on the ceiling of a room and on which the exoskeleton is movably mounted, may be sufficient to represent many movements in a realistic manner.
  • For a gimbal, two or three axes may also be sufficient if appropriate restrictions are accepted.
  • The physical robot 110 in a real environment may also be replaced by another governor, such as a virtual model of a robot or of a living being in a virtual environment. This eliminates all associated robot sensors and robot actuators; these are replaced by a corresponding simulation.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to the control of a governor by a user. Such a governor is in particular a robot or a virtual being. The governor is controlled by signals from sensors which are placed on the user or located in his environment and which can detect the positions and movements of the user. Further sensors are located on the governor or in its environment and can detect its position and its movements. The signals from these sensors serve to control the movement sequence on the user's side. The invention is characterized in that the said sensors detect the postures, positions, movements, forces and/or torques of at least one body part of the user and in that parts of the governor can be moved accordingly. The invention can be used, for example, for teleoperations.
PCT/EP2011/001045 2010-03-02 2011-03-02 Procédé et dispositif de commande d'un robot Ceased WO2011107278A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
DE102010010010.2 2010-03-02
DE102010010010 2010-03-02
DE102010010246 2010-03-03
DE102010010246.6 2010-03-03
DE102010023914.3 2010-06-16
DE201010023914 DE102010023914A1 (de) 2010-03-02 2010-06-16 Verfahren und Vorrichtung zur Steuerung eines Statthalters

Publications (1)

Publication Number Publication Date
WO2011107278A1 true WO2011107278A1 (fr) 2011-09-09

Family

ID=44503053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/001045 Ceased WO2011107278A1 (fr) 2010-03-02 2011-03-02 Procédé et dispositif de commande d'un robot

Country Status (2)

Country Link
DE (1) DE102010023914A1 (fr)
WO (1) WO2011107278A1 (fr)


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101317383B1 (ko) * 2011-10-12 2013-10-11 한국과학기술연구원 로봇을 이용한 인지 능력 훈련 장치 및 그 방법
DE102012002786B4 (de) * 2012-02-15 2017-06-01 Festo Ag & Co. Kg Manipulatorsystem und Verfahren zum Betreiben eines Manipulatorsystems
DE102012211190B4 (de) * 2012-06-28 2019-07-18 Deutsches Zentrum für Luft- und Raumfahrt e.V. Bewegungssimulator
CN103257027B (zh) * 2013-05-24 2015-10-21 山东泰山体育器材有限公司 一种体操器材运动仿真测试系统
DE102016007741A1 (de) 2016-06-27 2017-12-28 Marcel Reese Erfindung das Gebiet betreffend von Exoskeletten, humanoiden Robotern und auch deren Verwendung in teleoperativen Anwendungen in virtuellen Welten oder der realen Welt
DE102016117760A1 (de) 2016-09-21 2018-03-22 Andre HERZOG Anordnung für einen Bewegungssimulator in Kombination mit einem stationären Ganzkörperexoskelett, einem Head-Mounted-Display und einem virtuellen Raum
US10821614B2 (en) 2016-11-11 2020-11-03 Sarcos Corp. Clutched joint modules having a quasi-passive elastic actuator for a robotic assembly
US10843330B2 (en) * 2017-12-07 2020-11-24 Sarcos Corp. Resistance-based joint constraint for a master robotic system
WO2024098070A1 (fr) 2022-11-04 2024-05-10 Sarcos Corp. Effecteur terminal robotique ayant des éléments de raidissement dynamiques avec des entretoises élastiques pour conformer une interaction d'objet


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997002520A1 (fr) * 1995-06-30 1997-01-23 Ross-Hime Designs, Inc. Robot manipulateur
JPH09109069A (ja) * 1995-10-13 1997-04-28 Gen Sugano パワード・インテリジェント方法及びユニット
US6007338A (en) * 1997-11-17 1999-12-28 Disney Enterprises, Inc. Roller coaster simulator
JP2007276052A (ja) * 2006-04-06 2007-10-25 Sony Corp 制御システム、記録システム、情報処理装置および方法、プログラム、並びに記録媒体
JP5093498B2 (ja) * 2008-07-09 2012-12-12 花王株式会社 マニプレータシステム

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2344454A (en) 1942-12-07 1944-03-14 Bell Telephone Labor Inc Training device
WO1997009153A1 (fr) * 1995-09-08 1997-03-13 Ross-Hime Designs, Inc. Robot manipulateur
US6016385A (en) * 1997-08-11 2000-01-18 Fanu America Corp Real time remotely controlled robot
US6629896B2 (en) 2001-12-29 2003-10-07 Steven Jones Nimble virtual reality capsule using rotatable drive assembly
GB2400686A (en) * 2003-04-04 2004-10-20 Christopher Charles Box Motion logging and robotic control and display system
DE102007035401A1 (de) 2007-07-26 2009-01-29 Technische Universität Berlin Anordnung für ein Exoskelett und Gliedmaßen-Exoskelett

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Springer Handbook of Robotics", 2008, SPINGER
HASUNUMA H ET AL: "A tele-operated humanoid robot drives a backhoe", 2003 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, PISCATAWAY, NJ, USA; [PROCEEDINGS OF THE IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION],, vol. 3, 14 September 2003 (2003-09-14), pages 2998 - 3004, XP010668435, ISBN: 978-0-7803-7736-3, DOI: DOI:10.1109/ROBOT.2003.1242051 *
HITOSHI HASUNUMA ET AL: "The Tele-operation of the Humanoid Robot-Whole Body Operation for Humanoid Robots in Contact with Environment-", HUMANOID ROBOTS, 2006 6TH IEEE-RAS INTERNATIONAL CONFERENCE ON, IEEE, PI, 1 December 2006 (2006-12-01), pages 333 - 339, XP031053041, ISBN: 978-1-4244-0199-4 *
SHIU HANG IP ET AL.: "Novel 3-DOF Reconfigurable Spherical Motion Generator with Unlimited Workspace", ACRA, 2009, Retrieved from the Internet <URL:http://www.araa.asn.au/acralacra2009/papers/pap145s1.pdf>

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2800908A4 (fr) * 2012-01-06 2015-09-30 Marvel Tech Inc Boucle à double rétroaction interactive électro/hydraulique
US10166680B2 (en) 2015-07-31 2019-01-01 Heinz Hemken Autonomous robot using data captured from a living subject
US10195738B2 (en) 2015-07-31 2019-02-05 Heinz Hemken Data collection from a subject using a sensor apparatus
US20200249654A1 (en) * 2019-02-06 2020-08-06 Cy-Fighter LLC Robotic control via a virtual world simulation
US11762369B2 (en) 2019-02-06 2023-09-19 Sensory Robotics, Inc. Robotic control via a virtual world simulation

Also Published As

Publication number Publication date
DE102010023914A1 (de) 2011-09-08

Similar Documents

Publication Publication Date Title
WO2011107278A1 (fr) Procédé et dispositif de commande d'un robot
AU2007335256B2 (en) Method and apparatus for haptic control
CN104363982B (zh) 上肢康复机器人系统
EP1402503A1 (fr) Simulateur d'articulation programmable avec retour de force et de mouvement
JP5946767B2 (ja) 触覚フィードバックにより特有の動きを模擬する方法および該方法を実行する装置
Ceccarelli et al. Designs and prototypes of mobile robots
Santamato et al. Anywhere is possible: An avatar platform for social telepresence with full perception of physical interaction
CN109213306A (zh) 一种机器人远程控制平台及其设计方法
Karpushkin et al. Structural model of software and hardware platform for the training complex based on a controlled treadmill
JP3716134B2 (ja) 動揺装置の動作指令データ生成方法及び動揺装置
Rodriguez et al. A 3-D hand rehabilitation system using haptic device
EP3587043A1 (fr) Dispositif et procédé d'immersion haptique d'un utilisateur
Endo et al. A finger skill transfer system using a multi-fingered haptic interface robot and a hand motion image
Mizanoor Rahman Grasp rehabilitation of stroke patients through object manipulation with an intelligent power assist robotic system
Houda Human Interaction in a large workspace parallel robot platform with a virtual environment
CN116901066B (zh) 基于场景信息和神经调制机制驱动的机器人社会行为同步控制方法
Tri Dung et al. Collaboration Between Human–Robot Interaction Based on CDPR in a Virtual Reality Game Environment
Ranasinghe et al. An optimal state dependent haptic guidance controller via a hard rein
Tanaka et al. Analysis and Modeling of Human Force Perception Properties during the Operation of a Driving Interface System Using Limbs
Osawa et al. Enhancing empathy toward an agent by immersive learning
Kwon et al. 7-DOF horseback riding simulator based on a crank mechanism with variable radius and its inverse kinematics solution
Otis et al. Cartesian control of a cable-driven haptic mechanism
DE102005024667A1 (de) Haptische Schnittstelle
Ogrinc Information acquisition in physical human-machine interaction
Munih et al. Analysis and synthesis of human and machine motion at UL FE

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11709868

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 11709868

Country of ref document: EP

Kind code of ref document: A1