
WO2022140151A1 - Constrained manipulation of objects - Google Patents

Constrained manipulation of objects

Info

Publication number
WO2022140151A1
Authority
WO
WIPO (PCT)
Prior art keywords
task
parameter
measured
end effector
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2021/063776
Other languages
English (en)
Inventor
Navid AGHASADEGHI
Alfred Anthony RIZZI
Gina Fay
Robert Eugene PAOLINI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boston Dynamics Inc
Original Assignee
Boston Dynamics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boston Dynamics Inc filed Critical Boston Dynamics Inc
Priority to EP21844471.9A priority Critical patent/EP4263151A1/fr
Priority to CN202180091621.3A priority patent/CN116745076A/zh
Publication of WO2022140151A1 publication Critical patent/WO2022140151A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1633Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/4155Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D57/00Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39346Workspace impedance control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40062Door opening
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40269Naturally compliant robot arm
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40517Constraint motion planning, variational dynamic programming

Definitions

  • This disclosure relates to constrained manipulation of objects using a robotic arm.
  • Robotic arms are increasingly being used in constrained or otherwise restricted environments to perform a variety of tasks or functions. These robotic arms often need to efficiently manipulate constrained objects, such as doors or switches, without requiring large computations. As robotic arms become more prevalent, there is a need for arm path planning that quickly determines and executes a path associated with a constrained object.
  • One aspect of the disclosure provides a computer-implemented method.
  • When executed by data processing hardware of a robot, the computer-implemented method causes the data processing hardware to perform operations.
  • the robot includes an articulated arm having an end effector engaged with a constrained object.
  • the operations include receiving a measured task parameter set for the end effector.
  • the measured task parameter set includes position parameters defining a position of the end effector.
  • the operations further include determining, using the measured task parameter set, at least one axis of freedom and at least one constrained axis for the end effector within a workspace.
  • the operations also include assigning a first impedance value to the end effector along the at least one axis of freedom and assigning a second impedance value to the end effector along the at least one constrained axis. Additionally, the operations include instructing the articulated arm to move the end effector along the at least one axis of freedom.
  • determining the at least one axis of freedom and the at least one constrained axis includes determining a task space model for the constrained object using the measured task parameter set.
  • the operations further include storing at least a portion of the measured task parameter set in a task buffer as task parameter records.
  • storing at least a portion of the measured task parameter set includes comparing at least one measured parameter from the measured task parameter set to a recorded parameter of one of the task parameter records of the task buffer and generating a new task parameter record when a difference between the at least one measured parameter and the recorded parameter satisfies a recording threshold.
  • the measured parameter and the recorded parameter each include a respective position parameter and/or a respective velocity parameter.
  • the operations further include evaluating the position parameters of the task parameter records to determine the at least one axis of freedom associated with the task parameter records. In other further embodiments, the operations further include evaluating the task parameter records to determine whether the end effector is engaged with the constrained object.
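The evaluation described above — inferring from recorded positions which Cartesian directions the constrained object permits motion in, then making the arm compliant along those directions and stiff along the others — can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes positions are 3-vectors, uses a singular-value decomposition of position deltas to find the dominant motion direction, and the function names and stiffness values are invented for the example.

```python
import numpy as np

def estimate_axes(positions):
    """Estimate the dominant axis of freedom from recorded end-effector
    positions via a singular-value decomposition of the position deltas."""
    deltas = np.diff(np.asarray(positions, dtype=float), axis=0)
    # The first right-singular vector spans the direction of most observed
    # motion (the axis of freedom); the remaining ones saw little motion.
    _, _, vt = np.linalg.svd(deltas, full_matrices=False)
    return vt[0], vt[1:]

def assign_impedance(free_axis, constrained_axes, k_free=50.0, k_con=800.0):
    """Build a Cartesian stiffness matrix: compliant (low stiffness) along
    the axis of freedom, stiff along the constrained axes."""
    K = k_free * np.outer(free_axis, free_axis)
    for a in constrained_axes:
        K += k_con * np.outer(a, a)
    return K

# Positions swept along a door arc: motion in the x-y plane, z constrained.
pos = [(0.00, 0.00, 0.5), (0.05, 0.01, 0.5), (0.10, 0.03, 0.5), (0.14, 0.06, 0.5)]
axis, constrained = estimate_axes(pos)
K = assign_impedance(axis, constrained)
```

In an impedance control law of the form F = K(x_desired − x_measured), this K means deviations along the axis of freedom produce only small restoring forces, while deviations along the constrained axes are strongly resisted.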
  • the robot includes an articulated arm, data processing hardware in communication with the articulated arm, and memory hardware in communication with the data processing hardware.
  • the articulated arm has an end effector for engaging a constrained object.
  • the memory hardware stores instructions that when executed on the data processing hardware cause the data processing hardware to perform operations.
  • the operations include receiving a measured task parameter set for the end effector.
  • the measured task parameter set includes position parameters defining a position of the end effector.
  • the operations further include determining, using the measured task parameter set, at least one axis of freedom and at least one constrained axis for the end effector within a workspace.
  • the operations also include assigning a first impedance value to the end effector along the at least one axis of freedom and assigning a second impedance value to the end effector along the at least one constrained axis. Additionally, the operations include instructing the articulated arm to move the end effector along the at least one axis of freedom. [0007] Aspects of the disclosure may include one or more of the following optional features.
  • determining the at least one axis of freedom and the at least one constrained axis includes determining a task space model for the constrained object using the measured task parameter set.
  • the operations further include storing at least a portion of the measured task parameter set in a task buffer as task parameter records.
  • storing at least a portion of the measured task parameter set includes comparing at least one measured parameter from the measured task parameter set to a recorded parameter of one of the task parameter records of the task buffer and generating a new task parameter record when a difference between the at least one measured parameter and the recorded parameter satisfies a recording threshold.
  • the measured parameter and the recorded parameter each include a respective position parameter and/or a respective velocity parameter.
  • the operations further include evaluating the position parameters of the task parameter records to determine the at least one axis of freedom associated with the task parameter records. In other further examples, the operations further include evaluating the task parameter records to determine whether the end effector is engaged with the constrained object.
  • the computer program product is encoded on a non-transitory computer readable storage medium connected to a robot.
  • the robot includes an articulated arm having an end effector for engaging a constrained object.
  • the computer program product includes instructions that when executed by a data processing apparatus cause the data processing apparatus to perform operations.
  • the operations include receiving a measured task parameter set for the end effector.
  • the measured task parameter set includes position parameters defining a position of the end effector.
  • the operations further include determining, using the measured task parameter set, at least one axis of freedom and at least one constrained axis for the end effector within a workspace.
  • the operations also include assigning a first impedance value to the end effector along the at least one axis of freedom and assigning a second impedance value to the end effector along the at least one constrained axis. Furthermore, the operations include instructing the articulated arm to move the end effector along the at least one axis of freedom.
  • determining the at least one axis of freedom and the at least one constrained axis includes determining a task space model for the constrained object using the measured task parameter set.
  • the operations further include storing at least a portion of the measured task parameter set in a task buffer as task parameter records.
  • storing at least the portion of the measured task parameter set includes comparing at least one measured parameter from the measured task parameter set to a recorded parameter of one of the task parameter records of the task buffer and generating a new task parameter record when a difference between the at least one measured parameter and the recorded parameter exceeds a recording threshold.
  • the measured parameter and the recorded parameter each include a respective position parameter and/or a respective velocity parameter.
  • the operations further include evaluating the position parameters of the task parameter records to determine the at least one axis of freedom associated with the task parameter records. In other further embodiments, the operations further include evaluating the task parameter records to determine whether the end effector is engaged with the constrained object.
  • FIG. 1A is a schematic view of an example robot executing an arm controller for executing manipulation tasks with an arm of the robot.
  • FIG. 1B is a schematic view of the arm controller of FIG. 1A.
  • FIG. 2A is a schematic view of a remote device and task manager of the arm controller of FIG. 1A.
  • FIG. 2B is a schematic view of a task instructor of the arm controller of FIG. 1A.
  • FIG. 3 is a schematic view of a task space estimator of the arm controller of FIG. 1A.
  • FIG. 4A is a schematic view of an example operation of a task observer of the arm controller of FIG. 1A.
  • FIG. 4B is a schematic view of another example operation of the task observer of FIG. 4A.
  • FIG. 4C is a schematic view of another example operation of the task observer of FIG. 4A.
  • FIG. 5 is a flowchart of an example arrangement of operations for a method of constrained object manipulation in a robot arm.
  • FIG. 6 is a schematic view of an example computing device that may be used to implement the systems and methods described herein.
  • Many robots include multi-axis articulable appendages configured to execute complex movements for completing tasks, such as material handling or industrial operations (e.g., welding, gluing, and/or fastening).
  • These appendages, also referred to as manipulators, typically include an end effector or hand attached at the end of a series of appendage segments or portions, which are connected to each other by one or more appendage joints.
  • the appendage joints cooperate to configure the appendage in a variety of poses P within a space associated with the robot.
  • pose refers to the position and orientation of the appendage.
  • a robot or robotic device 10 includes a base 12 having a body 13 and two or more legs 14. Each leg 14 may have an upper leg portion 15 and a lower leg portion 16.
  • the upper leg portion 15 may be attached to the body 13 at an upper joint 17 (i.e., a hip joint) and the lower leg portion 16 may be attached to the upper leg portion 15 by an intermediate joint 18 (i.e., a knee joint).
  • Each leg 14 further includes a contact pad or foot 19 disposed at a distal end of the lower leg portion 16, which provides a ground-contacting point for the base 12 of the robot 10.
  • the robot 10 further includes one or more appendages, such as an articulated arm 20 or manipulator disposed on the body 13 and configured to move relative to the body 13.
  • the articulated arm 20 may be interchangeably referred to as a manipulator, an appendage arm, or simply an appendage.
  • the articulated arm 20 includes two arm portions 22, 22a, 22b rotatable relative to one another and the body 13.
  • the articulated arm 20 may include more or fewer arm portions 22 without departing from the scope of the present disclosure.
  • a third arm portion 24 of the articulated arm may be interchangeably coupled to a distal end of the second portion 22b of the articulated arm 20 and may include one or more actuators 25 for gripping/grasping objects 4.
  • the articulated arm 20 includes a plurality of joints 26, 26a-26c disposed between adjacent ones of the arm portions 22, 24.
  • the first arm portion 22a is attached to the body 13 of the robot 10 by a first two-axis joint 26a, interchangeably referred to as a shoulder 26a.
  • a single-axis joint 26b connects the first arm portion 22a to the second arm portion 22b.
  • the second joint 26b includes a single axis of rotation and may be interchangeably referred to as an elbow 26b of the articulated arm 20.
  • a second two-axis joint 26c connects the second arm portion 22b to the hand 24, and may be interchangeably referred to as a wrist 26c of the articulated arm 20.
  • the joints 26 cooperate to provide the articulated arm 20 with five degrees of freedom (i.e., five axes of rotation). While the illustrated example shows a five-axis articulated arm 20, the principles of the present disclosure are applicable to robotic arms having any number of axes. Furthermore, the principles of the present disclosure are applicable to robotic arms mounted to different types of bases, such as mobile bases including one or more wheels or stationary bases.
  • the robot 10 also includes a vision system 30 with at least one imaging sensor or camera 31, each sensor or camera 31 capturing image data or sensor data of the environment 2 surrounding the robot 10 with an angle of view 32 and within a field of view 34.
  • the vision system 30 may be configured to move the field of view 34 by adjusting the angle of view 32 or by panning and/or tilting (either independently or via the robot 10) the camera 31 to move the field of view 34 in any direction.
  • the vision system 30 may include multiple sensors or cameras 31 such that the vision system 30 captures a generally 360-degree field of view around the robot 10.
  • the camera(s) 31 of the vision system 30, in some implementations, include one or more stereo cameras (e.g., one or more RGBD stereo cameras).
  • the vision system 30 includes one or more ranging sensors, such as a scanning light-detection and ranging (LIDAR) sensor, a scanning laser-detection and ranging (LADAR) sensor, a light scanner, a time-of-flight sensor, or any other three-dimensional (3D) volumetric image sensor (or any combination of such sensors).
  • the vision system 30 provides image data or sensor data derived from image data captured by the cameras or sensors 31 to the data processing hardware 36 of the robot 10.
  • the data processing hardware 36 is in digital communication with memory hardware 38 and, in some implementations, may be a remote system.
  • the remote system may be a single computer, multiple computers, or a distributed system (e.g., a cloud environment) having scalable / elastic computing resources and/or storage resources.
  • the robot 10 executes an arm controller 100 on the data processing hardware 36 of the robot.
  • the arm controller 100 executes on a remote device 40 in communication with the robot 10.
  • a model 342 of the constrained task space may be computed on a remote device 40 and a control system executing on the robot 10 may receive the model and determine the limited torque requests using the model.
  • the arm controller 100 may execute on a remote device 40 and the remote device 40 may provide an object manipulation task request 44 to the robot 10 to move/control the articulated arm 20 for manipulating a constrained object 4.
  • the arm controller 100 of the robot 10 controls moving the articulated arm 20 between arm poses P20.
  • the articulated arm 20 may need to move from a start pose P20 to a target pose P20 when the robot 10 is executing the task request 44.
  • the robot arm controller 100 will need to move the articulated arm 20 from a first pose P20 where the door is in a closed position to a second pose P20 where the door is in an open position.
  • the arm controller 100 may include a task manager 200, a task space estimator 300, and a task observer 400.
  • the task manager 200 receives or obtains a task request 44 for manipulating a constrained object 4 and generates task instructions 222 (FIG. 2A) including impedance parameters 238 (e.g., stiffness) (FIG. 2A) and path parameters 248 (e.g., force, position) (FIG. 2A) for executing the task request 44.
  • the task space estimator 300 is configured to receive a measured task parameter set 322 (e.g., position, force, speed) (FIGS. 2A-3) from the robot arm 20 and generate a task space model 342 (FIG. 3) defining the degrees of freedom and/or constraints associated with the task request 44.
  • the task observer 400 evaluates the measured task parameter set 322 of the robot arm 20 during execution of the task request 44 to determine whether the robot arm 20 is successfully executing the task request 44.
  • Movements and poses of the robot 10 and robot appendages 14, 20 may be defined in terms of a robot workspace based on a Cartesian coordinate system.
  • the robot workspace may be defined by six dimensions including the translational axes x, y, z and rotational axes θx, θy, θz (an SE(3) manifold).
  • actions of the robot 10 and/or the robot arm 20 may be defined using lower-dimensional spaces or manifolds including fewer axes than the number of axes (six) of the workspace.
  • the task request 44 may be constrained to a single axis within the workspace so that path parameters 248 can be efficiently computed along the single axis.
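As a concrete instance of such a lower-dimensional manifold, a door constrains the end effector to a one-dimensional arc inside the six-dimensional workspace, so the path can be parameterized by a single hinge angle rather than a full (x, y, z, θx, θy, θz) pose. The geometry below (hinge location, handle radius, handle height) is made up for illustration:

```python
import math

HINGE = (0.0, 0.0)  # hinge position in the x-y plane (illustrative)
RADIUS = 0.8        # handle distance from the hinge, meters (illustrative)
HEIGHT = 1.0        # handle height, meters (illustrative)

def door_pose(hinge_angle):
    """Map the 1-D task parameter (hinge angle, radians) to a 6-D workspace
    pose (x, y, z, theta_x, theta_y, theta_z): the hand follows the handle
    arc and its yaw tracks the door, while roll and pitch stay fixed."""
    x = HINGE[0] + RADIUS * math.cos(hinge_angle)
    y = HINGE[1] + RADIUS * math.sin(hinge_angle)
    return (x, y, HEIGHT, 0.0, 0.0, hinge_angle)
```

Planning along this one parameter is far cheaper than searching the full six-dimensional workspace, which is the efficiency the passage above alludes to.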
  • Appendages 14, 20 of the robot 10 may also be described in terms of a joint space, which refers to a space representing all possible combinations of joint configurations of a robot appendage, and is directly related to the number of degrees of freedom of the robot appendage.
  • a robot arm having n degrees of freedom will have an n-dimensional joint space.
  • the articulated arm has five degrees of freedom defining a five-dimensional joint space.
  • the task manager 200 includes a task interpreter 210 configured to receive or obtain task requests 44 from the remote device 40 and generate translated task requests 212 executable by the arm controller 100 to move the robot arm 20.
  • the task manager 200 further includes a task instructor 220 configured to generate task instructions 222 using the translated task request 212 and/or the task space model 342 provided by the task space estimator 300.
  • the task manager 200 generally receives a task request 44 and generates task instructions 222 that comply with the task space model 342.
  • FIG. 2 A shows an example operation of the task interpreter 210 generating the translated task request 212 based on the task request 44 received or obtained from the remote device 40.
  • a user may interact with a user interface 42 displayed on a screen in communication with the remote device 40 to select task characteristics 46, 46a-d for the task request 44.
  • the user interface 42 may graphically display one or more buttons 46a for defining the type of object 4 (e.g., crank, door, drawer, etc.) for manipulation and one or more buttons 46b for selecting task parameters (e.g., speed, force, direction, etc.).
  • the user interface 42 may also include a force application window 46c including a graphical representation of the robot 10 or the robot arm 20.
  • a user may use touch gestures to indicate a general direction for moving the robot arm 20 or applying force, such as an initial direction of movement for the arm 20 or initial direction for applying force. For example, a user may swipe to the left from the displayed end effector 24 to indicate a desired initial movement of the end effector 24 to the left.
  • the user interface 42 also includes a task location window 46d for displaying a location in the robot environment 2 corresponding to the initial movement of the end effector 24.
  • the task location window 46d may be based on the image data from the camera 31 of the robot 10. Thus, a user can select a location within the field of view 34 of the robot 10.
  • [0035] In the example of FIG. 2A, the user interface 42 is shown with user inputs or selection points 48, 48a-48c corresponding to each of the task characteristics 46.
  • the inputs 48 include a first input 48a identifying the constrained task type 46a as being a door and a second input 48b selecting that task parameters 46b include moving the arm 20 in a downward or backward direction depending on the constrained task type 46a (e.g., a door would be backward, a switch would be downward).
  • the user selects the right-hand side of the end effector 24 for applying the selected task parameters 46b.
  • the task request 44 generally includes characteristics 46 for pulling a door handle in a backward direction using the end effector 24.
  • the task interpreter 210 of the task manager 200 receives the task request 44 and translates the task characteristics 46 into translational and/or rotational coordinates based on the robot workspace. For example, the task interpreter 210 may translate the user-selected backward direction 48b associated with the task type 46a of opening a door into movements along the x-y plane of the workspace (FIG. 1). The translated task request 212 is then sent to the task instructor 220.
  • the task instructor 220 includes an impedance manager 230 for assigning the impedance parameters 238 to each Cartesian dimension of the end effector 24 and a path manager 240 for assigning instructed path parameters 248 to the end effector 24.
  • the task instructor 220 generates task instructions 222 including the impedance parameters 238 and the path parameters 248.
  • the task instructor 220 receives the translated task request 212 and generates an initial iteration of task instructions 222i for moving the robot arm 20 according to the task request 44.
  • the initial iteration of the task instructions 222i may be based on user-defined task parameters 238, 248 and/or the translated task request 212 since the task space model 342 associated with the task request 44 (i.e., the degrees of freedom associated with the task type 46a) is undefined.
  • the impedance manager 230 may assign a user-defined impedance setpoint 232 as the instructed impedance parameters 238i until the task space estimator 300 has sufficient data about the task request 44 to define the task space model 342 for the task request 44.
  • the path manager 240 may generate path parameters 248i based on the directional and rotational coordinates of the translated task request 212 corresponding to one or more of the user inputs 48a-48c.
  • the initial task instructions 222i are then sent to the robot arm 20 for initiating the task request 44.
  • the robot arm 20 includes one or more sensors that measure the position, force, and/or velocity parameters of the robot arm 20 after each iteration of the task instructions 222 and generate a corresponding measured task parameter set 322.
  • Each iteration of the measured task parameter set 322 is sent to the task space estimator 300, which uses the measured task parameter set 322 to generate the task space model 342 associated with the task request 44.
  • the task space estimator 300 may receive or obtain the path parameters 248 from the task instructor 220 to generate the task space model 342 using the same steps as described below with respect to the measured task parameter set 322.
  • a task recorder 320 receives each iteration of the measured task parameter set 322i and analyzes the received iteration of the measured task parameter set 322i to determine whether to store the measured task parameter set 322i as a new task parameter record 324 at a task buffer 330, e.g., residing on the memory hardware 38 of the robot 10.
  • the task recorder 320 is configured to generate and store a new task parameter record 324 based on force parameters F322 and/or position parameters P322 of the measured task parameter set 322i.
  • task parameter records 324 are not added to the task buffer 330 for every iteration of the measured task parameter set 322, but only when a value of one of the parameters of the measured task parameter set 322i satisfies a recording threshold Δ324.
  • the task recorder 320 generates a new task parameter record 324 based on a measured position parameter P322i.
  • the task recorder 320 stores the first iteration of the measured task parameter set 322i as an initial task parameter record 324i.
  • the task recorder 320 compares the measured position parameter P322i of the received iteration of the measured task parameter set 322i against a position parameter P324i-1 of the last-stored task parameter record 324i-1 in the task buffer 330.
  • the task recorder 320 generates and stores, in the task buffer 330, a new task parameter record 324i when the measured position parameter P322i of the measured task parameter set 322i is different from the stored position parameter P324i-1 of the last-stored task parameter record 324i-1 by a position-based recording threshold Δ324P set by the robot user.
  • task parameter records 324 are not added to the task buffer 330 for every iteration of the task request 44, but only when the change in position exceeds the recording threshold Δ324P.
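Position-based recording of this kind reduces to a simple distance gate: store a record only when the end effector has moved far enough from the last stored position. A minimal sketch, where the function name and the 0.02 m threshold are illustrative rather than taken from the patent:

```python
import math

def maybe_record(buffer, measured_pos, threshold=0.02):
    """Append measured_pos to buffer only when it differs from the
    last-stored record by at least `threshold` (Euclidean distance)."""
    if not buffer:
        buffer.append(measured_pos)  # first iteration: always store
        return True
    if math.dist(measured_pos, buffer[-1]) >= threshold:
        buffer.append(measured_pos)
        return True
    return False  # below threshold: jitter, not real progress

buf = []
maybe_record(buf, (0.000, 0.0, 0.0))  # stored (initial record)
maybe_record(buf, (0.005, 0.0, 0.0))  # skipped: moved only 5 mm
maybe_record(buf, (0.050, 0.0, 0.0))  # stored: 50 mm from last record
```

A force-based gate would have the same shape, comparing measured force against the last-stored force parameter instead of distance.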
  • the task recorder 320 may optionally generate and add task parameter records 324 based on the measured force parameter F3221 of the measured task parameter set 322i.
  • the task recorder 320 compares a measured force parameter F3221 of the received iteration of the measured task parameter set 322i against a stored force parameter F3241-1 of the last-stored task parameter record 324i-i in the task buffer 330.
  • the task recorder 320 generates and stores a new task parameter record 324i when the measured force parameter F3221 of the measured task parameter set 322i is different from the stored force parameter F3241-1 of the last-stored task parameter record 324i-i by a force-based recording threshold A324F calibrated by the robot operator.
  • force-based recording results in records 324 being added to the task buffer 330 only when the record 324 represents movement along a path associated with the task (e.g., horizontal arc of a door, vertical arc of a switch).
  • the force-based recording may filter measured task parameter sets 322i where the measured forces F322i are too high, as high forces may be associated with a constrained axis of the manipulated object 4.
  • the switch may have a known pull-force (i.e., the force threshold Fthresh) associated with the arcuate switch path.
  • Using position-based task parameter recording and/or force-based task parameter recording ensures that task parameter records 324 that are added to the task buffer 330 represent actual changes in position of the robot arm 20 along the axes of freedom of the object 4, which can then be evaluated by the task space generator 340 to determine actual movement of the robot arm 20 and the task space model 342.
  • using time-based or velocity-based storage may result in storage of a large number of records 324 associated with a relatively small change in position and/or undesired changes in position along constrained axes of the object 4.
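The threshold-gated recording described above can be sketched in Python. This is a minimal illustration, not the patented implementation: the class names, the Euclidean-distance comparison, and the numeric values standing in for the thresholds Δ324P and Δ324F are all assumptions.

```python
from dataclasses import dataclass
import math

@dataclass
class TaskParameterSet:
    position: tuple  # measured (x, y, z) position of the end effector
    force: tuple     # measured (fx, fy, fz) force at the end effector

def _dist(a, b):
    """Euclidean distance between two equal-length tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class TaskRecorder:
    """Stores a new record only when position or force changes enough."""

    def __init__(self, pos_threshold=0.01, force_threshold=2.0):
        self.pos_threshold = pos_threshold      # stand-in for Δ324P (meters)
        self.force_threshold = force_threshold  # stand-in for Δ324F (newtons)
        self.buffer = []                        # stand-in for task buffer 330

    def observe(self, params):
        # The first iteration is always stored as the initial record.
        if not self.buffer:
            self.buffer.append(params)
            return True
        last = self.buffer[-1]
        if (_dist(params.position, last.position) >= self.pos_threshold
                or _dist(params.force, last.force) >= self.force_threshold):
            self.buffer.append(params)
            return True
        return False  # change too small: no record added
```

Gating on change rather than on time or velocity keeps the buffer from filling with near-duplicate records during slow motion.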
  • the task buffer 330 is configured as a circular buffer 330 having a fixed-size queue and a first-in-first-out (FIFO) data characteristic.
  • the task buffer 330 only stores a task parameter record set 332 including a predetermined number n of measured position parameter records 324.
  • the task parameter record set 332 includes six populated task parameter records 324i-n arranged between a write pointer 331a and a read pointer 331b.
  • New task parameter records 324 are added to vacant record slots in front of the write pointer 331a.
  • the size of the task parameter record set 332 (i.e., the number of task parameter records 324) may be tuned by a user depending on desired characteristics of the task space model 342 and/or task instructions 222. Larger task parameter record sets 332 will provide a more complete representation of the task position history, but including too many task parameter records 324 may result in the older task parameter records 324 inaccurately representing the current trajectory of the end effector 24.
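A fixed-size FIFO of this kind maps directly onto Python's `collections.deque`. The capacity of 14 below is an assumed tuning value, chosen to match the 4 + 10 record slots used in the observer example later in the text.

```python
from collections import deque

class TaskBuffer:
    """Fixed-size FIFO holding the n most recent task parameter records."""

    def __init__(self, capacity=14):
        # maxlen makes the deque behave like a circular buffer: appending
        # to a full buffer silently evicts the oldest record.
        self._records = deque(maxlen=capacity)

    def add(self, record):
        self._records.append(record)

    def record_set(self):
        return list(self._records)  # oldest-first snapshot of the records
```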
  • the task space generator 340 obtains the current task parameter record set 332i including the added task parameter record 324i and generates a task space model 342 based on the task parameter record set 332i.
  • the task space generator 340 evaluates the task parameter record set 332i to determine main axes or a plane along which the task parameter records 324 of the task parameter record set 332i are best-fit.
  • the task space generator 340 may decompose the task parameter record set 332i, which includes the task parameter records 324 including translational (x, y, z) and/or rotational (θx, θy, θz) coordinates, into a lower-dimensional (e.g., one-dimensional, two-dimensional) task space model 342.
  • the task space generator 340 may use singular value decomposition (SVD) to determine the axes or plane of freedom associated with the task parameter record set 332i. For example, where the task request 44 includes opening a door 4, the task space generator 340 decomposes the task parameter record set 332i into a two-dimensional task space model 342a projected to the x-y plane of the workspace.
  • In other examples, such as opening a drawer, the task space generator 340 generates a single-dimensional task space model 342b projected to a single axis of the workspace (e.g., the y-axis). In yet another example where the task request 44 is associated with turning a wheel or crank, the task space generator 340 generates a two-dimensional task space model 342c on the x-z plane.
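One way to realize this decomposition is to center the recorded positions and run singular value decomposition: the singular values measure how much the motion varies along each principal axis, so axes with negligible variance can be treated as constrained. The variance threshold and the return format below are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def fit_task_space(positions, var_threshold=0.95):
    """Split workspace axes into free and constrained sets via SVD.

    positions: iterable of (x, y, z) end-effector positions.
    Returns (free_axes, constrained_axes) as rows of unit vectors.
    """
    pts = np.asarray(positions, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Rows of vt are principal directions, ordered strongest-first.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variance = s ** 2
    share = np.cumsum(variance) / variance.sum()
    # Keep adding axes until they explain var_threshold of the motion.
    n_free = int(np.searchsorted(share, var_threshold)) + 1
    return vt[:n_free], vt[n_free:]
```

For a door, the recorded arc lies in the x-y plane, so the two strongest axes span that plane and the weakest axis (z) comes back as constrained.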
  • Each iteration of the task space model 342i is sent to or obtained by the task instructor 220, which uses the iteration of the task space model 342i to generate a new iteration of task instructions 222i+1 for the robot arm 20.
  • the task space model 342i includes an arcuate path based on the task parameter record set 332i (FIG. 3B) that is constrained to the x-y plane.
  • the task space model 342i provides the path manager 240 with a lower-dimensional task space for determining a new iteration of task instructions 222i+1.
  • the path manager 240 includes a task path estimator 242 and a task path filter 244.
  • the task path estimator 242 evaluates the task space model 342 and generates a task path model 243, 243i representing the estimated task direction for the next iteration of the task instructions 222i+1.
  • Generating the task path model 243i for each iteration may include further decomposing the task space model 342i into the lower-dimensional task path model 243 (i.e., from a two-dimensional plane to a one-dimensional axis).
  • the task path model 243 includes a one-dimensional tangent to the two-dimensional task space model 342i.
  • the task path estimator 242 extrapolates the task instructions 222 to determine the current location of the end effector 24. The tangent is then taken at the point of the task space model 342i associated with the current position of the end effector 24.
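A minimal finite-difference version of this tangent step might look like the following; a real system would fit a smooth curve (e.g., a circle for a door arc) before differentiating, and the nearest-point lookup here is an assumption.

```python
import numpy as np

def tangent_at(path_points, current_position):
    """Unit tangent of a recorded path at the point nearest the end effector."""
    pts = np.asarray(path_points, dtype=float)
    # Index of the recorded point closest to the current position.
    i = int(np.argmin(np.linalg.norm(pts - current_position, axis=1)))
    lo, hi = max(i - 1, 0), min(i + 1, len(pts) - 1)
    t = pts[hi] - pts[lo]  # central (or, at the ends, one-sided) difference
    return t / np.linalg.norm(t)
```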
  • the task path filter 244 determines whether the task path model 243i complies with a path model quality threshold. For example, the task path filter 244 may compare or fit the task path model 243i to a previous iteration of the task path model 243i-1 and/or the task space model 342i to determine the quality of the task path model 243i. Where the task path model 243i exceeds a threshold value (e.g., an error value), the task path filter 244 may discard the current iteration of the task path model 243i and select the previous iteration of the task path model 243i-1.
  • the task path filter 244 sends the filtered task path model 243f (i.e., either the current or previous iteration of the task path model 243i, 243i-1) to the task path instructor 246, which generates new path parameters 248i+1 for the robot arm 20.
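The filtering step can be sketched as an angular comparison between successive task directions; the angle threshold below is a placeholder for the path model quality threshold, which the text does not specify.

```python
import numpy as np

def filter_task_path(current_dir, previous_dir, max_angle_rad=0.5):
    """Return the current direction unless it swings too far from the last one."""
    c = np.asarray(current_dir, dtype=float)
    p = np.asarray(previous_dir, dtype=float)
    c = c / np.linalg.norm(c)
    p = p / np.linalg.norm(p)
    # Angle between the two unit directions, clipped for numerical safety.
    angle = float(np.arccos(np.clip(np.dot(c, p), -1.0, 1.0)))
    # Discard a low-quality current estimate and fall back to the previous one.
    return c if angle <= max_angle_rad else p
```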
  • the new path parameters 248i+1 are based on the lower-dimensional filtered task path model 243f and include force or position parameters for moving the end effector 24 along the direction of the filtered task path model 243f.
  • the task path instructor 246 simply applies a force along the direction tangent to the path (e.g., door arc) associated with the task request 44.
  • the task instructor 220 can quickly compute the task instructions 222 for manipulating the constrained object 4 by applying forces only along the axes of freedom of the object 4.
  • the impedance manager 230 receives or obtains the task space model 342i from the task space estimator 300 and determines the impedance (i.e., stiffness) of the end effector 24 of the robot arm 20 for the current task request 44.
  • the impedance manager 230 is configured to evaluate the task space model 342i and to assign lower impedance to the end effector 24 along axes that the arm controller 100 expects the end effector 24 to travel and to assign higher impedance along axes that the arm controller 100 expects the end effector 24 to be constrained.
  • the impedance manager 230 assigns relatively low impedance values (i.e., joint stiffness) to the end effector 24 along the x-axis and the y-axis. Conversely, the impedance manager 230 assigns a relatively high impedance value (i.e., stiffness) along the z-axis since the task space model 342i indicates that the object 4 is constrained along the z-direction.
  • Assigning low impedance values along the free axes of the task space model 342i and high impedance values along the constrained axes of the task space model 342i allows the end effector 24 to rotate or pivot along the task path as the arm 20 executes the task request 44 (e.g., opening the door) while maintaining stiffness along directions that are not expected to have movement.
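This per-axis assignment can be expressed as a stiffness matrix built in the task frame. The numeric gains below are placeholder values; in practice they would come from the operator setpoint or task input.

```python
import numpy as np

def assign_impedance(free_axes, constrained_axes, low=50.0, high=2000.0):
    """Stiffness matrix: compliant along free axes, stiff along constrained ones."""
    axes = np.vstack([free_axes, constrained_axes])  # task-frame basis, rows R
    gains = np.concatenate([np.full(len(free_axes), low),
                            np.full(len(constrained_axes), high)])
    # K = R^T · diag(gains) · R expresses the stiffness in workspace coordinates.
    return axes.T @ np.diag(gains) @ axes
```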
  • the impedance manager 230 may consider other inputs in determining the impedance parameters 238i+1. For example, the impedance manager 230 may select impedance parameters 238i+1 based on the impedance setpoint 232 provided by the user. Additionally or alternatively, the impedance manager 230 may consider the input task parameters 48b received from the remote device 40.
  • the impedance manager 230 may give greater weight to the impedance setpoint 232 and/or the input task parameters 48b to determine the impedance values for the next iteration of impedance parameters 238i+1.
  • the impedance manager 230 will determine and assign impedance values based on the task input 48b and/or the impedance setpoint 232.
  • the task instructions 222i+1 allow the end effector 24 to execute the task request 44 with minimal computation required at the arm controller 100 by allowing the end effector 24 to follow the constrained path of the object 4 associated with the task request 44 (e.g., door arc, wheel circle, drawer axis).
  • As the robot 10 executes the task instructions 222i+1, a new iteration of a measured task parameter set 322i+1 is transmitted to the task space estimator 300 for evaluation and generation of an updated iteration of the task space model 342i+1.
  • the task observer 400 is configured to continuously evaluate the measured task parameter set 322 (e.g., position, speed) of the end effector 24 to determine whether the end effector 24 is still engaged with the constrained object 4.
  • the task observer 400 then generates a task status 402 identifying whether the end effector 24 is engaged and transmits the task status 402 to the task instructor 220.
  • the task instructor 220 may terminate the task request 44, cease generating new task instructions 222, and revert to a safe stopping behavior until the end effector 24 can be reengaged with the object 4.
  • the task observer 400 evaluates the most-recent task parameter records 324i and determines the task status 402 based on whether the measured position P322 and/or velocity V322 of the end effector 24 satisfies task criteria. For example, the task observer 400 may evaluate the task parameter records 324 to determine whether the end effector 24 fits the task space model 342 and/or satisfies a velocity threshold Vthresh. In FIG. 4A, the task observer 400 includes a task evaluator 410 that receives the current task parameter record set 332i and/or the task space model 342i including the task parameter records 324.
  • the task evaluator 410 may segregate the task parameter records 324 into an evaluation record set 412 including a number of the most-recent task parameter records 324 and a model record set 414 including a number of the records 324 immediately preceding the records 324 selected for the evaluation record set 412.
  • the evaluation record set 412 includes the four most-recent task parameter records 324 and the model record set 414 includes the ten task parameter records 324 preceding the evaluation record set 412.
  • the task evaluator 410 may be tuned to select any number of records 324 for the evaluation record set 412 and the model record set 414.
  • the task evaluator 410 checks a fit of the evaluation record set 412 against the model record set 414.
  • the task evaluator 410 determines that the end effector 24 is disengaged (e.g., lost grip) from the constrained object 4.
  • the task evaluator 410 may be configured to determine a quality of the current iteration of the task parameter record set 332i and to evaluate the task parameter record set 332i based on the quality. For example, as previously discussed, the task evaluator 410 segregates the task parameter record set 332i into an evaluation record set 412 including a first number of task parameter records 324 and a model record set 414 including a second number of task parameter records 324.
  • the task evaluator 410 determines the quality of the task parameter record set 332i based on whether the task parameter record set 332i includes a quantity of task parameter records 324 needed to populate the evaluation record set 412 and the model record set 414. Depending on the number of records 324 in the task parameter record set 332i, the task evaluator 410 designates the task parameter record set 332i as an optimal task parameter record set 332a (FIG. 4A), a sufficient record set 332b (FIG. 4B), or an insufficient task parameter record set 332c (FIG. 4C).
  • the evaluation record set 412 includes four (4) slots for task parameter records 324 and the model record set 414 includes ten (10) slots for task parameter records 324.
  • the task evaluator 410 determines that a task parameter record set 332i is an optimal task parameter record set 332a (FIG. 4 A) when the task parameter record set 332i includes enough task parameter records 324 to populate each of the record sets 412, 414 with unique task parameter records 324 (i.e., no shared task parameter records between record sets 412, 414).
  • an optimal task parameter record set 332a includes at least fourteen (14) task parameter records 324 for uniquely populating the record sets 412, 414, as shown in FIG. 4A.
  • the task evaluator 410 determines that a current iteration of the task parameter record set 332i is a sufficient task parameter record set 332b (i.e., less than optimal) when the task parameter record set 332i includes at least enough task parameter records 324 to populate each of the evaluation record set 412 and the model record set 414, but with some of the task parameter records 324 of the sufficient task parameter record set 332b being shared between the record sets 412, 414.
  • a sufficient task parameter record set 332b includes twelve (12) task parameter records 324 (FIG. 4B).
  • the evaluation record set 412 is populated with the four most-recent task parameter records 324 (e.g., records 1-4) and the model record set 414 is populated with the ten (10) oldest records (e.g., records 3-12) such that the third and fourth records 324 of the sufficient task parameter record set 332b are included in both record sets 412, 414.
  • the task evaluator 410 determines that a current iteration of the task parameter record set 332i is an insufficient task parameter record set 332c (i.e., cannot be evaluated) when the task parameter record set 332i does not include enough task parameter records 324 to populate the model record set 414 and provide at least one unique record to the evaluation record set 412.
  • the task parameter record set 332i would need at least eleven task parameter records 324 to populate the ten (10) slots of the model record set 414 and provide one (1) unique record to the evaluation record set 412.
  • an insufficient task parameter record set 332c includes eight (8) task parameter records 324 (FIG. 4C).
  • If the task parameter record set 332i is an insufficient task parameter record set 332c, the task observer 400 does not evaluate it.
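The three-way quality classification above reduces to simple counting; the 4 evaluation slots and 10 model slots mirror the example in the text.

```python
def classify_record_set(n_records, eval_slots=4, model_slots=10):
    """Classify a task parameter record set as optimal, sufficient, or insufficient."""
    if n_records >= eval_slots + model_slots:
        return "optimal"       # both sets filled with unique records
    if n_records >= model_slots + 1:
        return "sufficient"    # both sets filled, some records shared
    # Cannot fill the model set and still leave one unique evaluation record.
    return "insufficient"
```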
  • the task observer 400 determines that the end effector 24 has prematurely disengaged the constrained object 4 (i.e., lost grip) and generates a task status 402 indicating that end effector 24 has failed to complete the task request 44.
  • the task observer 400 evaluates the overall quality of a fit of the evaluation record set 412 by determining the mean squared error of the fit of the evaluation record set 412 to the model record set 414.
  • Where the mean squared error exceeds a threshold fit error value, the task observer 400 generates the task status 402 indicating that the end effector 24 has become disengaged from the constrained object 4. Additionally or alternatively, the task observer 400 may determine that the end effector 24 has disengaged the object 4 when a measured speed or velocity V322 of the end effector 24 exceeds a threshold velocity Vthresh.
  • the threshold velocity Vthresh may be a fixed value or may be based on the measured velocities V322 included in the task parameter records 324 of the model record set 414.
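The disengagement check can be sketched as fitting the model records and measuring how far the newest records fall from that fit. The straight-line fit and the numeric thresholds below are illustrative assumptions; the described system fits against the task space model instead.

```python
import numpy as np

def is_disengaged(model_records, eval_records, fit_error_threshold=1e-3,
                  velocities=None, v_thresh=None):
    """Flag a lost grip via fit error of recent records, or excess speed."""
    pts = np.asarray(model_records, dtype=float)
    mean = pts.mean(axis=0)
    # Dominant direction of the older (model) records via SVD.
    _, _, vt = np.linalg.svd(pts - mean, full_matrices=False)
    direction = vt[0]
    ev = np.asarray(eval_records, dtype=float) - mean
    # Component of each evaluation record perpendicular to the fitted line.
    residual = ev - np.outer(ev @ direction, direction)
    mse = float((residual ** 2).sum(axis=1).mean())
    if mse > fit_error_threshold:
        return True  # recent motion no longer fits the model
    if velocities is not None and v_thresh is not None and max(velocities) > v_thresh:
        return True  # end effector moving faster than the task allows
    return False
```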
  • FIG. 5 is a flowchart of an example arrangement of operations for a method 500 for manipulating constrained objects using a robot arm 20.
  • the method 500 may be a computer-implemented method executed by data processing hardware of the robot 10, which causes the data processing hardware to perform operations.
  • the method 500 includes receiving a plurality of measured task parameter sets 322 for the end effector 24, each measured task parameter set 322 including position parameters P322 defining a position of the end effector 24 relative to a workspace 2.
  • the method 500 also includes determining, using the measured task parameter sets 322, at least one axis of freedom x, y, z, θx, θy, θz and at least one constrained axis x, y, z, θx, θy, θz for the end effector 24 within the workspace 2.
  • Another operation 506 of the method includes assigning a first impedance value 238 to the end effector 24 along the at least one axis of freedom x, y, z, θx, θy, θz and assigning a second impedance value 238 to the end effector 24 along the at least one constrained axis x, y, z, θx, θy, θz, the second impedance value 238 greater than the first impedance value 238.
  • the method 500 includes instructing the articulated arm 20 to move the end effector 24 along the at least one axis of freedom x, y, z, θx, θy, θz.
  • FIG. 6 is a schematic view of an example computing device 600 that may be used to implement the systems and methods described in this document.
  • the computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
  • the components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
  • the computing device 600 includes a processor 610, memory 620, a storage device 630, a high-speed interface/controller 640 connecting to the memory 620 and high-speed expansion ports 650, and a low speed interface/controller 660 connecting to a low speed bus 670 and a storage device 630.
  • Each of the components 610, 620, 630, 640, 650, and 660 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 610 can process instructions for execution within the computing device 600, including instructions stored in the memory 620 or on the storage device 630 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as display 680 coupled to high speed interface 640.
  • multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
  • multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • the memory 620 stores information non-transitorily within the computing device 600.
  • the memory 620 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s).
  • the non-transitory memory 620 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the computing device 600.
  • non-volatile memory examples include, but are not limited to, flash memory and read-only memory (ROM) / programmable read-only memory (PROM) / erasable programmable read-only memory (EPROM) / electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs).
  • volatile memory examples include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM) as well as disks or tapes.
  • the storage device 630 is capable of providing mass storage for the computing device 600.
  • the storage device 630 is a computer- readable medium.
  • the storage device 630 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
  • a computer program product 700 is tangibly embodied in an information carrier.
  • the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 620, the storage device 630, or memory on processor 610.
  • the high speed controller 640 manages bandwidth-intensive operations for the computing device 600, while the low speed controller 660 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only.
  • the high-speed controller 640 is coupled to the memory 620, the display 680 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 650, which may accept various expansion cards (not shown).
  • the low-speed controller 660 is coupled to the storage device 630 and a low-speed expansion port 690.
  • the low-speed expansion port 690, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • the computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 600a or multiple times in a group of such servers 600a, as a laptop computer 600b, or as part of a rack server system 600c.
  • Various implementations of the systems and techniques described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
  • These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic disks, magneto-optical disks, or optical disks.
  • Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks.
  • processors and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user, and optionally a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
  • a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Numerical Control (AREA)

Abstract

A computer-implemented method (500) executed by data processing hardware (36) of a robot (10) causes the data processing hardware to perform operations. The robot includes an articulated arm (20) having an end effector (24) engaged with a constrained object (4). The operations include receiving a measured task parameter set (322) for the end effector, including position parameters (P322) defining a position of the end effector. The operations include determining, using the measured task parameter set, at least one axis of freedom and at least one constrained axis for the end effector. The operations include assigning a first impedance value (238) to the end effector along the at least one axis of freedom and a second impedance value (238) to the end effector along the at least one constrained axis. The operations include instructing the articulated arm to move the end effector along the at least one axis of freedom.
PCT/US2021/063776 2020-12-21 2021-12-16 Constrained manipulation of objects Ceased WO2022140151A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21844471.9A EP4263151A1 (fr) 2020-12-21 2021-12-16 Constrained manipulation of objects
CN202180091621.3A CN116745076A (zh) 2020-12-21 2021-12-16 对象的约束操纵

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063128573P 2020-12-21 2020-12-21
US63/128,573 2020-12-21

Publications (1)

Publication Number Publication Date
WO2022140151A1 true WO2022140151A1 (fr) 2022-06-30

Family

ID=79686710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/063776 Ceased WO2022140151A1 (fr) 2020-12-21 2021-12-16 Manipulation contrainte d'objets

Country Status (4)

Country Link
US (2) US12325131B2 (fr)
EP (1) EP4263151A1 (fr)
CN (1) CN116745076A (fr)
WO (1) WO2022140151A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12325131B2 (en) 2020-12-21 2025-06-10 Boston Dynamics, Inc. Constrained manipulation of objects

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
US11287826B2 (en) 2018-10-12 2022-03-29 Boston Dynamics, Inc. Terrain aware step planning system
  • KR102762233B1 (ko) 2019-08-06 2025-02-04 Boston Dynamics, Inc. Constrained mobility mapping
US12468300B2 (en) 2021-06-04 2025-11-11 Boston Dynamics, Inc. Detecting negative obstacles

Citations (1)

Publication number Priority date Publication date Assignee Title
WO2010136961A1 (fr) * 2009-05-29 2010-12-02 Koninklijke Philips Electronics N.V. Device and method for controlling a robot

Family Cites Families (39)

Publication number Priority date Publication date Assignee Title
EP0864401B1 (fr) 1996-07-24 2006-02-15 Fanuc Ltd Jog feeding method for robots
US6522952B1 (en) * 1999-06-01 2003-02-18 Japan As Represented By Secretary Of Agency Of Industrial Science And Technology Method and system for controlling cooperative object-transporting robot
JP3833567B2 (ja) * 2002-05-01 2006-10-11 本田技研工業株式会社 移動ロボットの姿勢制御装置
WO2007080733A1 (fr) * 2006-01-13 2007-07-19 Matsushita Electric Industrial Co., Ltd. Robot arm control device and control method, robot, and program
KR101494344B1 (ko) 2008-04-25 2015-02-17 Samsung Electronics Co., Ltd. Motion control system for a humanoid robot and method thereof
DE102009012328A1 (de) 2009-03-09 2010-09-16 Weber Maschinenbau Gmbh Breidenbach Device for operating a robot
US8918211B2 (en) * 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9205887B2 (en) 2010-02-25 2015-12-08 Honda Motor Co., Ltd. Constrained resolved acceleration control
US8781629B2 (en) 2010-09-22 2014-07-15 Toyota Motor Engineering & Manufacturing North America, Inc. Human-robot interface apparatuses and methods of controlling robots
WO2012142587A1 (fr) 2011-04-15 2012-10-18 Irobot Corporation Procédé et système de recherche automatique pour un véhicule télécommandé
KR20130000496A (ko) 2011-06-23 2013-01-03 Hyundai Heavy Industries Co., Ltd. Robot teaching device with an acceleration sensor and a gyro sensor, and robot control method using the same
US10402503B2 (en) 2012-09-13 2019-09-03 Ben-Gurion University Of The Negev & Development Authority Method and system for designing a common end effector for a robot which is capable of grasping plurality of parts, each having its own geometry
JP6206804B2 (ja) 2013-09-27 2017-10-04 Panasonic Intellectual Property Management Co., Ltd. Moving object tracking device, moving object tracking system, and moving object tracking method
US9364951B1 (en) 2013-10-14 2016-06-14 Hrl Laboratories, Llc System for controlling motion and constraint forces in a robotic system
US9283674B2 (en) 2014-01-07 2016-03-15 Irobot Corporation Remotely operating a mobile robot
JP6361213B2 (ja) 2014-03-26 2018-07-25 Seiko Epson Corporation Robot control device, robot, robot system, teaching method, and program
JP6443837B2 (ja) * 2014-09-29 2018-12-26 Seiko Epson Corporation Robot, robot system, control device, and control method
JP6690265B2 (ja) 2015-03-19 2020-04-28 Denso Wave Inc. Robot operation device and robot operation method
US20170259433A1 (en) * 2016-03-11 2017-09-14 Seiko Epson Corporation Robot control device, information processing device, and robot system
US10315311B2 (en) 2016-03-22 2019-06-11 The Boeing Company Robots, robotic systems, and related methods
US9687983B1 (en) * 2016-05-11 2017-06-27 X Development Llc Generating a grasp pose for grasping of an object by a grasping end effector of a robot
US10166676B1 (en) 2016-06-08 2019-01-01 X Development Llc Kinesthetic teaching of grasp parameters for grasping of objects by a grasping end effector of a robot
JP6889574B2 (ja) 2017-03-03 2021-06-18 Keyence Corporation Robot setting device, robot setting method, robot setting program, computer-readable recording medium, and recorded equipment
US10766140B2 (en) 2017-04-13 2020-09-08 Battelle Memorial Institute Teach mode collision avoidance system and method for industrial robotic manipulators
EP3515671B1 (fr) 2017-06-19 2020-05-13 Google LLC Robotic grasping prediction using neural networks and a geometry-aware object representation
US20190143517A1 (en) 2017-11-14 2019-05-16 Arizona Board Of Regents On Behalf Of Arizona State University Systems and methods for collision-free trajectory planning in human-robot interaction through hand movement prediction from vision
US11103994B2 (en) 2018-07-02 2021-08-31 Teradyne, Inc. System and method for natural tasking of one or more robots
US11035183B2 (en) 2018-08-03 2021-06-15 National Oilwell Varco, L.P. Devices, systems, and methods for top drive clearing
US11279044B2 (en) 2018-08-03 2022-03-22 Yaskawa America, Inc. Robot instructing apparatus, teaching pendant, and method of instructing a robot
RU2700246C1 (ru) 2019-03-21 2019-09-20 Public Joint Stock Company "Sberbank of Russia" (PJSC Sberbank) Method and system for gripping an object using a robotic device
US20200306998A1 (en) 2019-03-25 2020-10-01 Boston Dynamics, Inc. Multi-Body Controller
JP7065802B2 (ja) * 2019-03-26 2022-05-12 Hitachi, Ltd. Trajectory generation device, trajectory generation method, program, and robot system
US11813758B2 (en) 2019-04-05 2023-11-14 Dexterity, Inc. Autonomous unknown object pick and place
US11325256B2 (en) 2020-05-04 2022-05-10 Intrinsic Innovation Llc Trajectory planning for path-based applications
US11597078B2 (en) 2020-07-28 2023-03-07 Nvidia Corporation Machine learning control of object handovers
WO2022140190A1 (fr) 2020-12-21 2022-06-30 Boston Dynamics, Inc. Supervised autonomous grasping
EP4263151A1 (fr) 2020-12-21 2023-10-25 Boston Dynamics, Inc. Constrained manipulation of objects
US20220193906A1 (en) 2020-12-21 2022-06-23 Boston Dynamics, Inc. User Interface for Supervised Autonomous Grasping
CN118742420A (zh) 2022-01-21 2024-10-01 Boston Dynamics, Inc. System and method for coordinated body motion of a robotic device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010136961A1 (fr) * 2009-05-29 2010-12-02 Koninklijke Philips Electronics N.V. Device and method for controlling a robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ALEXEY SKRYNNIK ET AL: "Forgetful Experience Replay in Hierarchical Reinforcement Learning from Demonstrations", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 17 June 2020 (2020-06-17), XP081697976 *
NAIR ASHVIN ET AL: "Overcoming Exploration in Reinforcement Learning with Demonstrations", 2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), IEEE, 21 May 2018 (2018-05-21), pages 6292 - 6299, XP033403793, DOI: 10.1109/ICRA.2018.8463162 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12325131B2 (en) 2020-12-21 2025-06-10 Boston Dynamics, Inc. Constrained manipulation of objects

Also Published As

Publication number Publication date
CN116745076A (zh) 2023-09-12
US20250269522A1 (en) 2025-08-28
US20220193898A1 (en) 2022-06-23
US12325131B2 (en) 2025-06-10
EP4263151A1 (fr) 2023-10-25

Similar Documents

Publication Publication Date Title
US12325131B2 (en) Constrained manipulation of objects
Lippiello et al. Position-based visual servoing in industrial multirobot cells using a hybrid camera configuration
CN102814814B (zh) 一种双臂机器人基于Kinect的人机交互方法
KR20230137334A (ko) 동적 사이트들에서의 로봇 자율성을 위한 시맨틱 모델들
US20150112482A1 (en) Teaching system and teaching method
KR102001214B1 (ko) 가상 현실 기반 양팔로봇 교시 장치 및 방법
US12346116B2 (en) Online authoring of robot autonomy applications
US20220193894A1 (en) Supervised Autonomous Grasping
US12064879B2 (en) Global arm path planning with roadmaps and precomputed domains
US20240351217A1 (en) Object-Based Robot Control
US12440970B2 (en) Arm and body coordination
US20220193906A1 (en) User Interface for Supervised Autonomous Grasping
US11999059B2 (en) Limiting arm forces and torques
CN113966264B (zh) Method, computer program product, and robot control device for contact-based localization of an object that is movable during manipulation by a robot, and robot
US20240269838A1 (en) Limiting arm forces and torques
WO2022140199A1 (fr) Arm and body coordination
Sieusankar et al. A review of current techniques for robotic arm manipulation and mobile navigation
US20250196339A1 (en) Automated constrained manipulation
Wang et al. Human-machine collaboration in robotics: Integrating virtual tools with a collision avoidance concept using conglomerates of spheres

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21844471

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2021844471

Country of ref document: EP

Ref document number: 202180091621.3

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021844471

Country of ref document: EP

Effective date: 20230721