US20220193900A1 - Arm and Body Coordination
- Publication number: US20220193900A1 (Application US 17/318,435)
- Authority: US (United States)
- Prior art keywords: base, robot, arm, articulated arm, location
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B25J9/1664 — Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
- B25J9/162 — Mobile manipulator, movable base with manipulator arm mounted on it
- B25J13/08 — Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J9/1653 — Programme controls characterised by the control loop, parameters identification, estimation, stiffness, accuracy, error analysis
- B25J9/1661 — Programme controls characterised by programming, planning systems for manipulators, characterised by task planning, object-oriented languages
- B62D57/032 — Vehicles with ground-engaging propulsion means, e.g. walking members, with alternately or sequentially lifted supporting base and legs
- B25J5/00 — Manipulators mounted on wheels or on carriages
- B25J5/007 — Manipulators mounted on wheels or on carriages, mounted on wheels
- G05B2219/39172 — Vehicle, coordination between manipulator arm and its moving vehicle
- G05B2219/40298 — Manipulator on vehicle, wheels, mobile
Description
- This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 63/129,398, filed on Dec. 22, 2020. The disclosure of this prior application is considered part of the disclosure of this application and is hereby incorporated by reference in its entirety.
- This disclosure relates to coordinating arm and body controls in a robot.
- Robotic arms are increasingly being used in constrained or otherwise restricted environments to perform a variety of tasks or functions. These robotic arms often need to efficiently manipulate constrained objects, such as doors or switches, without requiring large computations. As robotic arms become more prevalent, there is a need for coordinating movements of the robot body based on commands or instructions of the robotic arm.
- One aspect of the disclosure provides a computer-implemented method that, when executed by data processing hardware of a robot having an articulated arm and a base, causes the data processing hardware to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration.
- Implementations of the disclosure may include one or more of the following optional features. In some implementations, instructing the base of the robot to move from the current base configuration to the anticipatory base configuration includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location. In some examples, instructing the base of the robot to move the workspace of the articulated arm from the first location to the second location includes instructing the base of the robot to change one of a location or a pose of the base.
- In some examples, the operations further include generating arm instructions for changing a configuration of the articulated arm within the workspace of the articulated arm. In some implementations, the operations further include receiving arm sensor data of the articulated arm associated with changing the configuration of the articulated arm, and instructing, based on the arm sensor data, the base of the robot to move from the anticipatory base configuration to a responsive base configuration. In some configurations, the base parameters include at least one of position coordinates for the robot base or balancing parameters for the robot base.
- In some examples, the task request includes a request to move an object outside of the workspace of the articulated arm at the first location. In some configurations, the task request includes instructing the base of the robot to follow a continuous path outside of the workspace of the articulated arm at the first location.
- Another aspect of the disclosure provides a robot including a base, an articulated arm coupled to the base, data processing hardware, and memory hardware storing instructions that, when executed by the data processing hardware, cause the data processing hardware to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration. This aspect may include one or more of the same optional features described above for the method aspect.
- Another aspect of the disclosure provides a computer program product encoded on a non-transitory computer readable storage medium of a robot including a base and an articulated arm coupled to the base, the computer readable storage medium including instructions that, when executed by a data processing apparatus, cause the data processing apparatus to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration. This aspect may likewise include one or more of the optional features described above.
- The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
- FIGS. 1A and 1B are schematic views of an example robot executing an arm controller and a body controller for performing tasks with an arm of the robot.
- FIGS. 1C and 1D are schematic views of another example robot executing an arm controller and a body controller for performing tasks with an arm of the robot.
- FIG. 2A is a schematic view of an example of an arm controller and base controller for a robot having a mobile base and an articulated arm.
- FIG. 2B is a schematic view of a robot arm controller and a remote device for controlling a robot.
- FIG. 3 is a flowchart of an example arrangement of operations for a method of constrained object manipulation for a robot arm.
- FIG. 4 is a schematic view of an example computing device that may be used to implement the systems and methods described herein.
- Like reference symbols in the various drawings indicate like elements.
- Many robots include multi-axis articulable appendages configured to execute complex movements for completing tasks, such as material handling or industrial operations (e.g., welding, gluing, and/or fastening). These appendages, also referred to as manipulators, typically include an end-effector or hand attached at the end of a series of appendage segments or portions, which are connected to each other by one or more appendage joints. The appendage joints cooperate to configure the appendage in a variety of poses P within an environment associated with the robot. Here, the term "pose" refers to the position and orientation of the appendage. For example, the position of the appendage may be defined by coordinates (x, y, z) of the appendage within a workspace (Cartesian space) associated with the arm, and the orientation may be defined by angles (Θx, Θy, Θz) of the appendage within the workspace. In use, the appendage may need to perform tasks that are located within the robot environment but outside of the current workspace (i.e., the reach) of the appendage. Thus, to perform the task, the robot may need to move within the environment to place the task target within reach of the appendage.
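- To make this six-dimensional pose representation concrete, the following is a minimal sketch in Python; the class and field names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Hypothetical pose record: position (x, y, z) in Cartesian workspace
    coordinates plus orientation angles about each axis."""
    x: float
    y: float
    z: float
    theta_x: float  # rotation about the x-axis
    theta_y: float  # rotation about the y-axis
    theta_z: float  # rotation about the z-axis

# Example: an end-effector pose 0.5 m forward, 0.2 m up, pitched slightly.
target = Pose(0.5, 0.0, 0.2, 0.0, 0.1, 0.0)
```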
- Referring to FIGS. 1A-1D, various examples of a robot 10, 10a, 10b are shown. Examples of the robot 10 include, but are not limited to, a quadrupedal robot 10a (FIGS. 1A-1B) and a wheeled robot 10b (FIGS. 1C-1D). Each robot 10 includes a base 12, 12a, 12b having a body 14, 14a, 14b and a plurality of legs 16, 16a, 16b in communication with a base control system 200. Each leg 16 may have an upper leg portion 18, 18a, 18b and a lower leg portion 20, 20a, 20b. The upper leg portion 18 may be attached to the body 14 at an upper joint 22, 22a, 22b (i.e., a hip joint), and the lower leg portion 20 may be attached to the upper leg portion 18 by an intermediate joint 24, 24a, 24b (i.e., a knee joint). Each leg 16 further includes a foot 26, 26a, 26b disposed at a distal end of the lower leg portion 20, which provides a ground-contacting point for the base 12 of the robot 10. In some examples (FIGS. 1A-1B), the foot 26a is a stationary contact pad. In other examples (FIGS. 1C-1D), the foot 26b includes a mobile element, such as a wheel, for allowing the robot base 12b to roll along a ground surface. In some other examples, the foot 26a is omitted and the distal end of the lower leg portion 20 provides the ground-contacting point.
- the articulated arm 30 may be interchangeably referred to as a manipulator, an appendage arm, or simply an appendage.
- the articulated arm 30 includes a first arm portion 32 , 32 a , 32 b rotatable relative to the body 14 and a second arm portion 34 , 34 a , 34 b rotatable relative to the first arm portion 32 .
- the articulated arm 30 may include more or less arm portions 32 , 34 without departing from the scope of the present disclosure.
- a third arm portion 36 , 36 a , 36 b of the articulated arm may be interchangeably coupled to a distal end of the second portion 22 b of the articulated arm 30 and may include one or more actuators 38 , 38 a , 38 b for gripping/grasping objects within the environment 2 .
- the actuators 38 a include an articulated grasp or claw for clamping an object.
- the actuator 38 b may include adhesive grip including magnetic or vacuum actuators.
- the articulated arm 30 includes a plurality of joints 40 , 42 , 44 disposed between adjacent ones of the arm portions 32 , 34 , 36 .
- the first arm portion 32 is attached to the body 14 of the robot 10 by a first joint 40 , 40 a , 40 b interchangeably referred to as a shoulder 40 .
- a second joint 42 , 42 a , 42 b connects the first arm portion 32 to the second arm portion 34 .
- the second joint 42 includes a single axis of rotation and may be interchangeably referred to as an elbow 42 of the articulated arm 30 .
- a third joint 44 , 44 a , 44 b connects the second arm portion 34 to the end effector 36 , and may be interchangeably referred to as a wrist 44 of the articulated arm 30 .
- the joints 40 , 42 , 44 cooperate to provide the articulated arm 30 with a number of degrees of freedom corresponding to the total number of axis of the joints 40 , 42 , 44 (e.g., five axes of rotation). While the illustrated example shows five-axis articulated arms 30 , the principles of the present disclosure are applicable to robotic arms having any number of axes.
- the arm portions 32 , 34 , 36 and joints 40 , 42 , 44 may be selectively reconfigured to position and orient the end effector 36 within a workspace 4 , as discussed below.
- The robot 10 also includes a vision system 50 with at least one imaging sensor or camera 52. Each sensor or camera 52 captures image data or sensor data of the environment 2 surrounding the robot 10 within an angle of view 54 and a field of view 56. The vision system 50 may be configured to move the field of view 56 by adjusting the angle of view 54 or by panning and/or tilting the camera 52 (either independently or via the robot 10) to move the field of view 56 in any direction. The vision system 50 may include multiple sensors or cameras 52 such that the vision system 50 captures a generally 360-degree field of view around the robot 10. In some examples, the camera(s) 52 of the vision system 50 include one or more stereo cameras (e.g., one or more RGBD stereo cameras). In other examples, the vision system 50 includes one or more ranging sensors, such as a scanning light-detection and ranging (LIDAR) sensor, a scanning laser-detection and ranging (LADAR) sensor, a light scanner, a time-of-flight sensor, or any other three-dimensional (3D) volumetric image sensor (or any such combination of sensors). The vision system 50 may also incorporate a VICON® sensor (e.g., for motion capture), perception sensors, a global positioning system (GPS) device, and/or other sensors for capturing information of the environment 2 in which the robot 10 is operating.
- The base 12 is connected to a base control system 200 configured to monitor and control operation of the robot base 12. While the base control system 200 is illustrated in FIG. 1A with respect to the example of the robot 10a, the robot 10b in FIG. 1C also includes the base control system 200. In some implementations, the robot base 12 is configured to operate autonomously and/or semi-autonomously. However, a user may also operate the base 12 by providing commands/directions to the base 12 via a remote device 60.
- The base control system 200 includes a base controller 202 (e.g., data processing hardware), memory hardware 104, actuators 206, one or more sensors 208, an inertial measurement unit (IMU) 210, and one or more power sources 212. The base control system 200 is not limited to the components shown, and may include additional or fewer components without departing from the scope of the present disclosure. The components may communicate via wireless or wired connections and may be distributed across multiple locations of the base 12. The base control system 200 interfaces with a remote computing device and/or a user. For instance, the base control system 200 may include various components for communicating with the base 12, such as a joystick, buttons, wired communication ports, and/or wireless communication ports for receiving inputs from the remote computing device and/or user, and providing feedback to the remote computing device and/or user.
- The base controller 202 corresponds to data processing hardware that may include one or more general purpose processors, digital signal processors, and/or application specific integrated circuits (ASICs). In some implementations, the base controller 202 is a purpose-built embedded device configured to perform specific operations with one or more subsystems of the base 12. The memory hardware 104 is in communication with the base controller 202 and may include one or more non-transitory computer-readable storage media such as volatile and/or non-volatile storage components. For instance, the memory hardware 104 may be associated with one or more physical devices in communication with one another and may include optical, magnetic, organic, or other types of memory or storage. The memory hardware 104 is configured to, inter alia, store instructions (e.g., computer-readable program instructions) that, when executed by the base controller 202, cause the base controller 202 to perform numerous operations, such as, without limitation, altering a pose of the robot base 12 for maintaining balance, maneuvering the robot base 12 across the ground surface, transporting objects, and/or executing a sit-to-stand routine.
- The base controller 202 may directly or indirectly interact with the actuators 206, the sensor(s) 208, the inertial measurement unit 210, and the power source(s) 212 for monitoring and controlling operation of the robot 10. The base controller 202 is configured to process data relating to the inertial measurement unit 210, the actuators 206, and the sensor(s) 208 for operating the robot base 12. The base controller 202 receives measurements from the inertial measurement unit 210 and the one or more sensors 208 disposed on the base 12, and instructs actuation of at least one of the actuators 206 to change a configuration (i.e., a location L12 and/or pose P12) of the base 12.
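- The sense-and-actuate cycle described above can be pictured as a simple loop. This is an illustrative sketch only; the patent does not specify an API for the base controller 202, so every name below is an assumption.

```python
def compute_actuator_commands(inertial, sensor_readings, target_configuration):
    """Placeholder policy mapping measurements to per-actuator commands;
    a real controller would compute corrections toward the target here."""
    return {}

class BaseControllerSketch:
    """Stand-in for the base controller 202 with its IMU 210, sensors 208,
    and actuators 206."""
    def __init__(self, imu, sensors, actuators):
        self.imu = imu
        self.sensors = sensors
        self.actuators = actuators

    def step(self, target_configuration):
        inertial = self.imu.read()                   # IMU measurement
        readings = [s.read() for s in self.sensors]  # base sensor data
        commands = compute_actuator_commands(inertial, readings, target_configuration)
        for actuator, command in commands.items():
            actuator.apply(command)                  # change base location/pose
```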
- The actuators 206 of the base control system 200 may include, without limitation, one or more of pneumatic actuators, hydraulic actuators, electro-mechanical actuators, or the like. Furthermore, the actuators 206 may be configured as linear actuators, rotary actuators, or a combination thereof. The actuators 206 may be disposed on the robot 10 at various locations to effect motion of the base 12. For example, each of the legs 16 of the robot 10 may include a plurality of actuators 206 to change a configuration of one or more joints 22, 24.
- The sensor(s) 208 of the base control system 200 may include, without limitation, one or more of force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors (linear and/or rotational position sensors), motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras. The sensors 208 may be disposed on the base 12 at various locations such as the body 14 and/or the legs 16, and are configured to provide corresponding base sensor data to the base controller 202 for monitoring and controlling operation of the robot 10 within the environment 2. The base controller 202 is configured to receive base sensor data from sensors 208 physically separated from the robot 10. For example, the base controller 202 may receive sensor data from a proximity sensor disposed on a target object, or from a remote sensor within the environment of the robot 10. The base sensor data from the sensors 208 may allow the base controller 202 to evaluate conditions for maneuvering the robot 10, altering a pose of the base 12, and/or actuating various actuators 206 for moving/rotating mechanical components such as one of the legs 16. In some examples, the base control system 200 employs one or more force sensors to measure load on the actuators 206 that move the base 12. The sensors 208 may further include position sensors to sense states of extension, retraction, and/or rotation of the body 14 and/or the legs 16.
- The inertial measurement unit 210 is configured to measure an inertial measurement indicative of a movement of the robot 10 that results in a change to the pose P12 of the robot base 12. The inertial measurement measured by the inertial measurement unit 210 may indicate a translation or shift of the center of mass of the robot 10. The translation or shift of the center of mass may occur along one or more of the fore-aft axis (x-axis), the lateral axis (y-axis), or the vertical axis (z-axis). The inertial measurement unit 210 may detect and measure an acceleration, a tilt, a roll, a pitch, a rotation, or a yaw of the robot 10, as the inertial measurement, using an initial pose as an inertial reference frame.
- The base control system 200 includes one or more power sources 212 configured to power various components of the robot 10. The power sources 212 employed by the robot 10 may include, without limitation, a hydraulic system, an electrical system, energy storage device(s) (e.g., batteries), and/or pneumatic devices. One or more energy storage devices may provide power to various components (e.g., actuators 206) of the base 12. The body 14 defines a compartment for storing and retaining energy storage devices. The energy storage devices may be chargeable via wired connections or wireless (e.g., induction) connections to an external power source. Energy storage devices could also be charged using solar energy (e.g., generated via solar panels disposed on the robot 10). The energy storage devices are removable so that depleted energy storage devices can be replaced with fully-charged energy storage devices. Gasoline engines could also be employed. A hydraulic system may employ hydraulic motors and cylinders for transmitting pressurized fluid for operating various components of the robot 10.
- The robot 10 includes an arm control system 100 connected to the arm 30 and operating independently of the base control system 200. The arm control system 100 includes an arm controller 102 (e.g., data processing hardware), memory hardware 104, actuators 106, and one or more sensors 108. The memory hardware 104, actuators 106, and sensors 108 may include similar components and configurations as those described above with respect to the memory hardware 104, actuators 206, and sensors 208 of the base control system 200. One or more of the memory hardware 104, actuators 106, 206, and sensors 108, 208 may be shared between the control systems 100, 200. Portions of the base controller 202 and the arm controller 102 may execute on a remote device 60 in communication with the robot 10. For example, the remote device 60 may provide commands 62 to the robot 10 to move/control the base 12 and/or the articulated arm 30 for performing a task.
- The sensor(s) 108 of the arm control system 100 may include, without limitation, one or more of force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors (linear and/or rotational position sensors), motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras. The sensors 108 may be disposed on the arm 30 at various locations such as the arm portions 32, 34, 36 and/or the joints 40, 42, 44, and are configured to provide corresponding arm sensor data 109 to the arm controller 102 and/or the base controller 202 for monitoring and controlling operation of the robot 10 within the environment 2. The arm controller 102 is configured to receive the arm sensor data 109 from sensors 108 physically separated from the robot 10. For example, the arm controller 102 may receive arm sensor data 109 from a proximity sensor disposed on a target object, or from a remote sensor within the environment of the robot 10.
- The arm controller 102 of the robot 10 controls moving the articulated arm 30 between arm poses P30 within the arm workspace 4. For instance, the articulated arm 30 may need to move from a start pose P30 to a target pose P30 when the robot 10 is executing the task request 62. For example, where the task request 62 instructs the robot 10 to open a door, the robot arm controller 102 will need to move the articulated arm 30 from a first arm pose P30 where the door is in a closed position to a second arm pose P30 where the door is in an open position.
- Movements and poses of the robot 10 and robot appendages 16, 30 may be defined in terms of a robot environment 2 based on a Cartesian coordinate system. For example, the robot environment 2 may be defined by six dimensions including the translational axes x, y, z and the rotational axes Θx, Θy, Θz. The pose P30 of the arm 30 can be selectively reconfigured to change positions and orientations of the end effector 36 within a workspace 4 of the articulated arm 30. Here, the workspace 4 of the end effector 36 is provided relative to the base 12 of the robot 10. In other words, the end effector 36 of the articulated arm 30 can reach any position within the workspace 4 without adjusting a pose P12 of the robot base 12. However, the robot 10 may need to perform tasks that are outside of the arm workspace 4. For example, the robot 10 may need to move a box 6b or apply a mark 6a that falls outside of the arm workspace 4. In these situations, the robot 10 must coordinate movements of the arm 30 and the base 12 to complete the task, whereby the base 12 of the robot 10 must move within the environment 2 to allow the articulated arm 30 to reach the location of the task 6a, 6b.
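- The decision of whether the base must move before the arm can act reduces to a reachability test. The sketch below approximates the workspace 4 as a disc of fixed reach around the base; the real workspace geometry is not specified by the patent, so the shape and the numbers are assumptions.

```python
import math

def workspace_contains(base_xy, reach, target_xy):
    """Return True if a target point lies within an assumed disc-shaped
    workspace of radius `reach` centered on the base."""
    return math.hypot(target_xy[0] - base_xy[0],
                      target_xy[1] - base_xy[1]) <= reach

base_xy, reach = (0.0, 0.0), 1.0   # illustrative values
task_xy = (2.5, 0.5)               # e.g., the box 6b
if not workspace_contains(base_xy, reach, task_xy):
    print("task outside workspace: base must reposition before arm executes")
```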
- The arm controller 102 includes a task manager 110 having a task interpreter 120 configured to receive or obtain task requests 62 from the remote device 60 and to generate translated task requests 122 executable by the arm controller 102 to move the robot arm 30 and/or base 12. The task manager 110 further includes a task instructor 130 including an arm instructor 140 and a base instructor 150 configured to generate unique task instructions 142 for the arm 30 and parameters 152 for the base 12 using the translated task request 122. The task manager 110 generally receives a task request 62 and generates a first set of task instructions 142 to be executed by the arm 30 within the arm workspace 4 and a second set of parameters 152 to be executed by the base 12 for moving the workspace 4 within the robot environment 2.
- FIG. 2B shows an example operation of the task interpreter 120 generating the translated task request 122 based on the task request 62 received or obtained from the remote device 60. Task requests 62 may be autonomously generated by the remote device 60 based on a program. A user may also engage with a user interface 64 in communication with the remote device 60 to select task characteristics 68, 68a, 68b for the task request 62. For example, the user interface 64 may include one or more buttons 66a for selecting task parameters (e.g., speed, force, direction, etc.). The user interface 64 also includes a task location window 66b for identifying the location of a path 6a or object 6b associated with the task request 62. The task location window 66b may be based on the image data from the camera 52 of the robot 10. Thus, a user can select a location within the field of view 56 of the robot 10. As shown, the task location window 66b may present a graphical representation of the arm workspace 4 relative to the field of view 56 of the robot 10, allowing an operator to visualize the position of the task 6a, 6b relative to the current position of the workspace 4. For example, the task location window 66b shows a path marking 6a and a box 6b that are positioned outside of the current workspace 4 of the articulated arm 30.
- The task request 62 generated by the remote device 60 may not be directly executable by the robot 10. Accordingly, the task interpreter 120 of the task manager 110 receives the task request 62 and translates the task characteristics 66 into translational and/or rotational coordinates based on the robot environment 2. The translated task request 122 is then sent to the task instructor 130.
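- As a rough illustration of this translation step, the sketch below maps a selection in the task location window 66b into environment coordinates. The pixel-to-world mapping and all names here are hypothetical; the patent only states that task characteristics are translated into translational and/or rotational coordinates.

```python
class ToyCameraModel:
    """Stand-in for the mapping from image coordinates (camera 52) to the
    environment frame; a real system would use calibrated camera geometry."""
    def pixel_to_world(self, pixel):
        u, v = pixel
        return (u * 0.01, v * 0.01, 0.0)  # toy scaling for illustration

def translate_task(selection_pixel, camera_model):
    """Sketch of the task interpreter 120: turn a UI selection into a
    translated task request with environment-frame coordinates."""
    x, y, z = camera_model.pixel_to_world(selection_pixel)
    return {"target": (x, y, z), "orientation": (0.0, 0.0, 0.0)}

translated = translate_task((250, 40), ToyCameraModel())
```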
- The task instructor 130 includes the arm instructor 140 and the base instructor 150. The task instructor 130 receives the translated task request 122 from the task interpreter 120, and the arm instructor 140 and base instructor 150 cooperate to generate respective arm instructions 142 and base parameters 152 for performing the task 6a, 6b. The instructions 142 and parameters 152 are generated from the perspective of the end effector 36 and the current location of the workspace 4. In other words, when the arm controller 102 determines that the location of the task 6a, 6b is not within the current location of the workspace 4, the arm controller 102 determines the necessary base parameters 152 for repositioning the workspace 4 at the location of the task 6a, 6b and generates arm instructions 142 for performing the task within the new workspace 4 location.
- The arm instructor 140 may be described as generating micro-level arm instructions 142 for positioning the end effector 36 within the arm workspace 4. For example, the arm instructor 140 generates arm instructions 142 including micro-position coordinates 142a (x, y, z), orientation coordinates 142b (Θx, Θy, Θz), and end effector commands 142c (e.g., actuate, deactuate).
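- The three-part arm instruction described above can be pictured as a small record. The grouping mirrors 142a/142b/142c, but the container itself is an illustrative assumption.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ArmInstructions:
    """Hypothetical container for micro-level arm instructions 142."""
    position: Tuple[float, float, float]     # 142a: (x, y, z) in the workspace
    orientation: Tuple[float, float, float]  # 142b: (theta_x, theta_y, theta_z)
    end_effector_command: str                # 142c: e.g., "actuate" or "deactuate"

grasp = ArmInstructions((0.4, 0.1, 0.3), (0.0, 0.0, 1.57), "actuate")
```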
- The base instructor 150 may be described as generating macro-level base parameters 152 for controlling locomotion of the robot 10 to move the arm workspace 4 within the environment 2. As shown in FIG. 2A, the base instructor 150 generates base parameters 152 including macro-position coordinates 152a (x, y, z) and balance parameters 152b for the robot base 12. The macro-position coordinates 152a may be associated with a location of a center of mass of the robot base 12 or with the location of the first joint 40 of the robot arm 30. The base instructor 150 sends the base parameters 152 to the base controller 202. The base controller 202 may then adjust a location L12 or pose P12 of the robot base 12 to move the arm workspace 4 to the task 6a, 6b. The base parameters 152 may include balance parameters 152b identifying arm forces anticipated by the task instructor 130 in response to the arm instructions 142. The balance parameters 152b may indicate the direction and magnitude of the arm force for consideration and compensation (i.e., counter-balancing) by the robot base 12.
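- Likewise, the macro-level base parameters can be pictured as a record carrying a target stance and an anticipated force, with a toy counter-balancing rule that shifts the base against the expected load. The proportional gain and the planar-offset simplification are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BaseParameters:
    """Hypothetical container for macro-level base parameters 152."""
    position: Tuple[float, float, float]       # 152a: macro-position coordinates
    balance_force: Tuple[float, float, float]  # 152b: anticipated arm force (N)

def counterbalance_offset(balance_force, gain=0.002):
    """Toy rule: shift the base center of mass opposite the anticipated
    arm force, proportionally to its magnitude (illustrative only)."""
    fx, fy, _ = balance_force
    return (-gain * fx, -gain * fy, 0.0)

params = BaseParameters((1.0, 0.0, 0.0), (40.0, 0.0, 0.0))
print(counterbalance_offset(params.balance_force))  # approximately (-0.08, -0.0, 0.0)
```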
- The base parameters 152 are evaluated by a base location manager 220 and a base posture manager 230 to generate base instructions 254 for moving and/or configuring the robot base 12 according to the base parameters 152. The base location manager 220 will generate base instructions 254a for moving the location L12 or pose P12 of the robot base 12 to move the robot workspace 4. The base posture manager 230 will generate base instructions 254b for changing the pose P12 of the robot base 12 to counteract forces applied by or to the end effector 36 based on the balance parameters 152b generated by the task instructor 130.
- The base controller 202 receives arm sensor data 109 from the arm sensors 108. The base controller 202 evaluates the arm sensor data 109 from the arm sensors 108 to generate or modify the base instructions 254 for moving and/or configuring the robot base 12. The base controller 202 may also generate responsive base instructions 254 to move the base 12 to a responsive base configuration to account for actual forces measured at the robot arm 30. For example, where the task is to open a door, the base instructions 254 may anticipatorily instruct the base 12 to adjust the posture of the base 12 to an anticipatory base configuration to counteract the expected force to be applied by the arm 30 to open the door. Additionally, the base controller 202 may evaluate the arm sensor data 109 received from the arm sensors 108 during the task to further adjust the location or posture of the base 12. Thus, the base controller 202 can modify or tune the base instructions 254 to account for variables (e.g., different forces) not anticipated by the base instructor 150 when generating the base parameters 152.
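- This anticipatory-then-responsive behavior amounts to correcting by the residual between the expected and the measured force. A minimal sketch, assuming a simple proportional correction (the gain and the vector representation are illustrative):

```python
def posture_correction(anticipated_force, measured_force, gain=0.5):
    """Blend balance parameters 152b (anticipated) with arm sensor data 109
    (measured) and return a per-axis posture correction."""
    residual = [m - a for m, a in zip(measured_force, anticipated_force)]
    return [gain * r for r in residual]

# Example: the door resists 15 N harder along x than anticipated.
print(posture_correction([40.0, 0.0, 0.0], [55.0, 0.0, 0.0]))
# [7.5, 0.0, 0.0] -> adjust posture further to counteract the extra force
```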
- The robot 10 of the present disclosure takes advantage of discrete control systems 100, 200 to coordinate movements of the robot 10 associated with performing a task using the arm 30. In particular, the robot 10 of the present disclosure manages the robot 10 from the perspective of the arm controller 102: the arm controller 102 determines the parameters for executing the task and then segregates the task into arm instructions 142 and base parameters 152. The base controller 202 then evaluates the base parameters 152 and determines appropriate base instructions 254 for complying with the base parameters 152 provided by the arm controller 102, thereby minimizing the computational load on the base controller 202. Additionally, the arm control system 100 provides arm sensor data 109 from the arm sensors 108 to the base controller 202, allowing the base controller 202 to intelligently adjust the location or posture of the base 12 to account for measured forces applied to or by the arm 30.
- In contrast, conventional robotic systems only generate responsive actions at a base using the base sensor data received from the base sensors. In other words, when a force is applied to an arm of a conventional robotic system, the base is not informed of the magnitude or direction of the applied force by the arm sensors and only observes the forces from the perspective of the sensors of the base. Accordingly, the base can only react in response to base sensor data obtained from sensors within the base (e.g., leg sensors, IMU) to address instability caused by the force applied at the arm. The configuration of the present disclosure thus allows the robot base 12 and the articulated arm 30 to be provided as modular components and minimizes computational requirements for the base controller 202.
- Referring to FIGS. 1A and 1B, a first example of the robot 10a including the arm controller 102 and the base controller 202 is shown. Here, the user has generated a task request 62 with the user interface 64 (FIG. 2B) for marking a continuous path 6a along the ground surface of the robot environment 2 using the end effector 36. For example, the end effector 36 may grasp or include a marking device (e.g., paint, chalk) that can be moved along the ground surface to apply the mark. A portion of the mark 6a is located outside of the arm workspace 4 associated with the current location L12 and/or pose P12 of the robot base 12. Accordingly, the robot 10 must change locations L12 and/or reconfigure poses P12 to move the workspace 4. In particular, the robot base 12 must travel along the path of the mark 6a and lower the body 14 so that the arm 30 can reach the ground surface. The task instructor 130 of the arm controller 102 generates, via the base instructor 150, base parameters 152 indicating the macro-positional coordinates 152a and balance parameters 152b that must be accommodated by the base 12 to position and move the workspace 4 of the arm 30 along the path 6a. Using the base parameters 152, the base controller 202 instructs the robot base 12 to move along the path 6a.
- In the example of FIGS. 1C and 1D, the robot 10b includes the arm control system 100 and the base control system 200. Here, the arm controller 102 receives a task request 62 from the remote device 60 for moving an object 6b (e.g., a box) that is located outside of the current location of the robot workspace 4. The arm instructor 140 of the arm controller 102 generates arm instructions 142 for moving the arm 30b within the workspace 4, while the base instructor 150 of the arm controller 102 sends base parameters 152 to the base controller 202 for repositioning the workspace 4 within the environment 2. The base controller 202 may evaluate the base parameters 152 received from the arm controller 102 and determine that the location L12 and/or the pose P12 of the base 12 need to be adjusted to allow the articulated arm 30 to reach the object 6b or to move the object 6b to a new location. Additionally or alternatively, the arm controller 102 may receive task requests 62 for moving objects 6b that are within the arm workspace 4 to a location outside of the arm workspace 4.
- FIG. 3 is a flowchart of an example arrangement of operations for a method 300 for coordinating robot base 12 and arm 30 tasks using an arm controller 102 and a base controller 202. The method 300 may be a computer-implemented method executed by data processing hardware of the articulated arm 30, which causes the data processing hardware to perform operations. The method 300 includes determining a first location of a workspace of the articulated arm associated with a current configuration of the base of the robot. The method 300 further includes, at operation 304, receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The method 300 also includes generating base parameters associated with the task request. The method 300 further includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location using the generated base parameters.
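- Stitched together, the operations of the method 300 follow the sequence below. Every method on the hypothetical `robot` object is a stand-in for the controllers described above, not an API defined by the patent.

```python
def method_300(robot, task_request):
    """Sketch of the flow of FIG. 3: locate the workspace, interpret the
    task, plan base parameters, then move the base to shift the workspace."""
    first_location = robot.current_workspace_location()          # determine workspace location
    translated = robot.task_interpreter.translate(task_request)  # receive/translate task request
    base_parameters = robot.base_instructor.plan(translated)     # generate base parameters
    robot.base_controller.move(base_parameters)                  # move workspace to second location
```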
- FIG. 4 is a schematic view of an example computing device 400 that may be used to implement the systems and methods described in this document. The computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The components shown here, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
- The computing device 400 includes a processor 410, memory 420, a storage device 430, a high-speed interface/controller 440 connecting to the memory 420 and high-speed expansion ports 450, and a low-speed interface/controller 460 connecting to a low-speed bus 470 and a storage device 430. Each of the components 410, 420, 430, 440, 450, and 460 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 410 can process instructions for execution within the computing device 400, including instructions stored in the memory 420 or on the storage device 430 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as a display 480 coupled to the high-speed interface 440. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
- The memory 420 stores information non-transitorily within the computing device 400. The memory 420 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 420 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the computing device 400. Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM)/programmable read-only memory (PROM)/erasable programmable read-only memory (EPROM)/electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM), as well as disks or tapes.
- The storage device 430 is capable of providing mass storage for the computing device 400. In some implementations, the storage device 430 is a computer-readable medium. In various different implementations, the storage device 430 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 420, the storage device 430, or memory on processor 410.
- The high speed controller 440 manages bandwidth-intensive operations for the computing device 400, while the low speed controller 460 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only. In some implementations, the high-speed controller 440 is coupled to the memory 420, the display 480 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 450, which may accept various expansion cards (not shown). In some implementations, the low-speed controller 460 is coupled to the storage device 430 and a low-speed expansion port 490. The low-speed expansion port 490, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
- The computing device 400 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 400a or multiple times in a group of such servers 400a, as a laptop computer 400b, or as part of a rack server system 400c.
- Various implementations of the systems and techniques described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- The processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, one or more aspects of the disclosure can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user and optionally a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
- This disclosure relates to coordinating arm and body controls in a robot.
- Robotic arms are increasingly being used in constrained or otherwise restricted environments to perform a variety of tasks or functions. These robotic arms often need to efficiently manipulate constrained objects, such as doors or switches, without requiring large computations. As robotic arms become more prevalent, there is a need for coordinating movements of the robot body based on commands or instructions of the robotic arm.
- One aspect of the disclosure provides computer-implemented method that, when executed by data processing hardware of a robot having an articulated arm and a base, causes the data processing hardware to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration.
- Implementations of the disclosure may include one or more of the following optional features. In some implementations, instructing the base of the robot to move from the current base configuration to the anticipatory base configuration includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location. In some examples, instructing the base of the robot to move the workspace of the articulated arm from the first location to the second location includes instructing the base of the robot to change one of a location or a pose of the base.
- In some examples, the operations further include generating arm instructions for changing a configuration of the articulated arm within the workspace of the articulated arm. In some implementations, the operations further include receiving arm sensor data of the articulated arm associated with changing the configuration of the articulated arm, and instructing, based on the arm sensor data, the base of the robot to move from the anticipatory base configuration to a responsive base configuration. In some configurations, the base parameters include at least one of position coordinates for the robot base or balancing parameters for the robot base.
- In some examples, the task request includes a request to move an object outside of the workspace of the articulated arm at the first location. In some configurations, the task request includes instructing the base of the robot to follow a continuous path outside of the workspace of the articulated arm at the first location.
- Another aspect of the disclosure provides a robot including a base, an articulated arm coupled to the base, data processing hardware and memory hardware storing instructions that, when executed by the data processing hardware, cause the data processing hardware to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration.
- Implementations of the disclosure may include one or more of the following optional features. In some implementations, instructing the base of the robot to move from the current base configuration to the anticipatory base configuration includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location. In some examples, instructing the base of the robot to move the workspace of the articulated arm from the first location to the second location includes instructing the base of the robot to change one of a location or a pose of the base.
- In some examples, the operations further include generating arm instructions for changing a configuration of the articulated arm within the workspace of the articulated arm. In some implementations, the operations further include receiving arm sensor data of the articulated arm associated with changing the configuration of the articulated arm, and instructing, based on the arm sensor data, the base of the robot to move from the anticipatory base configuration to a responsive base configuration. In some configurations, the base parameters include at least one of position coordinates for the robot base or balancing parameters for the robot base.
- In some examples, the task request includes a request to move an object outside of the workspace of the articulated arm at the first location. In some configurations, the task request includes instructing the base of the robot to follow a continuous path outside of the workspace of the articulated arm at the first location.
- Another aspect of the disclosure provides a computer program product encoded on a non-transitory computer readable storage medium of a robot including a base and an articulated arm coupled to the base, the computer readable storage medium including instructions that when executed by a data processing apparatus cause the data processing apparatus to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration.
- Implementations of the disclosure may include one or more of the following optional features. In some implementations, instructing the base of the robot to move from the current base configuration to the anticipatory base configuration includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location. In some examples, instructing the base of the robot to move the workspace of the articulated arm from the first location to the second location includes instructing the base of the robot to change one of a location or a pose of the base.
- In some examples, the operations further include generating arm instructions for changing a configuration of the articulated arm within the workspace of the articulated arm. In some implementations, the operations further include receiving arm sensor data of the articulated arm associated with changing the configuration of the articulated arm, and instructing, based on the arm sensor data, the base of the robot to move from the anticipatory base configuration to a responsive base configuration.
- The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
-
FIGS. 1A and 1B are schematic views of an example robot executing an arm controller and a body controller for performing tasks with an arm of the robot. -
FIGS. 1C and 1D are schematic views of another example robot executing an arm controller and a body controller for performing tasks with an arm of the robot. -
FIG. 2A is a schematic view of an example of an arm controller and base controller for a robot having a mobile base and an articulated arm. -
FIG. 2B is a schematic view of a robot arm controller and a remote device for controlling a robot. -
FIG. 3 is a flowchart of an example arrangement of operations for a method of constrained object manipulation for a robot arm. -
FIG. 4 is a schematic view of an example computing device that may be used to implement the systems and methods described herein. - Like reference symbols in the various drawings indicate like elements.
- Many robots include multi-axis articulable appendages configured to execute complex movements for completing tasks, such as material handling or industrial operations (e.g., welding, gluing, and/or fastening). These appendages, also referred to as manipulators, typically include an end-effector or hand attached at the end of a series of appendage segments or portions, which are connected to each other by one or more appendage joints. The appendage joints cooperate to configure the appendage in a variety of poses P within an environment associated with the robot. Here, the term "pose" refers to the position and orientation of the appendage. For example, the position of the appendage may be defined by coordinates (x, y, z) of the appendage within a workspace (Cartesian space) associated with the arm, and the orientation may be defined by angles (Θx, Θy, Θz) of the appendage within the workspace. In use, the appendage may need to perform tasks that are located within the robot environment but outside of the current workspace (i.e., the reach) of the appendage. Thus, to perform the task, the robot may need to move within the environment to place the task target within reach of the appendage.
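- To make the pose convention concrete, the following is a minimal sketch in Python of a six-dimensional pose holding the position coordinates (x, y, z) and the orientation angles (Θx, Θy, Θz). All names are chosen here for illustration; the disclosure does not prescribe any particular data structure:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Six-dimensional pose: position (x, y, z) plus orientation
    (theta_x, theta_y, theta_z) in a Cartesian workspace."""
    x: float
    y: float
    z: float
    theta_x: float
    theta_y: float
    theta_z: float

    def distance_to(self, other: "Pose") -> float:
        """Euclidean distance between the positions of two poses,
        e.g. to test whether a target is near the edge of reach."""
        return math.dist((self.x, self.y, self.z),
                         (other.x, other.y, other.z))
```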
- Referring to FIGS. 1A-1D, various examples of a robot 10, 10a, 10b are shown. Examples of the robot 10 include, but are not limited to, a quadrupedal robot 10a (FIGS. 1A-1B) and a wheeled robot 10b (FIGS. 1C-1D). Each robot 10 includes a base 12, 12a, 12b having a body 14, 14a, 14b and a plurality of legs 16, 16a, 16b in communication with a base control system 200. Each leg 16 may have an upper leg portion 18, 18a, 18b and a lower leg portion 20, 20a, 20b. The upper leg portion 18 may be attached to the body 14 at an upper joint 22, 22a, 22b (i.e., a hip joint) and the lower leg portion 20 may be attached to the upper leg portion 18 by an intermediate joint 24, 24a, 24b (i.e., a knee joint). Each leg 16 further includes a foot 26, 26a, 26b disposed at a distal end of the lower leg portion 20, which provides a ground-contacting point for the base 12 of the robot 10. In some examples (FIGS. 1A-1B), the foot 26a is a stationary contact pad. In other examples (FIGS. 1C-1D), the foot 26b includes a mobile element, such as a wheel, for allowing the robot base 12b to roll along a ground surface. In some other examples, the foot 26a is omitted and the distal end of the lower leg portion 20 provides the ground-contacting point.
- In some implementations, the robot 10 further includes one or more appendages, such as an articulated arm 30, 30a, 30b or manipulator disposed on the body 14 and configured to move relative to the body 14. Moreover, the articulated arm 30 may be interchangeably referred to as a manipulator, an appendage arm, or simply an appendage. In the example shown, the articulated arm 30 includes a first arm portion 32, 32a, 32b rotatable relative to the body 14 and a second arm portion 34, 34a, 34b rotatable relative to the first arm portion 32. However, the articulated arm 30 may include more or fewer arm portions 32, 34 without departing from the scope of the present disclosure. A third arm portion 36, 36a, 36b of the articulated arm, referred to as an end effector 36 or hand 36, may be interchangeably coupled to a distal end of the second arm portion 34 of the articulated arm 30 and may include one or more actuators 38, 38a, 38b for gripping/grasping objects within the environment 2. In the example of FIGS. 1A-1B, the actuators 38a include an articulated grasp or claw for clamping an object. However, in other examples (FIGS. 1C-1D), the actuator 38b may include an adhesive grip, such as magnetic or vacuum actuators.
- The articulated arm 30 includes a plurality of joints 40, 42, 44 disposed between adjacent ones of the arm portions 32, 34, 36. In the examples shown, the first arm portion 32 is attached to the body 14 of the robot 10 by a first joint 40, 40a, 40b interchangeably referred to as a shoulder 40. A second joint 42, 42a, 42b connects the first arm portion 32 to the second arm portion 34. In both examples, the second joint 42 includes a single axis of rotation and may be interchangeably referred to as an elbow 42 of the articulated arm 30. A third joint 44, 44a, 44b connects the second arm portion 34 to the end effector 36, and may be interchangeably referred to as a wrist 44 of the articulated arm 30. Accordingly, the joints 40, 42, 44 cooperate to provide the articulated arm 30 with a number of degrees of freedom corresponding to the total number of axes of the joints 40, 42, 44 (e.g., five axes of rotation). While the illustrated example shows five-axis articulated arms 30, the principles of the present disclosure are applicable to robotic arms having any number of axes. The arm portions 32, 34, 36 and joints 40, 42, 44 may be selectively reconfigured to position and orient the end effector 36 within a workspace 4, as discussed below.
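- The way joint rotations accumulate along a serial chain such as the shoulder-elbow-wrist arrangement described above can be illustrated with a planar forward-kinematics sketch. The link lengths and angles below are hypothetical values; the disclosure does not specify the kinematics of the arm 30:

```python
import math

def planar_forward_kinematics(joint_angles, link_lengths):
    """Accumulate joint rotations along a planar serial chain and sum
    the projected link vectors to locate the end effector."""
    x = y = heading = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle                 # each joint adds a rotation
        x += length * math.cos(heading)  # project the link
        y += length * math.sin(heading)
    return x, y, heading

# Example: shoulder, elbow, and wrist angles (rad) with assumed link lengths (m).
print(planar_forward_kinematics([0.3, -0.6, 0.2], [0.4, 0.3, 0.1]))
```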
- In some examples, the robot 10 also includes a vision system 50 with at least one imaging sensor or camera 52. Each sensor or camera 52 captures image data or sensor data of the environment 2 surrounding the robot 10 within an angle of view 54 and a field of view 56. The vision system 50 may be configured to move the field of view 56 by adjusting the angle of view 54 or by panning and/or tilting the camera 52 (either independently or via the robot 10) to move the field of view 56 in any direction. Alternatively, the vision system 50 may include multiple sensors or cameras 52 such that the vision system 50 captures a generally 360-degree field of view around the robot 10. The camera(s) 52 of the vision system 50, in some implementations, include one or more stereo cameras (e.g., one or more RGBD stereo cameras). In other examples, the vision system 50 includes one or more radar sensors such as a scanning light-detection and ranging (LIDAR) sensor, or a scanning laser-detection and ranging (LADAR) sensor, a light scanner, a time-of-flight sensor, or any other three-dimensional (3D) volumetric image sensor (or any such combination of sensors). The vision system 50 may also incorporate a VICON® sensor (e.g., for motion capture), perception sensors, a global positioning system (GPS) device, and/or other sensors for capturing information of the environment 2 in which the robot 10 is operating.
- With continued reference to FIG. 1A, the base 12 is connected to a base control system 200 configured to monitor and control operation of the robot base 12. While the base control system 200 is illustrated in FIG. 1A with respect to the example of the robot 10a, the robot 10b in FIG. 1C also includes the base control system 200. In some implementations, the robot base 12 is configured to operate autonomously and/or semi-autonomously. However, a user may also operate the base 12 by providing commands/directions to the base 12 via a remote device 60. In the example shown, the base control system 200 includes a base controller 202 (e.g., data processing hardware), memory hardware 104, actuators 206, one or more sensors 208, an inertial measurement unit (IMU) 210, and one or more power sources 212. The base control system 200 is not limited to the components shown, and may include additional or fewer components without departing from the scope of the present disclosure. The components may communicate via wireless or wired connections and may be distributed across multiple locations of the base 12. In some configurations, the base control system 200 interfaces with a remote computing device and/or a user. For instance, the base control system 200 may include various components for communicating with the base 12, such as a joystick, buttons, wired communication ports, and/or wireless communication ports for receiving inputs from the remote computing device and/or user, and providing feedback to the remote computing device and/or user.
- The base controller 202 corresponds to data processing hardware that may include one or more general purpose processors, digital signal processors, and/or application specific integrated circuits (ASICs). In some implementations, the base controller 202 is a purpose-built embedded device configured to perform specific operations with one or more subsystems of the base 12. The memory hardware 104 is in communication with the base controller 202 and may include one or more non-transitory computer-readable storage media such as volatile and/or non-volatile storage components. For instance, the memory hardware 104 may be associated with one or more physical devices in communication with one another and may include optical, magnetic, organic, or other types of memory or storage. The memory hardware 104 is configured to, inter alia, store instructions (e.g., computer-readable program instructions) that, when executed by the base controller 202, cause the base controller 202 to perform numerous operations, such as, without limitation, altering a pose of the robot base 12 for maintaining balance, maneuvering the robot base 12 across the ground surface, transporting objects, and/or executing a sit-to-stand routine.
- The base controller 202 may directly or indirectly interact with the actuators 206, the sensor(s) 208, the inertial measurement unit 210, and the power source(s) 212 for monitoring and controlling operation of the robot 10. The base controller 202 is configured to process data relating to the inertial measurement unit 210, the actuators 206, and the sensor(s) 208 for operating the robot base 12. The base controller 202 receives measurements from the inertial measurement unit 210 and the one or more sensors 208 disposed on the base 12, and instructs actuation of at least one of the actuators 206 to change a configuration (i.e., a location L12 and/or pose P12) of the base 12.
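- The receive-measurements, instruct-actuators loop just described can be sketched as a proportional step toward a commanded configuration. This is illustrative only; the actual controller also folds in the inertial measurement and leg-sensor feedback, which is omitted here, and all names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class BaseConfiguration:
    x: float    # fore-aft location
    y: float    # lateral location
    yaw: float  # heading of the base

def base_control_step(current: BaseConfiguration,
                      target: BaseConfiguration,
                      gain: float = 0.5) -> BaseConfiguration:
    """One control iteration nudging the base from its current
    configuration toward the target configuration."""
    return BaseConfiguration(
        x=current.x + gain * (target.x - current.x),
        y=current.y + gain * (target.y - current.y),
        yaw=current.yaw + gain * (target.yaw - current.yaw))

# Step the base halfway toward a commanded location and heading.
print(base_control_step(BaseConfiguration(0, 0, 0),
                        BaseConfiguration(1.0, 0.5, 0.2)))
```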
- The actuators 206 of the base control system 200 may include, without limitation, one or more of pneumatic actuators, hydraulic actuators, electro-mechanical actuators, or the like. Furthermore, the actuators 206 may be configured as linear actuators, rotary actuators, or a combination thereof. The actuators 206 may be disposed on the robot 10 at various locations to effect motion of the base 12. For example, each of the legs 16 of the robot 10 may include a plurality of actuators 206 to change a configuration of one or more joints 22, 24.
- The sensor(s) 208 of the base control system 200 may include, without limitation, one or more of force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors (linear and/or rotational position sensors), motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras. The sensors 208 may be disposed on the base 12 at various locations such as the body 14 and/or the legs 16, and are configured to provide corresponding base sensor data to the base controller 202 for monitoring and controlling operation of the robot 10 within the environment 2. In some examples, the base controller 202 is configured to receive base sensor data from sensors 208 physically separated from the robot 10. For instance, the base controller 202 may receive sensor data from a proximity sensor disposed on a target object near the robot 10, or from a remote sensor within the environment of the robot 10.
- The base sensor data from the sensors 208 may allow the base controller 202 to evaluate conditions for maneuvering the robot 10, altering a pose of the base 12, and/or actuating various actuators 206 for moving/rotating mechanical components such as one of the legs 16. In some examples, the base control system 200 employs one or more force sensors to measure load on the actuators 206 that move the base 12. The sensors 208 may further include position sensors to sense states of extension, retraction, and/or rotation of the body 14 and/or the legs 16.
- The inertial measurement unit 210 is configured to measure an inertial measurement indicative of a movement of the robot 10 that results in a change to the pose P12 of the robot base 12. The inertial measurement measured by the inertial measurement unit 210 may indicate a translation or shift of the center of mass of the robot 10. The translation or shift of the center of mass may occur along one or more of the fore-aft axis (x-axis), the lateral axis (y-axis), or the vertical axis (z-axis). For instance, the inertial measurement unit 210 may detect and measure an acceleration, a tilt, a roll, a pitch, a rotation, or a yaw of the robot 10, as the inertial measurement, using an initial pose as an inertial reference frame.
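- One common way such a unit detects a change in base pose is by comparing the measured gravity vector against the inertial reference frame. The sketch below is illustrative only; the disclosure does not specify how the inertial measurement is computed:

```python
import math

def tilt_from_accelerometer(ax: float, ay: float, az: float):
    """Estimate roll and pitch (rad) from a gravity-referenced
    accelerometer reading (m/s^2). Yaw is unobservable from
    gravity alone."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# A level base reads roughly (0, 0, 9.81); tilting shifts the split.
print(tilt_from_accelerometer(0.0, 1.7, 9.66))
```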
- In some implementations, the base control system 200 includes one or more power sources 212 configured to power various components of the robot 10. The power sources 212 employed by the robot 10 may include, without limitation, a hydraulic system, an electrical system, energy storage device(s) (e.g., batteries), and/or pneumatic devices. For instance, one or more energy storage devices may provide power to various components (e.g., actuators 206) of the base 12. In some examples, the body 14 defines a compartment for storing and retaining energy storage devices. The energy storage devices may be chargeable via wired connections or wireless (e.g., induction) connections to an external power source. Energy storage devices could also be charged using solar energy (e.g., generated via solar panels disposed on the robot 10). In some examples, the energy storage devices are removable so that depleted energy storage devices can be replaced with fully-charged energy storage devices. Gasoline engines could also be employed. A hydraulic system may employ hydraulic motors and cylinders for transmitting pressurized fluid for operating various components of the robot 10.
- In the example shown, the robot 10 includes an arm control system 100 connected to the arm 30 and operating independently of the base control system 200. In the example shown, the arm control system 100 includes an arm controller 102 (e.g., data processing hardware), memory hardware 104, actuators 106, and one or more sensors 108. The memory hardware 104, actuators 106, and sensors 108 may include similar components and configurations as those described above with respect to the memory hardware 104, actuators 206, and sensors 208 of the base control system 200.
- Optionally, one or more of the memory hardware 104, actuators 106, 206, and sensors 108, 208 may be shared between the control systems 100, 200. In some implementations, portions of the base controller 202 and the arm controller 102 execute on a remote device 60 in communication with the robot 10. Optionally, the remote device 60 may provide commands 62 to the robot 10 to move/control the base 12 and/or the articulated arm 30 for performing a task.
- The sensor(s) 108 of the arm control system 100 may include, without limitation, one or more of force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors (linear and/or rotational position sensors), motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras. The sensors 108 may be disposed on the arm 30 at various locations such as the arm portions 32, 34, 36 and/or the joints 40, 42, 44, and are configured to provide corresponding arm sensor data 109 to the arm controller 102 and/or the base controller 202 for monitoring and controlling operation of the robot 10 within the environment 2. In some examples, the arm controller 102 is configured to receive the arm sensor data 109 from sensors 108 physically separated from the robot 10. For instance, the arm controller 102 may receive arm sensor data 109 from a proximity sensor disposed on a target object near the robot 10, or from a remote sensor within the environment of the robot 10.
- The arm controller 102 of the robot 10 controls moving the articulated arm 30 between arm poses P30 within the arm workspace 4. For instance, the articulated arm 30 may need to move from a start pose P30 to a target pose P30 when the robot 10 is executing the task request 62. For instance, in a scenario where the robot 10 needs to open a door while navigating an environment, the robot arm controller 102 will need to move the articulated arm 30 from a first arm pose P30 where the door is in a closed position to a second arm pose P30 where the door is in an open position.
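- A start-to-target motion of this kind can be pictured as a sequence of intermediate arm poses. The sketch below uses simple linear interpolation of pose coordinates; a real trajectory generator would also respect joint limits and dynamics, and the waypoint values are invented for illustration:

```python
def interpolate_poses(start, target, steps: int):
    """Yield intermediate poses between a start pose and a target
    pose by linearly interpolating each coordinate."""
    for i in range(1, steps + 1):
        t = i / steps
        yield tuple(s + t * (g - s) for s, g in zip(start, target))

# Sweep the end effector through five waypoints, e.g. while opening a door.
for waypoint in interpolate_poses((0.5, 0.0, 0.9), (0.5, 0.6, 0.9), 5):
    print(waypoint)
```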
- Movements and poses of the robot 10 and robot appendages 16, 30 may be defined in terms of a robot environment 2 based on a Cartesian coordinate system. In the examples of the robot 10 provided in FIGS. 1A-1D, the robot environment 2 may be defined by six dimensions including the translational axes x, y, z and the rotational axes Θx, Θy, Θz. Referring to FIGS. 1B and 1D, the pose P30 of the arm 30 can be selectively reconfigured to change positions and orientations of the end effector 36 within a workspace 4 of the articulated arm 30. The workspace 4 of the end effector 36 is provided relative to the base 12 of the robot 10. In other words, the end effector 36 of the articulated arm 30 can reach any position within the workspace 4 without adjusting a pose P12 of the robot base 12. However, in some scenarios, the robot 10 may need to perform tasks that are outside of the arm workspace 4. For example, the robot 10 may need to move a box 6b or apply a mark 6a that falls outside of the arm workspace 4. In these scenarios, the robot 10 must coordinate movements of the arm 30 and the base 12 to complete the task, whereby the base 12 of the robot 10 must move within the environment 2 to allow the articulated arm 30 to reach the location of the task 6a, 6b.
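- The decision that triggers this coordination is a reachability test: is the task location inside the workspace 4 at the current base configuration? A deliberately coarse sketch follows; the true workspace is the reachable set of the end effector rather than a sphere, and the reach value is assumed:

```python
import math

def target_in_workspace(target, base_position, reach: float = 1.0) -> bool:
    """Approximate the arm workspace as a sphere of radius `reach`
    centered on the base and test whether the target falls inside."""
    return math.dist(target, base_position) <= reach

# A task 1.5 m away is outside a 1.0 m reach, so the base must move.
print(target_in_workspace((1.5, 0.0, 0.2), (0.0, 0.0, 0.0)))
```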
- With reference to FIG. 2A, the arm controller 102 includes a task manager 110 having a task interpreter 120 configured to receive or obtain task requests 62 from the remote device 60 and to generate translated task requests 122 executable by the arm controller 102 to move the robot arm 30 and/or base 12. The task manager 110 further includes a task instructor 130 including an arm instructor 140 and a base instructor 150 configured to generate unique task instructions 142 for the arm 30 and parameters 152 for the base 12 using the translated task request 122. Thus, as described later, the task manager 110 generally receives a task request 62 and generates a first set of task instructions 142 to be executed by the arm 30 within the arm workspace 4 and a second set of parameters 152 to be executed by the base 12 for moving the workspace 4 within the robot environment 2.
- FIG. 2B shows an example operation of the task interpreter 120 generating the translated task request 122 based on the task request 62 received or obtained from the remote device 60. In some examples, task requests 62 may be autonomously generated by the remote device 60 based on a program. Additionally or alternatively, a user may engage with a user interface 64 in communication with the remote device 60 to select task characteristics 68, 68a, 68b for the task request 62. For example, the user interface 64 may include one or more buttons 66a for selecting task parameters (e.g., speed, force, direction, etc.). The user interface 64 also includes a task location window 66b for identifying the location of a path 6a or object 6b associated with the task request 62. The task location window 66b may be based on the image data from the camera 52 of the robot 10. Thus, a user can select a location within the field of view 56 of the robot 10. As shown, the task location window 66b may present a graphical representation of the arm workspace 4 relative to the field of view 56 of the robot 10, allowing an operator to visualize the position of the task 6a, 6b relative to the current position of the workspace 4. For example, the task location window 66b shows a path marking 6a and a box 6b that are positioned outside of the current workspace 4 of the articulated arm 30.
- While presenting the user interface 64 with gesture-based buttons 66a and selection windows 66b simplifies user control of the robot arm 30 by providing an intuitive interface, the task request 62 generated by the remote device 60 may not be directly executable by the robot 10. Accordingly, the task interpreter 120 of the task manager 110 receives the task request 62 and translates the task characteristics 66 into translational and/or rotational coordinates based on the robot environment 2. The translated task request 122 is then sent to the task instructor 130.
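- A sketch of this translation step follows. The request payload and the `pixel_to_world` back-projection are stand-ins invented for illustration; the disclosure only states that task characteristics are translated into translational and/or rotational coordinates in the robot environment 2:

```python
def translate_task_request(request, pixel_to_world):
    """Convert a user-facing task request (button selections plus a
    pixel picked in the task location window) into environment
    coordinates executable by the arm controller."""
    u, v = request["pixel"]
    return {"target": pixel_to_world(u, v),
            "speed": request.get("speed", 0.1),
            "force": request.get("force", 10.0)}

# Toy back-projection: a flat ground plane two meters ahead of the camera.
print(translate_task_request(
    {"pixel": (320, 400), "speed": 0.2},
    lambda u, v: (2.0, (u - 320) / 200.0, 0.0)))
```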
- The task instructor 130 includes the arm instructor 140 and the base instructor 150. The task instructor 130 receives the translated task request 122 from the task interpreter 120, and the arm instructor 140 and base instructor 150 cooperate to generate respective arm instructions 142 and base parameters 152 for performing the task 6a, 6b. The task instructions 142 and parameters 152 are generated from the perspective of the end effector 36 and the current location of the workspace 4. Thus, where the arm controller 102 determines that the location of the task 6a, 6b is not within the current location of the workspace 4, the arm controller 102 determines the necessary base parameters 152 for repositioning the workspace 4 at the location of the task 6a, 6b and generates arm instructions 142 for performing the task within the new workspace 4 location.
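- The split between arm instructions and base parameters can be sketched as follows. The reach value and the move-up-to-half-a-reach heuristic are assumptions made for illustration, not taken from the disclosure:

```python
import math

ARM_REACH = 1.0  # assumed workspace radius (m)

def split_task(target, base_position):
    """If the target is reachable, emit arm instructions only;
    otherwise also emit base parameters that reposition the
    workspace so the target falls within reach."""
    arm_instructions = {"end_effector_target": target}
    dist = math.dist(target, base_position)
    if dist <= ARM_REACH:
        return arm_instructions, None
    # Move the base toward the target, stopping half a reach short so
    # the target lands comfortably inside the repositioned workspace.
    t = 1.0 - 0.5 * ARM_REACH / dist
    base_parameters = {"position": tuple(
        b + t * (g - b) for b, g in zip(base_position, target))}
    return arm_instructions, base_parameters

print(split_task((2.0, 0.5, 0.0), (0.0, 0.0, 0.0)))
```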
- The arm instructor 140 may be described as generating micro-level arm instructions 142 for positioning the end effector 36 within the arm workspace 4. For example, where a task 6a, 6b is located within the arm workspace 4, the arm instructor 140 generates arm instructions 142 including micro-position coordinates 142a (x, y, z), orientation coordinates 142b (Θx, Θy, Θz), and end effector commands 142c (e.g., actuate, deactuate). Conversely, the base instructor 150 may be described as generating macro-level base parameters 152 for controlling locomotion of the robot 10 to move the arm workspace 4 within the environment 2. For example, where a task 6a, 6b is located outside of the arm workspace 4 at the current location L12 or pose P12, FIG. 2A shows the base instructor 150 generating base parameters 152 including macro-position coordinates 152a (x, y, z) and balance parameters 152b for the robot base 12. The macro-position coordinates 152a may be associated with a location of a center of mass of the robot base 12 or with the location of the first joint 40 of the robot arm 30.
- When the task instructor 130 determines that the task 6a, 6b is positioned outside of the arm workspace 4 at the current location L12 or pose P12 and that macro-level positioning is needed from the robot base 12, the base instructor 150 sends the base parameters 152 to the base controller 202. Using the base parameters 152, the base controller 202 may then adjust a location L12 or pose P12 of the robot base 12 to move the arm workspace 4 to the task 6a, 6b. In addition to using macro-positioning for expanding the effective workspace 4 of the articulated arm 30, the base parameters 152 may include balance parameters 152b identifying arm forces anticipated by the task instructor 130 in response to the arm instructions 142. For example, where the arm instructions 142 include providing relatively high forces at the end effector 36 and/or changing the center of mass for the robot 10 by extending the articulated arm 30, the balance parameters 152b may indicate the direction and magnitude of the arm force for consideration and compensation (i.e., counter-balancing) by the robot base 12.
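- A minimal sketch of such anticipatory balance parameters follows. The lean gain and the center-of-mass shift model are illustrative assumptions; the disclosure specifies only that the direction and magnitude of the anticipated arm force are conveyed:

```python
def balance_parameters(anticipated_force, lean_gain: float = 0.02):
    """Given the force (N) the arm is expected to apply, shift the
    base center of mass opposite the horizontal components so the
    reaction does not destabilize the robot."""
    fx, fy, fz = anticipated_force
    return {"anticipated_force": anticipated_force,
            "com_shift": (-lean_gain * fx, -lean_gain * fy)}

# Expecting the arm to push forward with ~80 N, lean the base back first.
print(balance_parameters((80.0, 0.0, 0.0)))
```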
- At the base controller 202, the base parameters 152 are evaluated by a base location manager 220 and a base posture manager 230 to generate base instructions 254 for moving and/or configuring the robot base 12 according to the base parameters 152. For example, where the base parameters 152 include macro-position coordinates 152a, the base location manager 220 will generate base instructions 254a for moving the location L12 or pose P12 of the robot base 12 to move the robot workspace 4. Additionally or alternatively, the base posture manager 230 will generate base instructions 254b for changing the pose P12 of the robot base 12 to counteract forces applied by or to the end effector 36 based on the balance parameters 152b generated by the task instructor 130.
- Additionally or alternatively, the base controller 202 receives arm sensor data 109 from the arm sensors 108. The base controller 202 evaluates the arm sensor data 109 from the arm sensors 108 to generate or modify the base instructions 254 for moving and/or configuring the robot base 12. Thus, in addition to generating anticipatory base instructions 254 based on the base parameters 152 generated by the arm controller 102, the base controller 202 may also generate responsive base instructions 254 to move the base 12 to a responsive base configuration to account for actual forces measured at the robot arm 30. For example, where the initial base parameters 152 correspond to arm instructions 142 associated with opening a door, the base instructions 254 may anticipatorily instruct the base 12 to adjust the posture of the base 12 to an anticipatory base configuration to counteract the expected force to be applied by the arm 30 to open the door. Additionally, the base controller 202 may evaluate the arm sensor data 109 received from the arm sensors 108 during the task to further adjust the location or posture of the base 12. Thus, the base controller 202 can modify or tune the base instructions 254 to account for variables (e.g., different forces) not anticipated by the base instructor 150 when generating the base parameters 152.
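- The anticipatory-then-responsive behavior can be sketched as a trim on top of the planned compensation: the base first leans against the anticipated force, then corrects for the residual between the measured and anticipated force reported by the arm sensors. Gains and representation are assumed for illustration:

```python
def responsive_trim(anticipated_force, measured_force, gain: float = 0.02):
    """Compute a posture trim from the residual between the force the
    arm actually measured and the force the arm instructor anticipated."""
    residual = tuple(m - a for m, a in zip(measured_force, anticipated_force))
    return {"residual_force": residual,
            "com_trim": tuple(-gain * r for r in residual[:2])}

# The door was stickier than planned: 95 N measured against 80 N anticipated.
print(responsive_trim((80.0, 0.0, 0.0), (95.0, 0.0, 0.0)))
```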
- As described, the robot 10 of the present disclosure takes advantage of discrete control systems 100, 200 to coordinate movements of the robot 10 associated with performing a task using the arm 30. Unlike conventional robotic systems, which rely on a computationally-intensive centralized controller for coordinating arm and base movements, the robot 10 of the present disclosure manages the robot 10 from the perspective of the arm controller 102. Thus, the arm controller 102 determines the parameters for executing the task and then segregates the task into arm instructions 142 and base parameters 152. The base controller 202 then evaluates the base parameters 152 and determines appropriate base instructions 254 for complying with the base parameters 152 provided by the arm controller 102, thereby minimizing the computational load on the base controller 202.
- Additionally, providing arm sensor data 109 from the arm sensors 108 to the base controller 202 allows the base controller 202 to intelligently adjust the location or posture of the base 12 to account for measured forces applied to or by the arm 30. In contrast, conventional robotic systems only generate responsive actions at a base using the base sensor data received from the base sensors. Thus, when a force is applied to an arm of a conventional robotic system, the base is not informed of the magnitude or direction of the applied force by the arm sensors and only observes the forces from the perspective of the sensors of the base. Accordingly, the base can only react in response to base sensor data obtained from sensors within the base (e.g., leg sensors, IMU) to address instability caused by the force applied at the arm. Thus, the configuration of the present disclosure allows the robot base 12 and the articulated arm 30 to be provided as modular components and minimizes computational requirements for the base controller 202.
- With reference to FIG. 1B, a first example of the robot 10a including the arm controller 102 and the base controller 202 is shown. In this example, the user has generated a task request 62 with the user interface 64 (FIG. 2B) for marking a continuous path 6a along the ground surface of the robot environment 2 using the end effector 36. For instance, the end effector 36 may grasp or include a marking device (e.g., paint, chalk) that can be moved along the ground surface to apply the mark. As shown in FIGS. 1B and 2B, a portion of the mark 6a is located outside of the arm workspace 4 associated with the current location L12 and/or pose P12 of the robot base 12. Thus, to follow the entire path associated with the mark 6a, the robot 10 must change locations L12 and/or reconfigure poses P12 to move the workspace 4. For example, the robot base 12 must travel along the path of the mark 6a and lower the body 14 so that the arm 30 can reach the ground surface. To move the workspace 4, the task instructor 130 of the arm controller 102 generates, via the base instructor 150, base parameters 152 indicating the macro-position coordinates 152a and balance parameters 152b that must be accommodated by the base 12 to position and move the workspace 4 of the arm 30 along the path 6a. Using the base parameters 152, the base controller 202 instructs the robot base 12 to move along the path 6a.
- Referring to FIGS. 1D and 2B, in another example the robot 10b includes the arm control system 100 and the base control system 200. In this example, the arm controller 102 receives a task request 62 from the remote device 60 for moving an object 6b (e.g., a box) that is located outside of the current location of the robot workspace 4. In the present example, the arm instructor 140 of the arm controller 102 generates arm instructions 142 for moving the arm 30b within the workspace 4 while the base instructor 150 of the arm controller 102 sends base parameters 152 to the base controller 202 for repositioning the workspace 4 within the environment 2. Thus, the base controller 202 may evaluate the base parameters 152 received from the arm controller 102 and determine that the location L12 and/or the pose P12 of the base 12 need to be adjusted to allow the articulated arm 30 to reach the object 6b or to move the object 6b to a new location. Additionally or alternatively, the arm controller 102 may receive task requests 62 for moving objects 6b that are within the arm workspace 4 to a location outside of the arm workspace 4.
- FIG. 3 is a flowchart of an example arrangement of operations for a method 300 for coordinating robot base 12 and arm 30 tasks using an arm controller 102 and a base controller 202. The method 300 may be a computer-implemented method executed by data processing hardware of the articulated arm 30, which causes the data processing hardware to perform operations. At operation 302, the method 300 includes determining a first location of a workspace of the articulated arm associated with a current configuration of the base of the robot. The method 300 further includes, at operation 304, receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. At operation 306, the method 300 includes generating base parameters associated with the task request. At operation 308, the method 300 further includes instructing the base of the robot to move the workspace of the articulated arm from the first location to a second location using the generated base parameters.
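- The four operations of the method 300 can be read as the following pure-data sketch. The workspace model and parameter representation are assumed for illustration; the flowchart defines the operations, not this code:

```python
import math

def method_300(base_position, arm_reach, task_target):
    # Operation 302: the workspace location is tied to the current base
    # configuration; approximate its first location by the base position.
    workspace_center = base_position
    # Operation 304: check whether the received task lies outside the
    # workspace at this first location.
    if math.dist(task_target, workspace_center) <= arm_reach:
        return base_position  # already reachable; no base motion needed
    # Operation 306: generate base parameters associated with the task.
    base_parameters = {"position": task_target}
    # Operation 308: instruct the base to move the workspace to the
    # second location using the generated base parameters.
    return base_parameters["position"]

print(method_300((0.0, 0.0, 0.0), 1.0, (2.0, 1.0, 0.0)))
```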
- FIG. 4 is a schematic view of an example computing device 400 that may be used to implement the systems and methods described in this document. The computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
- The computing device 400 includes a processor 410, memory 420, a storage device 430, a high-speed interface/controller 440 connecting to the memory 420 and high-speed expansion ports 450, and a low-speed interface/controller 460 connecting to a low-speed bus 470 and a storage device 430. Each of the components 410, 420, 430, 440, 450, and 460 is interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 410 can process instructions for execution within the computing device 400, including instructions stored in the memory 420 or on the storage device 430 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as display 480 coupled to high-speed interface 440. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
- The memory 420 stores information non-transitorily within the computing device 400. The memory 420 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 420 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the computing device 400. Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM)/programmable read-only memory (PROM)/erasable programmable read-only memory (EPROM)/electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM), as well as disks or tapes.
- The storage device 430 is capable of providing mass storage for the computing device 400. In some implementations, the storage device 430 is a computer-readable medium. In various different implementations, the storage device 430 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 420, the storage device 430, or memory on processor 410.
- The high-speed controller 440 manages bandwidth-intensive operations for the computing device 400, while the low-speed controller 460 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only. In some implementations, the high-speed controller 440 is coupled to the memory 420, the display 480 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 450, which may accept various expansion cards (not shown). In some implementations, the low-speed controller 460 is coupled to the storage device 430 and a low-speed expansion port 490. The low-speed expansion port 490, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
- The computing device 400 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 400a or multiple times in a group of such servers 400a, as a laptop computer 400b, or as part of a rack server system 400c.
- Various implementations of the systems and techniques described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, non-transitory computer readable medium, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
- The processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, one or more aspects of the disclosure can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user and optionally a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
- A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.
Claims (20)
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/318,435 US11931898B2 (en) | 2020-12-22 | 2021-05-12 | Arm and body coordination |
| PCT/US2021/064196 WO2022140199A1 (en) | 2020-12-22 | 2021-12-17 | Arm and body coordination |
| EP21847583.8A EP4267353A1 (en) | 2020-12-22 | 2021-12-17 | Arm and body coordination |
| KR1020237024747A KR102867186B1 (en) | 2020-12-22 | 2021-12-17 | Arm and body adjustments |
| CN202180091392.5A CN116723916A (en) | 2020-12-22 | 2021-12-17 | Arm and body coordination |
| US18/443,180 US12440970B2 (en) | 2020-12-22 | 2024-02-15 | Arm and body coordination |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202063129398P | 2020-12-22 | 2020-12-22 | |
| US17/318,435 US11931898B2 (en) | 2020-12-22 | 2021-05-12 | Arm and body coordination |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/443,180 Continuation US12440970B2 (en) | 2020-12-22 | 2024-02-15 | Arm and body coordination |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220193900A1 (en) | 2022-06-23 |
| US11931898B2 (en) | 2024-03-19 |
Family
ID=82023951
Family Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/318,435 Active 2042-03-01 US11931898B2 (en) | 2020-12-22 | 2021-05-12 | Arm and body coordination |
| US18/443,180 Active US12440970B2 (en) | 2020-12-22 | 2024-02-15 | Arm and body coordination |
Family Applications After (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/443,180 Active US12440970B2 (en) | 2020-12-22 | 2024-02-15 | Arm and body coordination |
Country Status (4)
| Country | Link |
|---|---|
| US (2) | US11931898B2 (en), US12440970B2 (en) |
| EP (1) | EP4267353A1 (en) |
| KR (1) | KR102867186B1 (en) |
| CN (1) | CN116723916A (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11999059B2 (en) | 2020-12-18 | 2024-06-04 | Boston Dynamics, Inc. | Limiting arm forces and torques |
| US20240416517A1 (en) * | 2021-04-14 | 2024-12-19 | Bae Systems Plc | Robotic cells |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11931898B2 (en) * | 2020-12-22 | 2024-03-19 | Boston Dynamics, Inc. | Arm and body coordination |
| US12321672B2 (en) * | 2021-07-14 | 2025-06-03 | Dassault Systèmes Americas Corp. | Environment-aware prepositioning of digital models in an environment |
Family Cites Families (28)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4066922B2 (en) | 2003-09-19 | 2008-03-26 | 株式会社デンソーウェーブ | Robot stop location determination method and robot stop location determination apparatus |
| JP4305323B2 (en) | 2004-08-11 | 2009-07-29 | ソニー株式会社 | Robot apparatus motion control device and motion control method |
| US7313463B2 (en) | 2005-03-31 | 2007-12-25 | Massachusetts Institute Of Technology | Biomimetic motion and balance controllers for use in prosthetics, orthotics and robotics |
| JP4456560B2 (en) | 2005-12-12 | 2010-04-28 | 本田技研工業株式会社 | Legged mobile robot control device, legged mobile robot, and legged mobile robot control method |
| JP4930003B2 (en) | 2006-11-20 | 2012-05-09 | 株式会社日立製作所 | Mobile robot |
| KR20090124560A (en) | 2008-05-30 | 2009-12-03 | 삼성전자주식회사 | Control device of robot manipulator and its control method |
| JP5836565B2 (en) | 2009-03-24 | 2015-12-24 | ディズニー エンタープライゼス インコーポレイテッド | Robot tracking and balancing system and method for mimicking motion capture data |
| US8942848B2 (en) | 2011-07-06 | 2015-01-27 | Florida Institute for Human and Machine Cognition | Humanoid robot that can dynamically walk with limited available footholds in the presence of disturbances |
| FR2978844B1 (en) | 2011-08-04 | 2014-03-21 | Aldebaran Robotics | ROBOT WITH ARTICULATIONS OF VARIABLE RIGIDITY AND METHOD OF CALCULATING SAID OPTIMIZED RIGIDITY |
| KR20130033920A (en) | 2011-09-27 | 2013-04-04 | 주식회사 유진로봇 | Apparatus of service robot having arm assembly |
| JP5286457B1 (en) * | 2011-12-28 | 2013-09-11 | パナソニック株式会社 | Robot arm |
| US9031691B2 (en) | 2013-03-04 | 2015-05-12 | Disney Enterprises, Inc. | Systemic derivation of simplified dynamics for humanoid robots |
| US9120227B2 (en) | 2013-08-15 | 2015-09-01 | Disney Enterprises, Inc. | Human motion tracking control with strict contact force constraints for floating-base humanoid robots |
| US9579796B2 (en) | 2013-09-25 | 2017-02-28 | Disney Enterprises, Inc. | Automatic task-specific model reduction for humanoid robots |
| US9314934B2 (en) | 2014-02-27 | 2016-04-19 | Disney Enterprises, Inc. | Gravity-counterbalanced robot arm |
| SG11201607059UA (en) | 2014-03-04 | 2016-09-29 | Universal Robots As | Safety system for industrial robot |
| JP6075343B2 (en) | 2014-09-02 | 2017-02-08 | トヨタ自動車株式会社 | Traveling robot, operation planning method thereof, and program |
| US9969082B1 (en) * | 2016-01-05 | 2018-05-15 | Boston Dynamics, Inc. | Robotic systems and methods for task scoring and selection |
| DE102016000187B3 (en) | 2016-01-11 | 2017-01-26 | Kuka Roboter Gmbh | Determining an orientation of a robot relative to a gravitational direction |
| US11370117B2 (en) | 2017-05-29 | 2022-06-28 | Franka Emika Gmbh | Collision handling by a robot |
| WO2020017370A1 (en) | 2018-07-17 | 2020-01-23 | ソニー株式会社 | Control device, control method, and control system |
| US11407118B1 (en) * | 2018-12-10 | 2022-08-09 | Joseph E Augenbraun | Robot for performing dextrous tasks and related methods and systems |
| US20200306998A1 (en) * | 2019-03-25 | 2020-10-01 | Boston Dynamics, Inc. | Multi-Body Controller |
| DE102019134665B3 (en) | 2019-12-17 | 2020-12-10 | Franka Emika Gmbh | Calibrating a virtual force sensor of a robot manipulator |
| WO2022133016A1 (en) | 2020-12-18 | 2022-06-23 | Boston Dynamics, Inc. | Limiting arm forces and torques |
| US11999059B2 (en) | 2020-12-18 | 2024-06-04 | Boston Dynamics, Inc. | Limiting arm forces and torques |
| US11931898B2 (en) * | 2020-12-22 | 2024-03-19 | Boston Dynamics, Inc. | Arm and body coordination |
| WO2022140199A1 (en) | 2020-12-22 | 2022-06-30 | Boston Dynamics, Inc. | Arm and body coordination |
Patent Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5550953A (en) * | 1994-04-20 | 1996-08-27 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | On-line method and apparatus for coordinated mobility and manipulation of mobile robots |
| US20060129278A1 (en) * | 2004-12-14 | 2006-06-15 | Honda Motor Co., Ltd. | Legged mobile robot control system |
| US20090173560A1 (en) * | 2006-02-02 | 2009-07-09 | Kabushiki Kaisha Yaskawa Denki | Robot system |
| US20120130540A2 (en) * | 2008-05-21 | 2012-05-24 | Georgia Tech Research Corporation | Force balancing mobile robot and robotic system |
| US20100143089A1 (en) * | 2008-12-10 | 2010-06-10 | Southwest Research Institute | System For Autonomously Dispensing Media On Large Scale Surfaces |
| US20160288324A1 (en) * | 2013-03-15 | 2016-10-06 | Industrial Perception, Inc. | Moveable Apparatuses Having Robotic Manipulators and Conveyors To Facilitate Object Movement |
| US9440353B1 (en) * | 2014-12-29 | 2016-09-13 | Google Inc. | Offline determination of robot behavior |
| US20190134821A1 (en) * | 2015-05-01 | 2019-05-09 | Ge Global Sourcing Llc | Integrated robotic system and method for autonomous vehicle maintenance |
| US9987745B1 (en) * | 2016-04-01 | 2018-06-05 | Boston Dynamics, Inc. | Execution of robotic tasks |
| US10493617B1 (en) * | 2016-10-21 | 2019-12-03 | X Development Llc | Robot control |
| US11203120B1 (en) * | 2019-02-06 | 2021-12-21 | Intrinsic Innovation Llc | Mobile robotics frame system |
| US20200368911A1 (en) * | 2019-05-24 | 2020-11-26 | Seiko Epson Corporation | Method Of Controlling Robot |
Non-Patent Citations (1)
| Title |
|---|
| Rehman, Bilal, et al., "Towards a Multi-legged Mobile Manipulator," May 2016, 2016 IEEE International Conference on Robotics and Automation (ICRA), pp.3618-3624 (Year: 2016) * |
Also Published As
| Publication number | Publication date |
|---|---|
| US12440970B2 (en) | 2025-10-14 |
| US20240189999A1 (en) | 2024-06-13 |
| KR102867186B1 (en) | 2025-09-30 |
| US11931898B2 (en) | 2024-03-19 |
| EP4267353A1 (en) | 2023-11-01 |
| CN116723916A (en) | 2023-09-08 |
| KR20230124657A (en) | 2023-08-25 |
Similar Documents
| Publication | Title |
|---|---|
| US12440970B2 (en) | Arm and body coordination |
| US12321178B2 (en) | Semantic models for robot autonomy on dynamic sites |
| US9862090B2 (en) | Surrogate: a body-dexterous mobile manipulation robot with a tracked base |
| JP5114019B2 (en) | Method for controlling the trajectory of an effector |
| US10759051B2 (en) | Architecture and methods for robotic mobile manipulation system |
| US10802508B2 (en) | Mobile robot |
| US20230286161A1 (en) | Systems and Methods for Robotic Manipulation Using Extended Reality |
| US12172321B2 (en) | Work mode and travel mode for mobile robots |
| US12059814B2 (en) | Object-based robot control |
| US12064879B2 (en) | Global arm path planning with roadmaps and precomputed domains |
| US20250269522A1 (en) | Constrained manipulation of objects |
| US11999059B2 (en) | Limiting arm forces and torques |
| Pepe et al. | A hybrid teleoperation control scheme for a single-arm mobile manipulator with omnidirectional wheels |
| WO2022140199A1 (en) | Arm and body coordination |
| Zieliński et al. | Agent-based structures of robot systems |
| EP1795315A1 (en) | Hand-held control device for an industrial robot |
| US20240269838A1 (en) | Limiting arm forces and torques |
| Sieusankar et al. | A review of current techniques for robotic arm manipulation and mobile navigation |
| Anderson et al. | Coordinated control and range imaging for mobile manipulation |
| US20250196339A1 (en) | Automated constrained manipulation |
| Peer et al. | A MOBILE HAPTIC INTERFACE FOR BIMANUAL MANIPULATIONS IN EXTENDED REMOTE/VIRTUAL |
| Vijayan et al. | A hybrid approach in robot visual feedback |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: BOSTON DYNAMICS, INC., MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERARD, STEPHEN GEORGE;BARRY, ANDREW JAMES;SWILLING, BENJAMIN JOHN;AND OTHERS;REEL/FRAME:056217/0428 Effective date: 20210107 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| AS | Assignment |
Owner name: BOSTON DYNAMICS, INC., DELAWARE Free format text: CHANGE OF NAME;ASSIGNOR:BOSTON DYNAMICS, INC.;REEL/FRAME:057711/0202 Effective date: 20210614 |
|
| AS | Assignment |
Owner name: BOSTON DYNAMICS, INC., DELAWARE Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE APPLICATIONS NUMBERS 63127573 AND 11/302759 AND THE CITY OF THE ASSIGNEE PREVIOUSLY RECORDED AT REEL: 057111 FRAME: 0202. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:BOSTON DYNAMICS, INC.;REEL/FRAME:057964/0415 Effective date: 20210614 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |