US20230302632A1 - Information processing device, information processing method, and program - Google Patents
- Publication number
- US20230302632A1 (application US18/006,733)
- Authority
- US
- United States
- Prior art keywords
- task
- sensors
- information processing
- hand portion
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/0009—Gripping heads and other end effectors comprising multi-articulated fingers, e.g. resembling a human hand
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37425—Distance, range
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39487—Parallel jaws, two fingered hand
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39527—Workpiece detector, sensor mounted in, near hand, gripper
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39543—Recognize object and plan hand shapes in grasping movements
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40264—Human like, type robot arm
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40298—Manipulator on vehicle, wheels, mobile
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40564—Recognize shape, contour of object, extract position and orientation
Definitions
- the present technology relates to an information processing device, an information processing method, and a program, and more particularly, to an information processing device, an information processing method, and a program capable of appropriately controlling positioning of an operation object.
- Various tasks using a robot hand are typically implemented by controlling a robot hand and performing success/failure determination of the task on the basis of a sensor value acquired by a sensor.
- Patent Document 1 discloses a technique of detecting movement of a target object and a peripheral object on the basis of image information acquired by a vision sensor and force information acquired by a force sensor, and determining whether or not a robot is normally operating the target object.
- Patent Document 2 discloses a technique of moving a sensor unit to a position where measurement of an object is easy, and then moving a robot hand on the basis of the position and orientation of the object measured by the sensor unit to hold the object.
- the measurement result of the position and orientation of the object includes an error, so that the actual position and orientation of the object may deviate from the estimated position and orientation in a case where the robot hand is moved on the basis of one measurement result.
- merely moving the sensor unit to a position where measurement is easy does not allow the acquired information to be used for the success/failure determination of the task or for the control of the robot hand.
- moreover, when the robot hand is brought close to an object in order to hold it, the hand shields the object, so that the position and orientation of the object cannot be measured by the sensor unit.
- the present technology has been made in view of such situations, and an object thereof is to appropriately control positioning of an operation object.
- An information processing device is an information processing device including a control unit configured to control a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
- a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface is controlled on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
- FIG. 1 is a diagram illustrating an example of external appearance of a robot according to an embodiment of the present technology.
- FIG. 2 is an enlarged view of a hand portion.
- FIG. 3 is a view illustrating a state of measurement by distance sensors.
- FIG. 4 is a view illustrating a state of measurement by the distance sensors.
- FIG. 5 is a block diagram illustrating a hardware configuration example of the robot.
- FIG. 6 is a block diagram illustrating a functional configuration example of an information processing device.
- FIG. 7 is a flowchart for describing processing of the information processing device.
- FIG. 8 is a diagram illustrating a state at the time of execution of a task of holding a thin object.
- FIG. 9 is a diagram subsequent to FIG. 8 , illustrating a state at the time of execution of the task.
- FIG. 10 is a diagram illustrating a state at the time of failure of the task.
- FIG. 11 is a diagram illustrating a state at the time of failure of the task.
- FIG. 12 is a diagram illustrating an example of control by a positioning control unit.
- FIG. 13 is a diagram illustrating an example of a contactable area and a command value of a moving speed of the hand portion.
- FIG. 14 is a diagram illustrating a state at the time of success of a task of wiping a window using a cleaner.
- FIG. 15 is a diagram illustrating a state at the time of success of the task of wiping a window using the cleaner.
- FIG. 16 is a diagram illustrating an example of an interval and a command value of the moving speed of the hand portion.
- FIG. 17 is a diagram illustrating a state at the time of failure of the task of wiping a window.
- FIG. 18 is a diagram illustrating a state at the time of success of a task of cutting an object by operating a kitchen knife.
- FIG. 19 is a diagram illustrating a state at the time of failure of the task of cutting the object by operating the kitchen knife.
- FIG. 20 is a diagram illustrating a state at the time of failure of a task of placing a book between other books.
- FIG. 21 is a diagram illustrating a state of a task of positioning an operation object at a specific position of a desk.
- FIG. 22 is a diagram illustrating a configuration example of a system.
- FIG. 23 is a block diagram illustrating a hardware configuration example of a computer.
- FIG. 1 is a diagram illustrating an example of external appearance of a robot 1 according to an embodiment of the present technology.
- the robot 1 is a robot having a humanoid upper body and a moving mechanism using wheels.
- a flat sphere-shaped head portion 12 is provided above a body portion 11 .
- two visual sensors 12 A are provided to imitate human eyes.
- arm portions 13 - 1 and 13 - 2 each including a manipulator with multiple degrees of freedom are provided.
- Hand portions 14 - 1 and 14 - 2 that are end effectors are provided at distal ends of the arm portions 13 - 1 and 13 - 2 , respectively.
- the robot 1 has a function of holding an object with the hand portions 14 - 1 and 14 - 2 .
- in a case where it is not necessary to distinguish the arm portions 13-1 and 13-2, they are collectively referred to as arm portions 13 as appropriate. Furthermore, in a case where it is not necessary to distinguish the hand portions 14-1 and 14-2, they are collectively referred to as hand portions 14. A plurality of other components may be described collectively as appropriate.
- a carriage-type moving body portion 15 is provided at a lower end of the body portion 11 .
- the robot 1 can move by rotating the wheels provided on the left and right of the moving body portion 15 or changing the direction of the wheels.
- the robot 1 is a robot capable of executing various tasks such as holding an object by the hand portions 14 and carrying the object in a state of being held.
- a card C 1 is placed on a top plate of a desk D 1 in front of the robot 1 .
- the robot 1 executes a series of tasks of picking up the card C 1 while monitoring the execution status of the tasks by distance sensors provided on the hand portions 14 .
- the robot 1 may be configured not as a dual-arm robot as illustrated in FIG. 1 but as a single-arm robot with only one arm portion 13. Furthermore, the body portion 11 may be provided on leg portions instead of the carriage (moving body portion 15).
- FIG. 2 is an enlarged view of the hand portion 14 - 1 .
- the hand portion 14 - 1 is a gripper type holding portion with two fingers.
- a left finger 22 L and a right finger 22 R that are two finger portions 22 are attached to a base portion 21 having a cubic shape.
- the base portion 21 functions as a support portion that supports the plurality of finger portions 22 .
- the left finger 22 L is configured by connecting a plate-shaped portion 31 L and a plate-shaped portion 32 L that are plate-shaped members having a predetermined thickness.
- the plate-shaped portion 32 L is provided on the distal end side of the plate-shaped portion 31 L attached to the base portion 21 .
- a coupling portion between the base portion 21 and the plate-shaped portion 31 L and a coupling portion between the plate-shaped portion 31 L and the plate-shaped portion 32 L each have a predetermined movable range.
- a thin plate-shaped finger contact portion 33 L is provided on an inner side of the plate-shaped portion 32 L.
- the right finger 22 R has a configuration similar to that of the left finger 22 L. That is, a plate-shaped portion 32 R is provided on the distal end side of a plate-shaped portion 31 R attached to the base portion 21 . A coupling portion between the base portion 21 and the plate-shaped portion 31 R and a coupling portion between the plate-shaped portion 31 R and the plate-shaped portion 32 R each have a predetermined movable range. A thin plate-shaped finger contact portion 33 R (not illustrated) is provided on an inner side of the plate-shaped portion 32 R.
- the left finger 22 L and the right finger 22 R are opened and closed by moving the respective coupling portions.
- Various objects such as the card C 1 are held so as to be sandwiched between the inner side of the plate-shaped portion 32 L and the inner side of the plate-shaped portion 32 R.
- the inner surface of the plate-shaped portion 32 L provided with the finger contact portion 33 L and the inner surface of the plate-shaped portion 32 R provided with the finger contact portion 33 R serve as contact surfaces with an object when the object is held.
- a plurality of distance sensors capable of short distance measurement is provided on the surface of each member included in the hand portion 14 - 1 .
- the distance sensor is, for example, an optical sensor.
- on the base portion 21, which corresponds to the palm, nine distance sensors 41-0 are provided side by side vertically and horizontally.
- the distance sensors 41 - 0 are provided at predetermined intervals.
- distance sensors 41L-1, 41L-2, and 41L-3, each being a pair of two distance sensors, are provided on the inner side of the plate-shaped portion 32L, which is a contact surface with an object, in this order from the fingertip side.
- the two distance sensors constituting each of the distance sensors 41 L- 1 , 41 L- 2 , and 41 L- 3 are provided across the finger contact portion 33 L and the distance sensors 41 L- 1 , 41 L- 2 , and 41 L- 3 are provided along the edges of the plate-shaped portion 32 L.
- Distance sensors 41 L- 4 are provided on the side surfaces of the plate-shaped portion 32 L, and a distance sensor 41 L- 5 is provided on a semi-cylindrical surface serving as a fingertip.
- a distance sensor 41L-6 (not illustrated) is provided on the outer side of the plate-shaped portion 32L, similarly to the corresponding sensor on the right finger 22R side.
- on the inner side of the plate-shaped portion 31L, distance sensors 41L-7 and 41L-8 are provided side by side.
- a distance sensor 41L-9 is provided on the outer side of the plate-shaped portion 31L, similarly to the corresponding sensor on the right finger 22R side.
- distance sensors 41 R- 1 to 41 R- 9 are provided similarly to the left finger 22 L. That is, the distance sensors 41 R- 1 , 41 R- 2 , and 41 R- 3 (not illustrated) are provided on the inner side of the plate-shaped portion 32 R in this order from the fingertip side, and the distance sensors 41 R- 4 are provided on the side surfaces of the plate-shaped portion 32 R.
- the distance sensor 41 R- 5 is provided on the surface of the fingertip of the plate-shaped portion 32 R, and the distance sensor 41 R- 6 is provided on the outer side of the plate-shaped portion 32 R.
- the distance sensors 41 R- 7 and 41 R- 8 are provided on the inner side of the plate-shaped portion 31 R, and the distance sensor 41 R- 9 is provided on the outer side of the plate-shaped portion 31 R.
- distance sensors 41 L- 1 to 41 L- 9 and the distance sensors 41 R- 1 to 41 R- 9 are collectively referred to as distance sensors 41 as appropriate.
- the distance to each position of the object is measured by detecting the reflected light of the light beam emitted from each of the distance sensors 41 .
- in the drawings, the light beams emitted by the distance sensors 41 are illustrated in color.
- the distance sensors 41 - 0 , the distance sensors 41 L- 1 to 41 L- 3 , 41 L- 7 , and 41 L- 8 , and the distance sensors 41 R- 1 to 41 R- 3 , 41 R- 7 , and 41 R- 8 are used to measure the distance to each position of the object held by the hand portions 14 , and the like.
- the distance sensor 41 is provided on each part of the hand portion 14 - 1 , so that the distribution of the distances to the card C 1 as the operation object (the distance to each position of the card C 1 ) is measured in real time. Furthermore, the distribution of the distances to the desk D 1 included in the environment surrounding the card C 1 is measured in real time. The execution status of the task is monitored on the basis of the distance to each position of the card C 1 and the desk D 1 .
- the same components as the components of the hand portion 14 - 1 as described above are also provided in the hand portion 14 - 2 .
- although the hand portions 14 are two-finger type holding portions, a multi-finger type holding portion having a different number of finger portions, such as a three-finger type holding portion or a five-finger type holding portion, may be provided.
- the degree of freedom of the finger portions, the number of the distance sensors 41, and the arrangement of the distance sensors 41 can be set in any way.
- FIG. 5 is a block diagram illustrating a hardware configuration example of the robot 1 .
- the robot 1 is configured by connecting each component provided in the body portion 11 , the head portion 12 , the arm portion 13 , the hand portions 14 , and the moving body portion 15 to an information processing device 51 .
- the information processing device 51 includes a computer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a flash memory, and the like.
- the information processing device 51 is housed in, for example, the body portion 11 .
- the information processing device 51 executes a predetermined program by the CPU to control the overall operation of the robot 1 .
- the information processing device 51 recognizes the environment around the robot 1 on the basis of the detection result by the sensors, the images captured by the visual sensors, and the like, and executes a task according to the recognition result.
- Various sensors and cameras are provided in each of the body portion 11 , the head portion 12 , the arm portions 13 , the hand portions 14 , and the moving body portion 15 .
- the head portion 12 is provided with the visual sensors 12 A including RGB cameras or the like.
- the hand portions 14 are provided with the distance sensors 41 .
- FIG. 6 is a block diagram illustrating a functional configuration example of the information processing device 51 .
- the information processing device 51 includes an environment measurement unit 101 , a task determination unit 102 , a hand and finger initial position determination unit 103 , an initial position database 104 , an initial position movement control unit 105 , a target value calculation unit 106 , a geometric information estimation unit 107 , a positioning control unit 108 , a task success/failure condition calculation unit 109 , and a task success/failure determination unit 110 .
- At least a part of the functional units illustrated in FIG. 6 is implemented by executing a predetermined program by the CPU of the information processing device 51 .
- the environment measurement unit 101 performs three-dimensional measurement on an operation object and objects included in the environment surrounding the operation object on the basis of the output of the visual sensors 12 A. By performing the three-dimensional measurement, the position and shape of the operation object, the shape of the object included in the environment surrounding the operation object, and the like are calculated. The measurement result by the environment measurement unit 101 is output to the hand and finger initial position determination unit 103 .
- the task determination unit 102 determines a task to be executed and outputs information indicating the content of the task.
- the information indicating the content of the task includes information indicating what kind of information is used as the geometric information. As described later, the geometric information is information used for monitoring the execution status of the task as well as control of each unit.
- the information output from the task determination unit 102 is supplied to the hand and finger initial position determination unit 103 , the target value calculation unit 106 , and the geometric information estimation unit 107 .
- the hand and finger initial position determination unit 103 determines use sensors, which are the distance sensors 41 used for monitoring the execution status of the task, according to the task determined by the task determination unit 102.
- the distance sensors 41 suitable for monitoring the execution status of the task are determined as use sensors.
- the hand and finger initial position determination unit 103 calculates the initial positions of the distance sensors 41 determined as the use sensors on the basis of the measurement result by the environment measurement unit 101 and the content of the task determined by the task determination unit 102 .
- Candidates for the initial positions of the distance sensors 41 are set in advance for every type of content of the task.
- the information indicating the candidates of the initial position is stored in the initial position database 104 and read by the hand and finger initial position determination unit 103 as appropriate.
- Information indicating the initial positions of the distance sensors 41 calculated by the hand and finger initial position determination unit 103 is output to the initial position movement control unit 105.
- the initial position movement control unit 105 controls a drive unit 121 so that the distance sensors 41 are positioned at the initial positions calculated by the hand and finger initial position determination unit 103 .
- the drive unit 121 corresponds to drive portions of the robot 1 including the arm portions 13 , the hand portions 14 , and the moving body portion 15 .
- the target value calculation unit 106 calculates a target value of the geometric information on the basis of the information supplied from the task determination unit 102 , and outputs the target value to the positioning control unit 108 .
- the geometric information estimation unit 107 acquires distance distribution information measured by the distance sensors 41 .
- the distance distribution information indicates a distance to each position of an operation object and an object included in the environment.
- the geometric information estimation unit 107 estimates the geometric information determined by the task determination unit 102 on the basis of the distance distribution information, and outputs the estimated geometric information to the task determination unit 102 , the positioning control unit 108 , and the task success/failure determination unit 110 .
- the positioning control unit 108 performs positioning control by controlling the drive unit 121 so that the geometric information estimated by the geometric information estimation unit 107 reaches the target value supplied from the target value calculation unit 106 .
- the positioning control is control for moving the operation object to a predetermined position. Furthermore, the positioning control is also control for moving the arm portions 13 and the hand portions 14 so that the distance sensors 41 come to predetermined positions.
- the task success/failure condition calculation unit 109 determines a success condition and a failure condition of the task, and outputs information indicating the success condition and the failure condition to the task success/failure determination unit 110 .
- the task success/failure determination unit 110 determines success or failure of the task being executed according to whether or not the geometric information estimated by the geometric information estimation unit 107 satisfies the condition determined by the task success/failure condition calculation unit 109 . In a case where it is determined that the task has succeeded or the task has failed, the task success/failure determination unit 110 outputs a stop command to the positioning control unit 108 . The result of success/failure determination of the task is also supplied to other processing units such as the task determination unit 102 as appropriate.
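- As a rough illustration of how these units could interact, the following Python sketch wires hypothetical estimation, positioning, and success/failure steps into a single monitoring loop; all names and the loop structure are assumptions for illustration, not the implementation described in this disclosure.

```python
# Hypothetical sketch of the monitoring loop formed by the units of FIG. 6.
# Every name here is illustrative; only the overall flow follows the text above.

def run_task(task, distance_sensors, drive_unit,
             estimate_geometric_info,   # role of the geometric information estimation unit 107
             positioning_step,          # role of the positioning control unit 108
             success_condition,         # conditions from the task success/failure
             failure_condition):        # condition calculation unit 109
    """Repeat estimation -> success/failure check -> positioning until the task ends."""
    while True:
        distances = distance_sensors.read()              # distance distribution information
        geom = estimate_geometric_info(task, distances)  # e.g. contactable area S_est
        if failure_condition(geom, distances):           # task success/failure determination unit 110
            drive_unit.stop()
            return "failed"      # the caller may then set a return operation task
        if success_condition(geom):
            drive_unit.stop()
            return "succeeded"
        positioning_step(drive_unit, geom)               # move the hand portion toward the target
```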
- when a thin object such as the card C1 is held, a human often picks up the object by using a nail, or picks up the object after translating it to the front side of the desk D1.
- the latter operation, that is, translating the card C1 to the front side of the desk D1 and then picking it up, is implemented by the information processing device 51.
- in step S1, the task determination unit 102 determines a task to be executed.
- the task of holding the card C 1 includes a task of translating the card C 1 to the front side to allow the card C 1 to be sandwiched between the left finger 22 L and the right finger 22 R, and a task of bringing the left finger 22 L into contact with the lower surface of the card C 1 (task of sandwiching the card C 1 between the left finger 22 L and the right finger 22 R).
- the task determination unit 102 determines to first execute a task of translating the card C 1 to the front side.
- in step S2, the environment measurement unit 101 performs three-dimensional measurement of the environment on the basis of the output of the visual sensors 12A. By performing the three-dimensional measurement, the position and shape of the card C1, the shape of the desk D1, and the like are recognized.
- in step S3, the hand and finger initial position determination unit 103 calculates the initial positions of the hand portions 14 and the finger portions 22 on the basis of the content of the task and the measurement result by the environment measurement unit 101. Together with the positions, the respective orientations of the hand portions 14 and the finger portions 22 are also calculated.
- in step S4, the hand and finger initial position determination unit 103 determines a use sensor.
- the initial positions of the hand portions 14 and the finger portions 22 calculated in step S3 and the use sensors determined in step S4 are chosen so that the use sensors are at positions suitable for monitoring the execution status of the task.
- FIG. 8 is a diagram illustrating a state at the time of execution of a task of holding a thin object.
- initial positions are calculated such that the left finger 22L is positioned under the desk D1 and the right finger 22R is brought into contact with the upper surface of the card C1.
- the distance sensors 41 L- 1 to 41 L- 3 and the distance sensors 41 L- 7 and 41 L- 8 ( FIG. 2 ) provided on the inner side of the left finger 22 L are determined as use sensors.
- the initial positions illustrated in A of FIG. 8 are positions where the distance sensors 41 provided on the inner side of the left finger 22L are arranged side by side in parallel to the moving direction of the card C1.
- the horizontal right direction indicated by the white arrow is the moving direction of the hand portion 14 , that is, the moving direction of the card C 1 .
- Such an initial position of each portion is selected according to the task determined by the task determination unit 102 .
- Various methods can be used as a method of determining the initial positions in addition to programming in advance such that the determination according to the task is performed.
- initial positions or the like most suitable for detection of success or failure of a task may be determined using an inference model obtained by machine learning.
- for example, an inference model is generated by performing machine learning on time-series sensor data recorded at the time of task success and at the time of task failure while the task is executed with the hand portions 14 set to various positions.
- in step S5, the task determination unit 102 determines geometric information to be used for control of the drive unit 121.
- the geometric information is information indicating at least a part of the state of the object including a size (area), an inclination, a distance, and the like and is obtained on the basis of the distance measured by the distance sensors 41 during the movement of the hand portions 14 . Since the geometric information changes according to the positional relationship between the operation object and objects surrounding it, the geometric information is used for monitoring the execution status of the task as information indicating the positional relationship between the operation object and the objects surrounding it during execution of the task (during movement of the hand portion 14 ) together with the control of the drive unit 121 .
- the contactable area S, which is an area where the left finger 22L can be brought into contact with the card C1, is determined as the geometric information.
- the contactable area S est is expressed by Equation (1) below: S est = n × A (1)
- n represents the number of the distance sensors 41 that have measured sensor values within the effective distance range.
- A represents a footprint area of the distance sensors 41 .
- the effective distance range is a range of a distance larger (longer) than the distance from the distance sensor 41 (the distance sensor 41 on the left finger 22 L) to the desk D 1 and smaller (shorter) than the distance from the distance sensor 41 to the right finger 22 R, which is the finger portion 22 on the opposite side.
- the contactable area S est is an area, in the entire card C 1 , of a region protruding from the edge portion of the desk D 1 .
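- A minimal sketch of Equation (1) under these definitions: each use sensor reports a scalar distance, and a reading is counted only when it falls between the distance to the desk D1 and the distance to the opposite finger. The function and parameter names are assumptions.

```python
def contactable_area(readings, footprint_area, dist_to_desk, dist_to_opposite_finger):
    """Equation (1): S_est = n * A, where n is the number of distance sensors whose
    reading lies within the effective distance range (longer than the distance to
    the desk D1, shorter than the distance to the right finger 22R) and A is the
    footprint area of one distance sensor."""
    n = sum(1 for d in readings if dist_to_desk < d < dist_to_opposite_finger)
    return n * footprint_area
```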
- in addition to the contactable area of the operation object, distances measured by the distance sensors 41, an orientation (inclination) of the operation object with respect to the environment, and the like can be used as the geometric information. An example of using information other than the contactable area of the operation object as the geometric information will be described later with reference to an example of executing another task.
- in step S6, the initial position movement control unit 105 moves the hand portion 14 to the initial position calculated by the hand and finger initial position determination unit 103.
- in step S7, the initial position movement control unit 105 brings the hand portion 14 into contact with the operation object. Specifically, as described with reference to A of FIG. 8, the initial position movement control unit 105 brings the right finger 22R roughly into contact with the upper surface of the card C1.
- distances are measured by the distance sensors 41 L- 1 to 41 L- 3 , 41 L- 7 , and 41 L- 8 , which are use sensors.
- Light beams L 1 to L 5 indicated by broken lines in A of FIG. 8 represent light beams emitted from the distance sensors 41 L- 1 to 41 L- 3 , 41 L- 7 , and 41 L- 8 , respectively.
- the sensor values indicating the distances to the desk D1 are measured by the distance sensors 41L-1 to 41L-3 and 41L-7, and the sensor value indicating the distance to the right finger 22R is measured by the distance sensor 41L-8. Since there is no distance sensor that measures a distance within the effective distance range, the contactable area is 0 in this case.
- in step S8, the target value calculation unit 106 calculates a target value of the geometric information.
- the control of the hand portion 14 and the like is performed so as to make the geometric information close to the target value.
- in step S9, the task success/failure condition calculation unit 109 determines a success condition and a failure condition of the task.
- the task success/failure condition calculation unit 109 sets, as a threshold value, an area that allows the left finger 22 L to be in contact with the lower surface of the card C 1 , and determines, as a success condition, that the contactable area S est is larger than the threshold value.
- the target value calculated by the target value calculation unit 106 in step S 8 is an area used as a threshold of the success condition.
- the task success/failure condition calculation unit 109 determines, as a failure condition, that the contactable area S est is smaller than the target value and the sensor values of all the distance sensors 41 measuring sensor values that are out of the effective distance range are within an abnormal range. Of the distances out of the effective distance range, the distance from the distance sensor 41 provided on the left finger 22 L to the right finger 22 R is set as the abnormal range.
- a plurality of conditions may be determined as failure conditions. For example, together with the failure condition described above, a condition that the minimum value of the sensor values measured by the distance sensors 41 provided on the left finger 22L (the shortest distance to the desk D1) becomes smaller than that at the start of the task may be determined as a failure condition.
- when such a condition is satisfied, the hand portion 14 cannot slide parallel to the top plate of the desk D1, and the right finger 22R is away from the card C1.
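- The success and failure checks above could be sketched as follows, assuming the out-of-range readings have already been separated out and that the abnormal range is a small band around the left-finger-to-right-finger distance; the tolerance is an illustrative assumption.

```python
def succeeded(s_est, s_target):
    """Success condition: the contactable area has reached the area that allows the
    left finger 22L to contact the lower surface of the card C1."""
    return s_est >= s_target

def failed(s_est, s_target, out_of_range_readings, finger_to_finger_dist, tol=0.002):
    """Failure condition: the area is still below the target while every sensor that
    reads outside the effective range reports roughly the distance to the right
    finger 22R, i.e. the card no longer sits between the sensor and that finger."""
    all_abnormal = all(abs(d - finger_to_finger_dist) <= tol
                       for d in out_of_range_readings)
    return s_est < s_target and bool(out_of_range_readings) and all_abnormal
```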
- in step S10, the geometric information estimation unit 107 calculates the contactable area S est on the basis of the sensor values measured by the distance sensors 41.
- in step S11, the positioning control unit 108 performs positioning control on the basis of the contactable area S est estimated by the geometric information estimation unit 107.
- the positioning control unit 108 slides the hand portion 14 - 1 in the state illustrated in A of FIG. 8 in parallel to the desk D 1 .
- by sliding the hand portion 14-1 to translate the card C1, a part of the card C1 protrudes from the edge portion of the desk D1 as illustrated in B of FIG. 8.
- in this state, the distance sensors 41L-3 measure sensor values within the effective distance range, that is, the distances from the distance sensors 41 to the card C1, and a predetermined area is obtained as the contactable area S est.
- the distance sensors 41L-1 and 41L-2 measure the distances from the distance sensors 41 to the desk D1, that is, sensor values out of the effective distance range. Furthermore, the distance sensors 41L-7 and 41L-8 measure the distances to the right finger 22R, that is, sensor values out of the effective distance range.
- the control by the positioning control unit 108 is performed such that the hand portion 14 - 1 (drive unit 121 ) is moved while being decelerated as the contactable area S est , which is geometric information, increases.
- in step S12, the task success/failure determination unit 110 determines whether or not an abnormality has occurred.
- in step S13, the task success/failure determination unit 110 determines whether or not the task has succeeded.
- in a case where the contactable area S est satisfies the success condition, it is determined that the task has succeeded.
- in a case where it is determined in step S13 that the task has not succeeded, the processing returns to step S10, and the subsequent processing is repeated.
- the positioning control unit 108 continues to slide the hand portion 14 - 1 until the contactable area reaches the target value.
- in a case where it is determined that the task has succeeded, the positioning control unit 108 stops the sliding of the hand portion 14-1 in step S14 according to a stop command from the task success/failure determination unit 110.
- FIG. 9 is a diagram subsequent to FIG. 8 , illustrating a state at the time of execution of the task.
- the contactable area S est reaches the target value, and it is determined that the task of translating the card C1 to the front side has succeeded. At this point, the task of bringing the left finger 22L into contact with the lower surface of the card C1 is ready to be executed next.
- in step S15, the task determination unit 102 determines whether or not all the tasks have been completed.
- in a case where it is determined in step S15 that all the tasks have not been completed because, for example, the task of bringing the left finger 22L into contact with the lower surface of the card C1 remains, the target value calculation unit 106 calculates, in step S16, a command value for the next task on the basis of the current sensor values. The calculation of the command value for the next task is performed after the task of bringing the left finger 22L into contact with the lower surface of the card C1 is determined by the task determination unit 102 as the task to be executed next.
- the target value calculation unit 106 calculates a target position of the left finger 22 L as a command value on the basis of the sensor values acquired by the distance sensors 41 L- 1 to 41 L- 3 , 41 L- 7 , and 41 L- 8 .
- after the command value for the next task is calculated, the processing returns to step S3, and processing similar to the processing described above is performed.
- the processing of moving the left finger 22 L while monitoring the execution status of the task using the geometric information is performed, so that the left finger 22 L comes into contact with the lower surface of the region of the card C 1 protruding from the edge portion of the desk D 1 and the card C 1 is held as illustrated in B of FIG. 9 .
- the distance sensor 41 L- 5 provided at the fingertip of the left finger 22 L is used as a use sensor.
- the distance from the distance sensor 41 L- 5 which is a use sensor, to the card C 1 is used as the geometric information.
- in a case where the failure condition is satisfied, the task success/failure determination unit 110 determines that an abnormality has occurred.
- FIGS. 10 and 11 are diagrams illustrating states at the time of failure of a task.
- in the case illustrated in B of FIG. 10, the card C1 is slightly translated but does not protrude from the edge portion of the desk D1.
- sensor values out of the effective distance range are measured by the distance sensors 41 L- 1 and 41 L- 2 , respectively.
- the sensor values within the abnormal range are measured by the distance sensors 41 L- 3 , 41 L- 7 , and 41 L- 8 , respectively.
- since the failure condition is not satisfied in this state, the sliding of the hand portion 14-1 is continued.
- the card C 1 is slightly translated further as illustrated in A of FIG. 11 .
- the contactable area S est of the card C 1 is smaller than that in the case of A of FIG. 9 in which the hand portion 14 - 1 can be slid without slipping.
- sensor values within the effective distance range are measured only by the distance sensors 41 L- 1 . Since the contactable area S est is smaller than the target value and the sensor values of all the distance sensors 41 measuring the sensor values out of the effective distance range are within the abnormal range, it is determined that the failure condition is satisfied, that is, the task of translating the card C 1 to the front side has failed.
- in a case where it is determined in step S12 that an abnormality has occurred due to satisfaction of the above-described failure condition, the task determination unit 102 sets, in step S17, a return operation task on the basis of the current sensor values. Thereafter, the processing returns to step S3, and the return operation task is executed by processing similar to the processing described above.
- the task determination unit 102 determines a return operation task of performing the task of translating the card C 1 again as a task to be executed next.
- the target value calculation unit 106 calculates a target value of the movement amount of the return operation task by Equation (2) on the basis of the contactable area S est .
- n ref = |(S ref − S est )/A| + n (2)
- n ref represents the number of the distance sensors 41 that measure sensor values within the effective distance range necessary for the contactable area to reach the target value.
- n represents the current number of the distance sensors 41 that measure sensor values within the effective distance range.
- S ref represents a target value of the contactable area.
- Equation (2) indicates that the contactable area S est becomes larger than the target value S ref if the region of the card C 1 corresponding to the footprint area of the n ref distance sensors 41 protrudes from the edge portion of the desk D 1 .
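- A short sketch of Equation (2) and the resulting return motion; the rounding up to a whole number of sensors and the sensor pitch used to convert the count into a travel distance are assumptions added for illustration.

```python
import math

def return_operation(s_ref, s_est, footprint_area, n_in_range, sensor_pitch_m=0.01):
    """Equation (2): n_ref = |(S_ref - S_est) / A| + n gives how many distance sensors
    must read within the effective range for the contactable area to reach the
    target. As the return operation, the hand portion is moved forward by a
    distance corresponding to n_ref sensors (cf. B of FIG. 11)."""
    n_ref = math.ceil(abs(s_ref - s_est) / footprint_area) + n_in_range
    travel = n_ref * sensor_pitch_m   # assumed pitch between adjacent sensors
    return n_ref, travel
```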
- the positioning control unit 108 moves the hand portion 14 - 1 to a position where the number of the distance sensors 41 that measure distances to the desk D 1 is n ref . For example, as illustrated in B of FIG. 11 , the positioning control unit 108 moves the hand portion 14 - 1 forward (in the direction of the fingertip of the hand portion 14 - 1 ) indicated by the white arrow by a distance corresponding to two of the distance sensors 41 (the distance sensors 41 L- 1 and 41 L- 2 ).
- the operation of moving the hand portion 14 - 1 by a distance corresponding to n ref of the distance sensors 41 is performed as the return operation task.
- after the return operation task, the task of translating the card C1 is performed again, for example, with an increased force of pressing the card C1 with the right finger 22R.
- it is also possible to once move the hand portion to a position away from the desk, measure the environment again using a visual sensor, and move the hand portion again according to the recognized position of the card.
- in a case where it is determined in step S15 that all the tasks have been completed, the positioning control unit 108 releases, in step S18, the contact with the operation object and ends the processing.
- FIG. 12 is a diagram illustrating an example of control by the positioning control unit 108 .
- the control of the hand portion 14 by the positioning control unit 108 is performed such that the hand portion 14 is moved while being decelerated as the contactable area S est , which is the geometric information, increases.
- the control by the positioning control unit 108 is implemented by a subtractor 131 , a converter 132 , a subtractor 133 , and a controller 134 .
- the subtractor 131 , the converter 132 , the subtractor 133 , and the controller 134 which are surrounded by a broken line in FIG. 12 , are provided in the positioning control unit 108 .
- the subtractor 131 calculates a difference between the target value S ref of the contactable area and the contactable area S est estimated by the geometric information estimation unit 107 , and outputs the difference to the converter 132 .
- the converter 132 applies the conversion coefficient K to the difference supplied from the subtractor 131 to calculate the target value v ref of the moving speed. For example, as the difference between the target value S ref and the contactable area S est is smaller, a smaller value is calculated as the target value v ref of the moving speed.
- the converter 132 outputs the target value v ref of the moving speed to the subtractor 133 .
- the subtractor 133 calculates a difference between the target value v ref supplied from the converter 132 and the actual moving speed v of the drive unit 121 , and outputs the difference to the controller 134 .
- the controller 134 controls the drive unit 121 so that the difference between the moving speeds supplied from the subtractor 133 becomes 0.
- the actual moving speed v of the drive unit 121 is measured by a sensor provided in each unit and supplied to the subtractor 133 . Furthermore, the three-dimensional coordinates p of each measurement point of the distance sensor 41 are supplied to the geometric information estimation unit 107 . For example, in a case where N distance sensors 41 are provided, the three-dimensional coordinates p are represented by a 3 ⁇ N matrix.
- the geometric information estimation unit 107 estimates the contactable area S est on the basis of the three-dimensional coordinates p, and outputs the contactable area S est to the subtractor 131 .
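- The control path of FIG. 12 amounts to a proportional speed law followed by an inner velocity loop; a hedged sketch is given below, with the gain K and the inner controller left as placeholders.

```python
class AreaBasedSpeedControl:
    """Sketch of FIG. 12: the speed target shrinks as S_est approaches S_ref,
    so the hand portion 14 decelerates as the contactable area grows."""

    def __init__(self, s_ref, k_gain, velocity_controller):
        self.s_ref = s_ref                              # target contactable area (target value calculation unit 106)
        self.k = k_gain                                 # conversion coefficient K (converter 132)
        self.velocity_controller = velocity_controller  # controller 134, e.g. a PI loop

    def step(self, s_est, measured_speed):
        v_ref = self.k * (self.s_ref - s_est)           # subtractor 131 and converter 132
        speed_error = v_ref - measured_speed            # subtractor 133
        return self.velocity_controller(speed_error)    # drive command for the drive unit 121
```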
- FIG. 13 is a diagram illustrating an example of the contactable area and the command value of the moving speed of the hand portion 14 .
- the upper part of FIG. 13 indicates the contactable area S est, and the lower part indicates the command value of the moving speed with respect to the hand portion 14.
- the horizontal axis in FIG. 13 represents time.
- the period from time t 0 to time t 1 is a period in which a sensor value within the effective distance range is not measured by any of the distance sensors 41 .
- the contactable area S est is 0.
- the command value of the moving speed is a predetermined speed.
- the period from time t 1 to time t 2 is a period in which sensor values within the effective distance range are measured by the distance sensors 41 L- 3 .
- the contactable area S est increases more than before time t 1 .
- the command value of the moving speed is a value lower than that before time t 1 .
- the period after time t 2 is a period in which sensor values within the effective distance range are measured by the distance sensors 41 L- 2 and 41 L- 3 .
- the contactable area S est increases more than before time t 2 .
- the command value of the moving speed is a value lower than that before time t 2 .
- control of the hand portion 14 by the positioning control unit 108 is performed such that the moving speed of the hand portion 14 is adjusted according to the change in the contactable area S est as the geometric information.
- the distance sensors 41 are positioned at positions where it is easy to measure the displacement between the operation object and the environment according to the content of the task to be executed. Therefore, it is possible to constantly monitor the progress status of a task in which a robot hand would shield the operation object from a camera at a fixed position to make measurement difficult.
- the robot 1 can improve the accuracy and success rate of a task of moving an operation object to a target position.
- with the robot 1, it is not necessary to measure the contact position between the card C1 and the hand portion 14-1, and it is possible to succeed in the task of translating the card C1 only by bringing the hand portion 14-1 into contact with an approximate target position.
- the visual sensors 12 A are used only for measuring the positional relationship between the card C 1 and the desk D 1 at the start of the task.
- the control of the hand portion 14 - 1 and the success determination of the task are performed on the basis of the relative displacement between the card C 1 and the desk D 1 measured by the distance sensor 41 provided on the left finger 22 L. Since the contact position between the card C 1 and the hand portion 14 - 1 is not so important, the hand portion 14 - 1 does not need to contact the center of the card C 1 , and may contact the front side or the back side of the card C 1 .
- the geometric information is calculated on the basis of the distance distribution information measured by the distance sensors 41 , and the control of the hand portion 14 and the success determination of the task are performed according to the change in the geometric information.
- the robot 1 can control the hand portion 14 and determine the success of the task with easy observation and a simple algorithm.
- FIGS. 14 and 15 are diagrams illustrating a state at the time of success of a task of wiping a window W 1 using a cleaner C 11 .
- the cleaner C 11 is pressed against the surface of the window W 1 by the hand portion 14 - 1 .
- a frame F 1 thicker than the window W 1 is provided at the end of the window W 1 to surround the window W 1 .
- the operation object is the cleaner C 11 .
- objects included in the environment surrounding the operation object are the window W 1 and the frame F 1 .
- FIG. 14 is a view of the hand portion 14-1 pressing the cleaner C11 against the surface of the window W1, which is a vertical surface, as viewed from below, and FIG. 15 is a view of the hand portion 14-1 as viewed from the front.
- the horizontal direction in FIG. 14 is the x-axis direction
- the vertical direction in FIG. 14 is the z-axis direction
- the horizontal direction in FIG. 15 is the x-axis direction
- the vertical direction in FIG. 15 is the y-axis direction.
- the left finger 22 L and the right finger 22 R are spread, and the cleaner C 11 is pressed against the surface of the window W 1 by the base portion 21 as the palm of the hand portion 14 - 1 .
- the left finger 22 L is positioned on the right side in the drawing
- the right finger 22 R is positioned on the left side in the drawing.
- the initial position of the hand portion 14 illustrated in A of FIG. 14 is a position where the distance sensors 41 - 0 provided on the base portion 21 and the distance sensor 41 L provided on the inner side of the left finger 22 L are arranged in parallel to the window W 1 .
- the robot 1 brings the hand portion 14 - 1 roughly into contact with the cleaner C 11 . Since the base portion 21 , the left finger 22 L, and the right finger 22 R are provided with the distance sensors 41 , the robot 1 can detect which part of the hand portion 14 - 1 the cleaner C 11 is in contact with on the basis of the sensor values measured by the distance sensors 41 when the robot 1 is brought roughly into contact with the cleaner C 11 .
- distances are measured by the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, which are use sensors.
- in A of FIG. 14, sensor values indicating distances to the frame F1 (sensor values within the effective distance range) are measured by the distance sensors 41L-1.
- the effective distance range is a range of a distance larger than 0 and smaller than the distance to the window W 1 , or a range of a distance larger than the distance to the window W 1 .
- sensor values out of the effective distance range are measured by the distance sensors 41 L- 2 , 41 L- 3 , and 41 L- 7 .
- a sensor value within a contact range is measured by the distance sensor 41 L- 8 .
- the contact range indicates that the sensor value is 0.
- the progress status of the task is monitored and the hand portion 14 - 1 is controlled using, as the geometric information, an interval ⁇ x that is the interval between the end of the cleaner C 11 and the end of the window W 1 .
- the interval Δx indicated by the bidirectional arrow in A of FIG. 14 is expressed by Equation (3) below: Δx = min(x e ) − max(x c ) (3)
- Equation (3) x e represents a set of positions of the distance sensors 41 that measure sensor values within the effective distance range.
- x c represents a set of positions of the distance sensors 41 that acquire sensor values within the contact range.
- min(x e ) represents the position of the left end of the frame F 1 (the end on the side of the surface in contact with the window W 1 ), and max(x c ) represents the right end position of the cleaner C 11 .
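- Equation (3) could be sketched as below, assuming the x position of each use sensor along the finger and a predicate for the effective distance range are available; the contact threshold is an illustrative assumption.

```python
def estimate_gap(sensor_x, readings, in_effective_range, contact_eps=1e-3):
    """Equation (3): delta_x = min(x_e) - max(x_c). x_e are positions of sensors whose
    reading is within the effective range (seeing the frame F1); x_c are positions of
    sensors whose reading is in the contact range (touching the cleaner C11)."""
    x_e = [x for x, d in zip(sensor_x, readings) if in_effective_range(d)]
    x_c = [x for x, d in zip(sensor_x, readings) if d <= contact_eps]
    if not x_e or not x_c:
        return None   # the interval is not observable in this configuration
    return min(x_e) - max(x_c)
```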
- the positioning control by the hand portion 14 - 1 is performed such that the interval ⁇ x decreases.
- the robot 1 moves the hand portion 14 - 1 in the state illustrated in A of FIG. 14 in the +x direction (the direction toward the fingertip side of the left finger 22 L) indicated by the white arrow.
- the hand portion 14 - 1 By moving the hand portion 14 - 1 , the right end of the cleaner C 11 is brought close to the frame F 1 as illustrated in B of FIG. 14 .
- the distance sensors 41 L- 1 to 41 L- 3 and 41 L- 7 measure sensor values within the effective distance range.
- min(x e ) is a position corresponding to the position of the distance sensor 41 L- 7
- max(x c ) is a position corresponding to the position of the distance sensor 41 L- 8 , and an interval ⁇ x of a predetermined length is obtained.
- the robot 1 moves the hand portion 14 - 1 in the +y direction indicated by the white arrow.
- the cleaner C 11 also moves in the same +y direction.
- the sensor values measured by the distance sensors 41 when it is determined that the task has succeeded are used for a task of moving the hand portion 14 - 1 in the +y direction as the next operation.
- the movement of the hand portion 14 - 1 is controlled such that the interval ⁇ x, which is the distance between the cleaner C 11 and the frame F 1 , is maintained at a constant distance.
- typically, wiping a window using a hand portion is performed with the cleaner held by the left and right finger portions of the hand portion.
- in the robot 1, on the other hand, the left finger 22L and the right finger 22R are spread, the cleaner C11 is pressed against the surface of the window W1 by the base portion 21 of the hand portion 14-1, and the wiping work is performed while measuring the interval Δx as the geometric information. Since wiping is performed while measuring the interval Δx, it is possible to move the cleaner C11 to almost the end of the window W1.
- control of the hand portion 14 - 1 is performed such that the hand portion 14 - 1 is moved while being decelerated as the interval ⁇ x, which is the geometric information, decreases.
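- Analogously to the area-based law of FIG. 12, the deceleration for the wiping task could be sketched as a speed command proportional to the remaining interval; the gain and speed cap are assumed parameters.

```python
def wiping_speed_command(delta_x, k_gain=2.0, v_max=0.05):
    """Move quickly while the cleaner C11 is far from the frame F1 and slow down
    as the interval delta_x shrinks toward zero (cf. FIG. 16)."""
    if delta_x is None:
        return 0.0            # interval not observable: stop rather than guess
    return min(v_max, k_gain * max(delta_x, 0.0))
```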
- FIG. 16 is a diagram illustrating an example of the interval ⁇ x and the command value of the moving speed of the hand portion 14 - 1 .
- the upper part of FIG. 16 indicates the interval ⁇ x, and the lower part indicates the command value of the moving speed for the hand portion 14 - 1 .
- the horizontal axis in FIG. 16 represents time.
- the period from time t 0 to time t 1 is a period in which sensor values within the effective distance range are measured by the distance sensors 41 L- 1 .
- the interval ⁇ x is a predetermined distance.
- the command value of the moving speed is a predetermined speed.
- a period from time t 1 to time t 2 is a period in which the hand portion 14 - 1 moves in the +x direction.
- the interval ⁇ x gradually decreases from time t 1 .
- the command value of the moving speed gradually decreases from time t 1 .
- the period after time t 2 is a period in which sensor values within the effective distance range are measured by the distance sensors 41 L- 1 to 41 L- 3 and 41 L- 7 .
- the interval ⁇ x is a value lower than that before time t 2 .
- the command value of the moving speed is a value lower than that before time t 2 .
- FIG. 17 is a diagram illustrating a state at the time of failure of a task of wiping the window W 1 .
- at the time of failure, the movement amount of the cleaner C11 is smaller than the movement amount of the hand portion 14-1 as illustrated in B of FIG. 17. That is, the contact position between the cleaner C11 and the hand portion 14-1 has deviated.
- the information processing device 51 moves the hand portion 14 - 1 in the direction indicated by the white arrow (the direction of the fingertip of the right finger 22 R) by a distance corresponding to one of the distance sensors 41 (distance sensor 41 L- 8 ) by the return operation task. Thereafter, the robot 1 brings the hand portion 14 - 1 into contact with the cleaner C 11 again and performs the wiping task again.
- control of the hand portion 14 may be performed using not only the sensor values measured by the distance sensors 41 but also a sensor value measured by a vibration sensor or a tactile sensor provided on the hand portion 14 .
- for example, slipping between the hand portion 14 and the operation object is detected on the basis of a sensor value measured by a vibration sensor or a tactile sensor, and failure of the task is thereby determined.
- the robot 1 can improve the accuracy of the failure determination of the task.
- FIG. 18 is a diagram illustrating a state at the time of success of a task of cutting an object Ob 1 by operating a kitchen knife K 1 .
- the object Ob 1 is placed on the desk D 1 .
- the robot 1 holds the handle portion of the kitchen knife K 1 with the finger portions 22 , and applies the blade portion of the kitchen knife K 1 to the object Ob 1 with the fingertip facing downward.
- the object Ob 1 is food such as a vegetable or a fruit.
- the operation object is the kitchen knife K 1 .
- objects included in the environment surrounding the operation object are the desk D 1 (table) and the spherical object Ob 1 .
- the distance sensor 41 - 5 at the fingertip is determined as a use sensor, and the distance is measured by the distance sensor 41 - 5 .
- a sensor value indicating the distance to the desk D 1 is measured by the distance sensor 41 - 5 as indicated by the light beam L 11 indicated by a broken line.
- the progress status of the task is monitored and the hand portion 14 - 1 is controlled using, as the geometric information, the distance from the blade portion of the kitchen knife K 1 to the desk D 1 measured by the distance sensor 41 - 5 .
- the distance measured by the distance sensor 41 - 5 is the same distance as the distance from the blade portion of the kitchen knife K 1 to the desk D 1 indicated by the bidirectional arrow in A of FIG. 18 .
- a plurality of distances to the handle portion of the kitchen knife K 1 are measured by the distance sensors 41 - 0 provided on the base portion 21 .
- the plurality of distance sensors 41 - 0 is provided on the upper surface of the base portion 21 corresponding to the palm.
- the inclination (orientation) of the kitchen knife K 1 obtained on the basis of the sensor values measured by the distance sensors 41 - 0 is used as the geometric information together with the distance to the desk D 1 .
- the inclination of the kitchen knife K 1 is represented by, for example, a difference between the distance measured by the light beam L 12 and the distance measured by the light beam L 13 .
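- as a rough illustration, this difference can be turned into an angle if the spacing between the two distance sensors is known; the spacing value and the function below are assumptions, not part of the disclosure.

```python
import math

def knife_inclination_rad(dist_l12_m: float, dist_l13_m: float,
                          sensor_spacing_m: float = 0.02) -> float:
    """Estimate the inclination of the kitchen knife K1 from two palm-side readings.

    dist_l12_m, dist_l13_m: distances measured by two of the distance sensors 41-0
                            on the base portion 21 (light beams L12 and L13).
    sensor_spacing_m:       assumed spacing between those two sensors [m].
    """
    # equal readings mean the handle is parallel to the palm (inclination 0)
    return math.atan2(dist_l13_m - dist_l12_m, sensor_spacing_m)
```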
- the positioning control for lowering the kitchen knife K 1 is performed such that the distance to the desk D 1 decreases. Specifically, the positioning control is performed by moving the hand portion 14 - 1 in a downward direction indicated by the white arrow in A of FIG. 18 in a state where the blade portion of the kitchen knife K 1 is in contact with the object Ob 1 . By moving the hand portion 14 - 1 , the blade portion of the kitchen knife K 1 is pushed into the object Ob 1 as illustrated in A of FIG. 18 .
- the sensor value measured by the force sensor provided at the wrist portion of the hand portion 14 - 1 may be used for the success determination of the task.
- the force sensor provided at the wrist portion of the hand portion 14 - 1 measures, for example, the reaction force when the blade portion of the kitchen knife K 1 hits the desk D 1 .
- positioning control for separating the kitchen knife K 1 from the object Ob 1 is performed. Specifically, as illustrated in C of FIG. 18 , the positioning control is performed by moving the hand portion 14 - 1 in an upward direction indicated by the white arrow until the sensor value measured by the distance sensor 41 - 5 becomes sufficiently large.
- the present technology can also be applied to a task of positioning an object using a tool, such as a task of placing an object held by a tong at a specific position, by regarding the end effector to include the tool.
- FIG. 19 is a diagram illustrating a state at the time of failure of the task of cutting the object Ob 1 by operating the kitchen knife K 1 .
- the robot 1 adjusts the orientation of pushing the kitchen knife K 1 into the object Ob 1 on the basis of the inclination of the kitchen knife K 1 as the geometric information. Specifically, as indicated by the white arrow in C of FIG. 19 , the orientation is adjusted by changing the orientation of the hand portion 14 - 1 or re-holding the kitchen knife K 1 so that the inclination of the kitchen knife K 1 becomes 0. Such a task of adjusting the orientation is executed as a return operation task.
- the robot 1 can resume the task of cutting the object Ob 1 without performing the task from the beginning again.
- FIG. 20 is a diagram illustrating a state at the time of failure of a task of placing a book B 1 between a book B 11 and a book B 12 other than the book B 1 .
- FIG. 20 is a top view of the positional relationship between the books.
- the book B 11 and the book B 12 are provided in parallel to each other.
- the robot 1 holds the book B 1 with the right finger 22 R and the left finger 22 L, and moves the book B 1 so as to place the book B 1 in the gap between the book B 11 and the book B 12 .
- the operation object is the book B 1 .
- objects included in the environment surrounding the operation object are the book B 11 and the book B 12 .
- the distance sensor 41 L- 5 at the fingertip of the left finger 22 L and the distance sensor 41 R- 5 at the fingertip of the right finger 22 R are determined as use sensors, and the distance sensor 41 L- 5 and the distance sensor 41 R- 5 measure distances.
- the distance to the book B 11 is measured by a distance sensor 41 L- 5 provided at the fingertip of the left finger 22 L.
- the distance to the book B 12 is measured by the distance sensor 41 R- 5 provided at the fingertip of the right finger 22 R.
- an average value of sensor values measured by the distance sensor 41 L- 5 and the distance sensor 41 R- 5 is used as geometric information to monitor the progress status of the task and control the hand portion 14 - 1 .
- the positioning control for placing the book B 1 is performed such that the book B 1 is inserted into the gap between the book B 11 and the book B 12 until the average value of the sensor values measured by the distance sensor 41 L- 5 and the distance sensor 41 R- 5 becomes 0.
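- a minimal monitoring step for this task might look as follows; the thresholds are illustrative assumptions.

```python
def book_insertion_status(dist_left_m: float, dist_right_m: float,
                          done_threshold_m: float = 0.003,
                          fail_threshold_m: float = 0.08) -> str:
    """Evaluate one step of the book-insertion task.

    dist_left_m, dist_right_m: readings of the distance sensors 41L-5 and 41R-5
                               at the fingertips [m].
    Returns "success", "failure", or "continue".
    """
    average = 0.5 * (dist_left_m + dist_right_m)  # geometric information for this task
    if average <= done_threshold_m:
        return "success"      # the book B1 has been inserted into the gap
    if average >= fail_threshold_m:
        return "failure"      # a surrounding book has moved away unexpectedly
    return "continue"         # keep inserting while monitoring
```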
- when the book B 1 is inserted, the book B 1 may come into contact with the book B 11 in the surroundings as illustrated in B of FIG. 20 , and the book B 11 may move as illustrated in C of FIG. 20 .
- in this case, the sensor value measured by the distance sensor 41 L- 5 becomes a large value. Therefore, the average value of the sensor values measured by the distance sensor 41 L- 5 and the distance sensor 41 R- 5 also becomes a large value.
- in a case where the average value of the sensor values measured by the distance sensor 41 L- 5 and the distance sensor 41 R- 5 becomes large, it is determined that the task has failed. In a case where it is determined that the task has failed, a recovery operation is performed. Specifically, as illustrated in D of FIG. 20 , the robot 1 returns the book B 1 to the initial position, and returns the book B 11 that has moved to its original position.
- after performing the recovery operation, the robot 1 can perform the next task, for example, the same task again.
- the robot 1 can detect an abnormality in a case where an object in the surroundings of the operation object moves unexpectedly.
- although the geometric information is obtained on the basis of the sensor values measured by the distance sensors 41 in the examples described above, the geometric information may be obtained on the basis of a measurement result by a different sensor such as a time of flight (ToF) camera or a stereo camera.
- various sensors capable of acquiring distance distribution information can be used for measurement.
- the geometric information may be obtained on the basis of a map surrounding the hand portion 14 created by moving the hand portion 14 to perform scanning and merging time-series data pieces of measurement results by the distance sensors 41 .
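- a sketch of such time-series merging, assuming the hand pose is known at each time step and the sensor layout is given in the hand frame (all names are illustrative):

```python
import numpy as np

def accumulate_scan(point_map: list, hand_pose: np.ndarray,
                    sensor_offsets: np.ndarray, sensor_dirs: np.ndarray,
                    ranges: np.ndarray) -> None:
    """Merge one time step of distance readings into a point map around the hand.

    hand_pose:      4x4 homogeneous transform of the hand portion 14 for this time step.
    sensor_offsets: Nx3 sensor positions in the hand frame.
    sensor_dirs:    Nx3 unit ray directions of the distance sensors 41 in the hand frame.
    ranges:         N measured distances (np.nan for measurement omissions).
    """
    for offset, direction, r in zip(sensor_offsets, sensor_dirs, ranges):
        if np.isnan(r):
            continue                              # skip invalid readings
        hit_hand = offset + r * direction         # hit point in the hand frame
        hit_world = hand_pose @ np.append(hit_hand, 1.0)
        point_map.append(hit_world[:3])           # the map grows as the hand scans
```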
- the distance sensors 41 may be provided at portions other than the hand portion 14 .
- the other hand (the hand portion 14 - 2 ) is positioned under the desk D 1 , and positioning control is performed on the basis of sensor values measured by the distance sensors 41 provided on the hand portion 14 - 2 .
- the robot 1 can perform positioning control using the hand portion 14 - 2 at the same time as positioning the hand portion 14 - 1 at the initial position. Therefore, the robot 1 can shorten the time required for the task.
- the operation object may be moved to the target position on the basis of information output from an RGB camera or a color sensor mounted on the hand portion 14 .
- FIG. 21 is a diagram illustrating a state of a task of positioning an operation object at a specific position of the desk D 1 .
- in the example of FIG. 21 , there is no shape feature or mark on the top plate of the desk D 1 .
- a position P 1 on the top plate of the desk D 1 indicated by the star is a target position where the operation object is to be positioned.
- the robot 1 holds the object Ob 11 with the finger portions 22 and moves the object Ob 11 so as to place the object Ob 11 at the position P 1 .
- the operation object is the object Ob 11 .
- an object included in the environment surrounding the operation object is the desk D 1 .
- a target distance from the end of the desk D 1 to the position P 1 indicated by the bidirectional arrow in A of FIG. 21 is calculated on the basis of the image information output by the visual sensors 12 A.
- the plurality of distance sensors 41 provided on the arm portion 13 - 1 is determined as use sensors, and the distances are measured by the distance sensors 41 .
- the distance sensors 41 provided on the arm portion 13 - 1 are arranged in parallel to the top plate of the desk D 1 , and the distance to the desk D 1 is measured by the light beam L 31 .
- the arm portion 13 - 1 is controlled using the distance from the end of the desk D 1 to the object Ob 11 as the geometric information.
- the distance from the end of the desk D 1 to the object Ob 11 is obtained on the basis of sensor values measured by the plurality of distance sensors 41 provided on the arm portion 13 - 1 .
- the positioning control for positioning the object Ob 11 at the position P 1 is performed so that the difference between the distance from the end of the desk D 1 to the object Ob 11 and the target distance becomes small. Specifically, by moving the arm portion 13 - 1 in the left direction indicated by the white arrow in A of FIG. 21 , the distance from the end of the desk D 1 to the object Ob 11 becomes the same as the target distance indicated by the bidirectional arrow, as illustrated in B of FIG. 21 .
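- a simplified positioning step is sketched below under the assumption that the arm-mounted sensors read the horizontal distance toward the edge of the desk; the averaging, offset, and gain are illustrative choices.

```python
import numpy as np

def edge_to_object_distance(ranges_m: np.ndarray,
                            sensor_to_object_offset_m: float = 0.0) -> float:
    """Estimate the distance from the end of the desk D1 to the object Ob11.

    ranges_m: readings of the distance sensors 41 on the arm portion 13-1, whose
              beams run parallel to the top plate toward the edge of the desk.
    """
    valid = ranges_m[np.isfinite(ranges_m)]
    return float(np.mean(valid)) - sensor_to_object_offset_m


def arm_velocity_command(ranges_m: np.ndarray, target_distance_m: float,
                         gain: float = 0.5) -> float:
    """Proportional command that drives the distance error toward zero."""
    return gain * (edge_to_object_distance(ranges_m) - target_distance_m)
```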
- the distance sensors 41 provided on the arm portion 13 - 1 measure sensor values indicating the distance to the desk D 1 .
- the arm portion 13 - 1 is moved downward so as to place the object Ob 11 on the desk D 1 , so that the object Ob 11 is positioned at the position P 1 .
- An actuator other than the electromagnetic motor may be mounted on the end effector.
- a suction type end effector is mounted on the robot 1 .
- the robot 1 can easily hold a light object such as a card using a suction type end effector.
- the robot 1 can first move the object to the front side of the desk while sucking the object, and then suck and hold the back of the object with a suction mechanism of another finger portion. By first moving a heavy object to the front side of the desk and then holding it, the robot 1 can increase the stability of holding.
- the positioning control may be performed on the basis of the sensor values measured by the distance sensors 41 and information measured at the start of the task by the visual sensors 12 A (a three-dimensional measuring instrument such as a Depth camera). Since the distance sensors 41 are discretely arranged, the accuracy of the geometric information obtained on the basis of the sensor values measured by such distance sensors 41 may be low.
- by complementing the distance information between the distance sensors 41 on the basis of the information measured by the visual sensors 12 A, the robot 1 can obtain highly accurate geometric information, and the accuracy of positioning control of a small object or an object having a complicated shape can be improved.
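- one possible way to complement the sparse readings with the initial depth measurement is sketched below; the 1-D interpolation and the sorted-coordinate assumption are simplifications for illustration.

```python
import numpy as np

def complemented_profile(sensor_x: np.ndarray, sensor_d: np.ndarray,
                         depth_x: np.ndarray, depth_d: np.ndarray,
                         query_x: np.ndarray) -> np.ndarray:
    """Fill in distance information between the discretely arranged distance sensors 41.

    sensor_x, sensor_d: coordinates and live readings of the distance sensors 41.
    depth_x, depth_d:   a denser profile measured by the visual sensors 12A
                        (a Depth camera) at the start of the task.
    query_x:            coordinates where a distance estimate is needed.
    All coordinate arrays are assumed to be sorted in increasing order.
    """
    depth_at_sensors = np.interp(sensor_x, depth_x, depth_d)
    # deviation of the live readings from the static depth profile,
    # interpolated across the gaps between the sensors
    deviation = np.interp(query_x, sensor_x, sensor_d - depth_at_sensors)
    return np.interp(query_x, depth_x, depth_d) + deviation
```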
- the operation of moving the hand portion 14 - 1 to the initial position may be performed a plurality of times. For example, in a case where the quality of the sensor values measured by the distance sensors 41 is poor (a case where noise is large, a case where there are many measurement omissions, and the like) after the hand portion 14 - 1 is moved to the initial position, the robot 1 changes the orientation of the hand portion 14 - 1 from the initial position or moves the hand portion 14 - 1 to an initial position that is another candidate.
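- a simple data-quality gate of this kind might be written as follows; the thresholds are illustrative assumptions.

```python
import numpy as np

def needs_new_initial_position(recent_ranges: np.ndarray,
                               max_missing_ratio: float = 0.3,
                               max_noise_std_m: float = 0.01) -> bool:
    """Decide whether the hand portion 14-1 should be moved to another candidate
    initial position because the sensor values are of poor quality.

    recent_ranges: T x N array of recent readings of the use sensors,
                   with np.nan marking measurement omissions.
    """
    missing_ratio = float(np.mean(np.isnan(recent_ranges)))
    noise_std = float(np.nanmax(np.nanstd(recent_ranges, axis=0)))
    return missing_ratio > max_missing_ratio or noise_std > max_noise_std_m
```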
- FIG. 22 is a diagram illustrating a configuration example of a system.
- the system illustrated in FIG. 22 is configured by providing the information processing device 51 as an external apparatus of the robot 1 .
- the information processing device 51 may be provided outside the housing of the robot 1 .
- Wireless communication of a predetermined standard such as a wireless LAN or long term evolution (LTE) is performed between the robot 1 and the information processing device 51 in FIG. 22 .
- Various types of information such as information indicating the state of the robot 1 and information indicating the detection result of the sensors are transmitted from the robot 1 to the information processing device 51 .
- Information for controlling the operation of the robot 1 and the like are transmitted from the information processing device 51 to the robot 1 .
- the robot 1 and the information processing device 51 may be directly connected as illustrated in A of FIG. 22 , or may be connected via a network 61 such as the Internet as illustrated in B of FIG. 22 .
- the operations of the plurality of robots 1 may be controlled by one information processing device 51 .
- the above-described series of processing can be performed by hardware or software.
- a program implementing the software is installed from a program recording medium to a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
- FIG. 23 is a block diagram illustrating a configuration example of hardware of a computer that performs the above-described series of processing using a program.
- a central processing unit (CPU) 1001 , a read only memory (ROM) 1002 , and a random access memory (RAM) 1003 are connected to each other by a bus 1004 .
- An input/output interface 1005 is further connected to the bus 1004 .
- An input unit 1006 including a keyboard, a mouse, and the like, and an output unit 1007 including a display, a speaker, and the like are connected to the input/output interface 1005 .
- a storage unit 1008 including a hard disk, a nonvolatile memory, and the like, a communication unit 1009 including a network interface and the like, and a drive 1010 that drives a removable medium 1011 are connected to the input/output interface 1005 .
- the CPU 1001 loads a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes the program, so that the above-described series of processing is performed.
- the program executed by the CPU 1001 is provided, for example, by being recorded in the removable medium 1011 or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and is installed in the storage unit 1008 .
- the program executed by the computer may be a program that causes pieces of processing to be performed in time series in the order described in the present specification, or may be a program that causes the pieces of processing to be performed in parallel or at necessary timing such as when a call is made.
- a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device having one housing in which a plurality of modules is housed are both systems.
- Embodiments of the present technology are not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present technology.
- the present technology can have a configuration of cloud computing in which one function is shared and processed in cooperation by a plurality of devices via a network.
- steps described in the above-described flowcharts can be performed by one device or can be shared and executed by a plurality of devices.
- the plurality of pieces of processing included in the one step can be executed by one device or can be shared and executed by a plurality of devices.
- the present technology may also have the following configurations.
- An information processing device including:
- the information processing device further including
- the information processing device according to any one of (2) to (7) further including:
- the information processing device according to any one of (1) to (9) further including
- An information processing method performed by an information processing device including:
Abstract
Description
- The present technology relates to an information processing device, an information processing method, and a program, and more particularly, to an information processing device, an information processing method, and a program capable of appropriately controlling positioning of an operation object.
- Various tasks using a robot hand, such as holding an object, are typically implemented by controlling a robot hand and performing success/failure determination of the task on the basis of a sensor value acquired by a sensor.
- For example, Patent Document 1 discloses a technique of detecting movement of a target object and a peripheral object on the basis of image information acquired by a vision sensor and force information acquired by a force sensor, and determining whether or not a robot is normally operating the target object.
- Patent Document 2 discloses a technique of moving a sensor unit to a position where measurement of an object is easy, and then moving a robot hand on the basis of the position and orientation of the object measured by the sensor unit to hold the object.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2016-196077
- Patent Document 2: Japanese Patent Application Laid-Open No. 2020-16446
- In the technique described in Patent Document 1, depending on the positional relationship between the vision sensor and the robot hand, an object may be shielded by the robot hand, so that the vision sensor may not be able to observe the object. In this case, it is difficult to control the robot hand on the basis of the image information.
- Furthermore, it is not preferable to estimate the positional relationship between the target object and the peripheral object and the movement amount on the basis of the force information acquired by the force sensor, from the viewpoint of estimation accuracy.
- In the technique described in Patent Document 2, the measurement result of the position and orientation of the object includes an error, so that the actual position and orientation of the object may deviate from the estimated position and orientation in a case where the robot hand is moved on the basis of one measurement result.
- Furthermore, there is a case where the information acquired by the sensor unit cannot be used for the success/failure determination of the task or for the control of the robot hand only by moving the sensor unit to a position where measurement is easy. For example, in a case where the robot hand brought close to an object to hold the object shields the object, the position and orientation of the object cannot be measured by the sensor unit.
- The present technology has been made in view of such situations, and an object thereof is to appropriately control positioning of an operation object.
- An information processing device according to one aspect of the present technology is an information processing device including a control unit configured to control a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
- In one aspect of the present technology, a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface is controlled on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
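- the loop below is a minimal sketch of this idea; the sensor, actuator, and estimation interfaces are placeholders, not the configuration disclosed in the embodiment.

```python
def run_positioning(read_distances, estimate_relationship, move_hand,
                    target: float, tolerance: float = 1e-3, gain: float = 0.5) -> None:
    """Repeatedly re-estimate the positional relationship from the surface-mounted
    sensors while the hand portion moves, and command the hand toward the target.

    read_distances:        callable returning the current sensor distances.
    estimate_relationship: callable mapping distances to a scalar positional quantity.
    move_hand:             callable that moves the hand portion by a given amount.
    """
    while True:
        distances = read_distances()             # measured during movement of the hand
        relationship = estimate_relationship(distances)
        error = target - relationship
        if abs(error) < tolerance:
            break                                # the target object reached the target position
        move_hand(gain * error)                  # proportional positioning step
```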
- FIG. 1 is a diagram illustrating an example of external appearance of a robot according to an embodiment of the present technology.
- FIG. 2 is an enlarged view of a hand portion.
- FIG. 3 is a view illustrating a state of measurement by distance sensors.
- FIG. 4 is a view illustrating a state of measurement by the distance sensors.
- FIG. 5 is a block diagram illustrating a hardware configuration example of the robot.
- FIG. 6 is a block diagram illustrating a functional configuration example of an information processing device.
- FIG. 7 is a flowchart for describing processing of the information processing device.
- FIG. 8 is a diagram illustrating a state at the time of execution of a task of holding a thin object.
- FIG. 9 is a diagram subsequent to FIG. 8, illustrating a state at the time of execution of the task.
- FIG. 10 is a diagram illustrating a state at the time of failure of the task.
- FIG. 11 is a diagram illustrating a state at the time of failure of the task.
- FIG. 12 is a diagram illustrating an example of control by a positioning control unit.
- FIG. 13 is a diagram illustrating an example of a contactable area and a command value of a moving speed of the hand portion.
- FIG. 14 is a diagram illustrating a state at the time of success of a task of wiping a window using a cleaner.
- FIG. 15 is a diagram illustrating a state at the time of success of the task of wiping a window using the cleaner.
- FIG. 16 is a diagram illustrating an example of an interval and a command value of the moving speed of the hand portion.
- FIG. 17 is a diagram illustrating a state at the time of failure of the task of wiping a window.
- FIG. 18 is a diagram illustrating a state at the time of success of a task of cutting an object by operating a kitchen knife.
- FIG. 19 is a diagram illustrating a state at the time of failure of the task of cutting the object by operating the kitchen knife.
- FIG. 20 is a diagram illustrating a state at the time of failure of a task of placing a book between other books.
- FIG. 21 is a diagram illustrating a state of a task of positioning an operation object at a specific position of a desk.
- FIG. 22 is a diagram illustrating a configuration example of a system.
- FIG. 23 is a block diagram illustrating a hardware configuration example of a computer.
- MODE FOR CARRYING OUT THE INVENTION
- Hereinafter, an embodiment for carrying out the present technology will be described. The description will be given in the following order.
- 1. External configuration of robot
- 2. Configuration of robot
- 3. Operation of information processing device
- 4. Examples of other tasks
- 5. Modifications
- <1. External Configuration of Robot>
-
FIG. 1 is a diagram illustrating an example of external appearance of arobot 1 according to an embodiment of the present technology. - As illustrated in
FIG. 1 , therobot 1 is a robot having a humanoid upper body and a moving mechanism using wheels. A flat sphere-shaped head portion 12 is provided above abody portion 11. On the front face of thehead portion 12, twovisual sensors 12A are provided to imitate human eyes. - At the upper end of the
body portion 11, arm portions 13-1 and 13-2 each including a manipulator with multiple degrees of freedom are provided. Hand portions 14-1 and 14-2 that are end effectors are provided at distal ends of the arm portions 13-1 and 13-2, respectively. Therobot 1 has a function of holding an object with the hand portions 14-1 and 14-2. - Hereinafter, in a case where it is not necessary to distinguish the arm portions 13-1 and 13-2, they are collectively referred to as
arm portions 13 as appropriate. Furthermore, in a case where it is not necessary to distinguish the hand portions 14-1 and 14-2, they are collectively referred to ashand portions 14. A plurality of other components may be described collectively as appropriate. - A carriage-type moving
body portion 15 is provided at a lower end of thebody portion 11. Therobot 1 can move by rotating the wheels provided on the left and right of the movingbody portion 15 or changing the direction of the wheels. - As described above, the
robot 1 is a robot capable of executing various tasks such as holding an object by thehand portions 14 and carrying the object in a state of being held. In the example ofFIG. 1 , a card C1 is placed on a top plate of a desk D1 in front of therobot 1. As described later, therobot 1 executes a series of tasks of picking up the card C1 while monitoring the execution status of the tasks by distance sensors provided on thehand portions 14. - Note that the
robot 1 may be configured not as a dual arm robot as illustrated inFIG. 1 but as a single arm robot (the number of thearm portion 13 is one). Furthermore, thebody portion 11 may be provided on leg portions instead of the carriage (moving body portion 15). -
FIG. 2 is an enlarged view of the hand portion 14-1. - As illustrated in
FIG. 2 , the hand portion 14-1 is a gripper type holding portion with two fingers. Aleft finger 22L and aright finger 22R that are twofinger portions 22 are attached to abase portion 21 having a cubic shape. Thebase portion 21 functions as a support portion that supports the plurality offinger portions 22. - The
left finger 22L is configured by connecting a plate-shapedportion 31L and a plate-shapedportion 32L that are plate-shaped members having a predetermined thickness. The plate-shapedportion 32L is provided on the distal end side of the plate-shapedportion 31L attached to thebase portion 21. A coupling portion between thebase portion 21 and the plate-shapedportion 31L and a coupling portion between the plate-shapedportion 31L and the plate-shapedportion 32L each have a predetermined movable range. A thin plate-shapedfinger contact portion 33L is provided on an inner side of the plate-shapedportion 32L. - The
right finger 22R has a configuration similar to that of theleft finger 22L. That is, a plate-shapedportion 32R is provided on the distal end side of a plate-shapedportion 31R attached to thebase portion 21. A coupling portion between thebase portion 21 and the plate-shapedportion 31R and a coupling portion between the plate-shapedportion 31R and the plate-shapedportion 32R each have a predetermined movable range. A thin plate-shaped finger contact portion 33R (not illustrated) is provided on an inner side of the plate-shapedportion 32R. - The
left finger 22L and theright finger 22R are opened and closed by moving the respective coupling portions. Various objects such as the card C1 are held so as to be sandwiched between the inner side of the plate-shapedportion 32L and the inner side of the plate-shapedportion 32R. The inner surface of the plate-shapedportion 32L provided with thefinger contact portion 33L and the inner surface of the plate-shapedportion 32R provided with the finger contact portion 33R serve as contact surfaces with an object when the object is held. - As illustrated in a color in
FIG. 2 , a plurality of distance sensors capable of short distance measurement is provided on the surface of each member included in the hand portion 14-1. The distance sensor is, for example, an optical sensor. - For example, on the upper surface of the
base portion 21 corresponding to the palm, nine distance sensors 41-0 are provided side by side vertically and horizontally. The distance sensors 41-0 are provided at predetermined intervals. - Furthermore,
distance sensors 41L-1, 41L-2, and 41L-3 that are pairs of two distance sensors, are provided on the inner side of the plate-shapedportion 32L, which is a contact surface with an object, in this order from the fingertip side. The two distance sensors constituting each of thedistance sensors 41L-1, 41L-2, and 41L-3 are provided across thefinger contact portion 33L and thedistance sensors 41L-1, 41L-2, and 41L-3 are provided along the edges of the plate-shapedportion 32L. -
Distance sensors 41L-4 are provided on the side surfaces of the plate-shapedportion 32L, and adistance sensor 41L-5 is provided on a semi-cylindrical surface serving as a fingertip. On the outer side of the plate-shapedportion 32L, adistance sensor 41L-6 (not illustrated) is provided similarly to theright finger 22R side. - On the inner side of the plate-shaped
portion 31L,distance sensors 41L-7 and 41L-8 are provided side by side. On the outer side of the plate-shapedportion 31L, adistance sensor 41L-9 (not illustrated) is provided similarly to theright finger 22R side. - Also on the
right finger 22R,distance sensors 41R-1 to 41R-9 are provided similarly to theleft finger 22L. That is, thedistance sensors 41R-1, 41R-2, and 41R-3 (not illustrated) are provided on the inner side of the plate-shapedportion 32R in this order from the fingertip side, and thedistance sensors 41R-4 are provided on the side surfaces of the plate-shapedportion 32R. - The
distance sensor 41R-5 is provided on the surface of the fingertip of the plate-shapedportion 32R, and thedistance sensor 41R-6 is provided on the outer side of the plate-shapedportion 32R. Thedistance sensors 41R-7 and 41R-8 (not illustrated) are provided on the inner side of the plate-shapedportion 31R, and thedistance sensor 41R-9 is provided on the outer side of the plate-shapedportion 31R. - Hereinafter, in a case where it is not necessary to distinguish the
distance sensors 41L-1 to 41L-9 and thedistance sensors 41R-1 to 41R-9, they are collectively referred to asdistance sensors 41 as appropriate. - When a task is executed, as illustrated in
FIGS. 3 and 4 , the distance to each position of the object is measured by detecting the reflected light of the light beam emitted from each of thedistance sensors 41. InFIGS. 3 and 4 , the light emitted by thedistance sensors 41 are illustrated in a color. - For example, the distance sensors 41-0, the
distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, and thedistance sensors 41R-1 to 41R-3, 41R-7, and 41R-8 are used to measure the distance to each position of the object held by thehand portions 14, and the like. - As described above, the
distance sensor 41 is provided on each part of the hand portion 14-1, so that the distribution of the distances to the card C1 as the operation object (the distance to each position of the card C1) is measured in real time. Furthermore, the distribution of the distances to the desk D1 included in the environment surrounding the card C1 is measured in real time. The execution status of the task is monitored on the basis of the distance to each position of the card C1 and the desk D1. - The same components as the components of the hand portion 14-1 as described above are also provided in the hand portion 14-2.
- Although the
hand portions 14 are two-finger type holding portions, a multi-finger type holding portion having different numbers of finger portions, such as a three-finger type holding portion and a five-finger type holding portion, may be provided. The degree of freedom of the finger portion, the number of thedistance sensors 41, and the arrangement of thedistance sensors 41 can be set in any ways. - <2. Configuration of Robot>
- Hardware Configuration
-
FIG. 5 is a block diagram illustrating a hardware configuration example of therobot 1. - As illustrated in
FIG. 5 , therobot 1 is configured by connecting each component provided in thebody portion 11, thehead portion 12, thearm portion 13, thehand portions 14, and the movingbody portion 15 to aninformation processing device 51. - The
information processing device 51 includes a computer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a flash memory, and the like. Theinformation processing device 51 is housed in, for example, thebody portion 11. Theinformation processing device 51 executes a predetermined program by the CPU to control the overall operation of therobot 1. - The
information processing device 51 recognizes the environment around therobot 1 on the basis of the detection result by the sensors, the images captured by the visual sensors, and the like, and executes a task according to the recognition result. Various sensors and cameras are provided in each of thebody portion 11, thehead portion 12, thearm portions 13, thehand portions 14, and the movingbody portion 15. For example, thehead portion 12 is provided with thevisual sensors 12A including RGB cameras or the like. Thehand portions 14 are provided with thedistance sensors 41. - Functional Configuration
-
FIG. 6 is a block diagram illustrating a functional configuration example of theinformation processing device 51. - As illustrated in
FIG. 6 , theinformation processing device 51 includes anenvironment measurement unit 101, atask determination unit 102, a hand and finger initialposition determination unit 103, aninitial position database 104, an initial positionmovement control unit 105, a targetvalue calculation unit 106, a geometricinformation estimation unit 107, apositioning control unit 108, a task success/failurecondition calculation unit 109, and a task success/failure determination unit 110. At least a part of the functional units illustrated inFIG. 6 is implemented by executing a predetermined program by the CPU of theinformation processing device 51. - The
environment measurement unit 101 performs three-dimensional measurement on an operation object and objects included in the environment surrounding the operation object on the basis of the output of thevisual sensors 12A. By performing the three-dimensional measurement, the position and shape of the operation object, the shape of the object included in the environment surrounding the operation object, and the like are calculated. The measurement result by theenvironment measurement unit 101 is output to the hand and finger initialposition determination unit 103. - The
task determination unit 102 determines a task to be executed and outputs information indicating the content of the task. The information indicating the content of the task includes information indicating what kind of information is used as the geometric information. As described later, the geometric information is information used for monitoring the execution status of the task as well as control of each unit. The information output from thetask determination unit 102 is supplied to the hand and finger initialposition determination unit 103, the targetvalue calculation unit 106, and the geometricinformation estimation unit 107. - The hand and finger initial
position determination unit 103 determines use sensors that are thedistance sensor 41 used for monitoring the execution status of the task according to the task determined by thetask determination unit 102. Among the plurality ofdistance sensors 41 provided on thehand portions 14, thedistance sensors 41 suitable for monitoring the execution status of the task are determined as use sensors. - Furthermore, the hand and finger initial
position determination unit 103 calculates the initial positions of thedistance sensors 41 determined as the use sensors on the basis of the measurement result by theenvironment measurement unit 101 and the content of the task determined by thetask determination unit 102. - Candidates for the initial positions of the
distance sensors 41 are set in advance for every type of content of the task. The information indicating the candidates of the initial position is stored in theinitial position database 104 and read by the hand and finger initialposition determination unit 103 as appropriate. Information indicating the initial positions of thedistance sensors 41 calculated by the hand and finger initialposition determination unit 103 are output to the initial positionmovement control unit 105. - The initial position
movement control unit 105 controls adrive unit 121 so that thedistance sensors 41 are positioned at the initial positions calculated by the hand and finger initialposition determination unit 103. Thedrive unit 121 corresponds to drive portions of therobot 1 including thearm portions 13, thehand portions 14, and the movingbody portion 15. - The target
value calculation unit 106 calculates a target value of the geometric information on the basis of the information supplied from thetask determination unit 102, and outputs the target value to thepositioning control unit 108. - The geometric
information estimation unit 107 acquires distance distribution information measured by thedistance sensors 41. The distance distribution information indicates a distance to each position of an operation object and an object included in the environment. The geometricinformation estimation unit 107 estimates the geometric information determined by thetask determination unit 102 on the basis of the distance distribution information, and outputs the estimated geometric information to thetask determination unit 102, thepositioning control unit 108, and the task success/failure determination unit 110. - The
positioning control unit 108 performs positioning control by controlling thedrive unit 121 so that the geometric information estimated by the geometricinformation estimation unit 107 reaches the target value supplied from the targetvalue calculation unit 106. The positioning control is control for moving the operation object to a predetermined position. Furthermore, the positioning control is also control for moving thearm portions 13 and thehand portions 14 so that thedistance sensors 41 come to predetermined positions. - The task success/failure
condition calculation unit 109 determines a success condition and a failure condition of the task, and outputs information indicating the success condition and the failure condition to the task success/failure determination unit 110. - The task success/
failure determination unit 110 determines success or failure of the task being executed according to whether or not the geometric information estimated by the geometricinformation estimation unit 107 satisfies the condition determined by the task success/failurecondition calculation unit 109. In a case where it is determined that the task has succeeded or the task has failed, the task success/failure determination unit 110 outputs a stop command to thepositioning control unit 108. The result of success/failure determination of the task is also supplied to other processing units such as thetask determination unit 102 as appropriate. - <3. Operation of Information Processing Device>
- Example of Task of Holding Thin Object
- The processing of the
information processing device 51 will be described with reference to the flowchart ofFIG. 7 . - Here, processing in a case where a task of holding the card C1 placed on the top plate of the desk D1 is executed will be described appropriately referring to states at the time of execution of the task illustrated in
FIGS. 8 and 9 . - When a thin object such as the card C1 is held, a human often picks up the object by using a nail, or picks up the object after translating the object to the front side of the desk D1. The latter operation, that is, translating the card C1 to the front side of the desk D1 and then picking up the card C1 is implemented by the
information processing device 51. - In step S1, the
task determination unit 102 determines a task to be executed. - The task of holding the card C1 includes a task of translating the card C1 to the front side to allow the card C1 to be sandwiched between the
left finger 22L and theright finger 22R, and a task of bringing theleft finger 22L into contact with the lower surface of the card C1 (task of sandwiching the card C1 between theleft finger 22L and theright finger 22R). Thetask determination unit 102 determines to first execute a task of translating the card C1 to the front side. - In step S2, the
environment measurement unit 101 performs three-dimensional measurement of the environment on the basis of the output of thevisual sensors 12A. By performing three-dimensional measurement, the position and shape of the card C1, the shape of the desk D1, and the like are recognized. - In step S3, the hand and finger initial
position determination unit 103 calculates the initial positions of thehand portions 14 and thefinger portions 22 on the basis of the content of the task and the measurement result by theenvironment measurement unit 101. Together with the positions, the respective orientations of thehand portions 14 and thefinger portions 22 are also calculated. - In step S4, the hand and finger initial
position determination unit 103 determines a use sensor. - Here, the initial position of each of the
hand portions 14 and thefinger portions 22 is calculated in step S3 and the use sensor is determined in step S4 so as to be a position suitable for monitoring the execution status of the task. -
FIG. 8 is a diagram illustrating a state at the time of execution of a task of holding a thin object. - For example, as illustrated in A of
FIG. 8 , initial positions that make theleft finger 22L positioned under the desk D1 and theright finger 22R brought into contact with the upper surface of the card C1 are calculated. - In the example in A of
FIG. 8 , thedistance sensors 41L-1 to 41L-3 and thedistance sensors 41L-7 and 41L-8 (FIG. 2 ) provided on the inner side of theleft finger 22L are determined as use sensors. The initial positions illustrated in A ofFIG. 8 are positions where thedistance sensors 41 provided on the inner side of theleft finger 22L are provided side by side in parallel to the moving direction of the card C1. In A ofFIG. 8 , the horizontal right direction indicated by the white arrow is the moving direction of thehand portion 14, that is, the moving direction of the card C1. - Such an initial position of each portion is selected according to the task determined by the
task determination unit 102. Various methods can be used as a method of determining the initial positions in addition to programming in advance such that the determination according to the task is performed. - For example, initial positions or the like most suitable for detection of success or failure of a task may be determined using an inference model obtained by machine learning. In this case, an inference model is generated by performing machine learning using time-series sensor data pieces at the time of task success and at the time of task failure recorded when executing the task with setting the positions of the
hand portions 14 to various positions. - Returning to the description of
FIG. 7 , in step S5, thetask determination unit 102 determines geometric information to be used for control of thedrive unit 121. - The geometric information is information indicating at least a part of the state of the object including a size (area), an inclination, a distance, and the like and is obtained on the basis of the distance measured by the
distance sensors 41 during the movement of thehand portions 14. Since the geometric information changes according to the positional relationship between the operation object and objects surrounding it, the geometric information is used for monitoring the execution status of the task as information indicating the positional relationship between the operation object and the objects surrounding it during execution of the task (during movement of the hand portion 14) together with the control of thedrive unit 121. - In a case where the task of translating the card C1 is executed, the contactable area S, which is an area where the
left finger 22L can be brought into contact, is determined as the geometric information. The contactable area Sest is expressed by Equation (1) below. -
[Equation 1] -
S est =nA (1) - In Equation (1), n represents the number of the
distance sensors 41 that have measured sensor values within the effective distance range. A represents a footprint area of thedistance sensors 41. - The effective distance range is a range of a distance larger (longer) than the distance from the distance sensor 41 (the
distance sensor 41 on theleft finger 22L) to the desk D1 and smaller (shorter) than the distance from thedistance sensor 41 to theright finger 22R, which is thefinger portion 22 on the opposite side. - Since the
left finger 22L is positioned under the desk D1, the contactable area Sest is an area, in the entire card C1, of a region protruding from the edge portion of the desk D1. - In addition to the contactable area of the operation object, distances measured by the
distance sensors 41, an orientation (inclination) of the operation object with respect to the environment, and the like can be used as the geometric information. An example of using information other than the contactable area of the operation object as the geometric information will be described later with reference to an example of executing another task. - In step S6, the initial position
movement control unit 105 moves thehand portion 14 to the initial position calculated by the hand and finger initialposition determination unit 103. - In step S7, the initial position
movement control unit 105 brings thehand portion 14 into contact with the operation object. Specifically, as described with reference to A ofFIG. 8 , the initial positionmovement control unit 105 brings theright finger 22R roughly into contact with the upper surface of the card C1. - After the
left finger 22L and theright finger 22R are positioned at the initial positions illustrated in A ofFIG. 8 , distances are measured by thedistance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, which are use sensors. Light beams L1 to L5 indicated by broken lines in A ofFIG. 8 represent light beams emitted from thedistance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, respectively. - In the case of A of
FIG. 8 , the sensor values indicating the distances to the desk D1 are measured by thedistance sensors 41L-1 to 41L-3 and 41L-7, and the sensor value indicating the distance to theright finger 22R is measured by thedistance sensor 41L-8. Since there is no distance sensor that measures the distance within the effective distance range, the contactable area is 0 in this case. - In step S8, the target
value calculation unit 106 calculates a target value of the geometric information. The control of thehand portion 14 and the like is performed so as to make the geometric information close to the target value. - In step S9, the task success/failure
condition calculation unit 109 determines a success condition and a failure condition of the task. - For example, the task success/failure
condition calculation unit 109 sets, as a threshold value, an area that allows theleft finger 22L to be in contact with the lower surface of the card C1, and determines, as a success condition, that the contactable area Sest is larger than the threshold value. The target value calculated by the targetvalue calculation unit 106 in step S8 is an area used as a threshold of the success condition. - Furthermore, the task success/failure
condition calculation unit 109 determines, as a failure condition, that the contactable area Sest is smaller than the target value and the sensor values of all thedistance sensors 41 measuring sensor values that are out of the effective distance range are within an abnormal range. Of the distances out of the effective distance range, the distance from thedistance sensor 41 provided on theleft finger 22L to theright finger 22R is set as the abnormal range. - Note that a plurality of conditions may be determined as failure conditions. For example, together with the failure condition described above, the minimum value of the sensor values measured by the
distance sensors 41 provided on theleft finger 22L (the shortest distance to the desk D1) becomes smaller than that at the start of the task is determined as the failure condition. - In this case, it is meant that the
hand portion 14 cannot slide parallel to the top plate of the desk D1, and theright finger 22R is away from the card C1. - In step S10, the geometric
information estimation unit 107 calculates the contactable area Sest on the basis of the sensor values measured by thedistance sensors 41. - In step S11, the
positioning control unit 108 performs positioning control on the basis of the contactable area Sest estimated by the geometricinformation estimation unit 107. - Specifically, the
positioning control unit 108 slides the hand portion 14-1 in the state illustrated in A ofFIG. 8 in parallel to the desk D1. By sliding the hand portion 14-1 to translate the card C1, a part of the card C1 protrudes from the edge portion of the desk D1 as illustrated in B ofFIG. 8 . - In this case, as indicated by the light beam L3 indicated by an alternate long and short dash line, the
distance sensors 41L-3 measure the sensor values within the effective distance range from thedistance sensors 41 to the card C1. In this state, a predetermined area is obtained as the contactable area Sest. - In the example of B of
FIG. 8 , thedistance sensors 41L-1 and 41L-2 measure the distances from thedistance sensors 41 to the desk D1, that is, the sensor values out of the effective distance range. Furthermore, thedistance sensors 41L-7 and 41L-8 measure the distances to theright finger 22R, that is, the sensor values out of the effective distance sensor. - As described later, the control by the
positioning control unit 108 is performed such that the hand portion 14-1 (drive unit 121) is moved while being decelerated as the contactable area Sest, which is geometric information, increases. - In step S12, the task success/
failure determination unit 110 determines whether or not an abnormality has occurred. - In a case where it is determined in step S12 that no abnormality has occurred, in step S13, the task success/
failure determination unit 110 determines whether or not the task has succeeded. Here, in a case where the contactable area Sest satisfies the success condition, it is determined that the task has succeeded. - In a case where it is determined in step S13 that the task has not succeeded, the processing returns to step S10, and the subsequent processing is repeated. The
positioning control unit 108 continues to slide the hand portion 14-1 until the contactable area reaches the target value. - In a case where it is determined in step S13 that the task has succeeded since the contactable area Sest has reached the target value, the
positioning control unit 108 stops the sliding of the hand portion 14-1 according to a stop command from the task success/failure determination unit 110 in step S14. -
FIG. 9 is a diagram subsequent toFIG. 8 , illustrating a state at the time of execution of the task. - As indicated by the light beams L2 and L3 indicated by alternate long and short dash lines in A of
FIG. 9 , in a case where the sensor values measured by thedistance sensors 41L-2 and 41L-3 are within the effective distance range, the contactable area Sest reaches the target value, and it is determined that the task of translating the card C1 to the front side has succeeded. At this time, a state where a task of bringing theleft finger 22L into contact with the lower surface of the card C1 is executed next is made. - In step S15, the
task determination unit 102 determines whether or not all the tasks have been completed. - In a case where it is determined in step S15 that all the tasks are not completed because, for example, there is a task of bringing the
left finger 22L into contact with the lower surface of the card C1, the targetvalue calculation unit 106 calculates, in step S16, a command value for the next task on the basis of the current sensor values. The calculation of the command value for the next task is performed after the task of bringing theleft finger 22L into contact with the lower surface of the card C1 is determined as a task to be executed next by thetask determination unit 102. - For example, as illustrated in a word balloon in A of
FIG. 9 , the targetvalue calculation unit 106 calculates a target position of theleft finger 22L as a command value on the basis of the sensor values acquired by thedistance sensors 41L-1 to 41L-3, 41L-7, and 41L-8. - After the command value for the next task is calculated, the processing returns to step S3, and processing similar to the processing described above is performed.
- That is, the processing of moving the
left finger 22L while monitoring the execution status of the task using the geometric information is performed, so that theleft finger 22L comes into contact with the lower surface of the region of the card C1 protruding from the edge portion of the desk D1 and the card C1 is held as illustrated in B ofFIG. 9 . In the task of bringing theleft finger 22L into contact with the card C1, for example, thedistance sensor 41L-5 provided at the fingertip of theleft finger 22L is used as a use sensor. Furthermore, the distance from thedistance sensor 41L-5, which is a use sensor, to the card C1 is used as the geometric information. - On the other hand, in a case where the failure condition is satisfied in step S12, the task success/
failure determination unit 110 determines that an abnormality has occurred. -
FIGS. 10 and 11 are diagrams illustrating states at the time of failure of a task. - In a case where the force of pressing the card C1 with the
right finger 22R is weak, even if the hand portion 14-1 is slid to move the card C1, slipping occurs between theright finger 22R and the card C1 as illustrated in a word balloon in A ofFIG. 10 . - In a case where the hand portion 14-1 is slid while the slipping occurs, the card C1 is slightly translated but does not protrude from the edge portion of the desk D1 as illustrated in B of
FIG. 10 . - In this case, as indicated by the light beams L1 and L2 indicated by broken lines, sensor values out of the effective distance range are measured by the
distance sensors 41L-1 and 41L-2, respectively. Furthermore, as indicated by the light beams L3 to L5 indicated by broken lines, the sensor values within the abnormal range are measured by thedistance sensors 41L-3, 41L-7, and 41L-8, respectively. - Since the
distance sensors 41L-1 or 41L-2 does not measure a sensor value within the abnormal range, the sliding of the hand portion 14-1 is continued. As the sliding is continued, the card C1 is slightly translated further as illustrated in A ofFIG. 11 . In this case, the contactable area Sest of the card C1 is smaller than that in the case of A ofFIG. 9 in which the hand portion 14-1 can be slid without slipping. - In A of
FIG. 11 , as indicated by the light beam L1 indicated by an alternate long and short dash line, sensor values within the effective distance range are measured only by thedistance sensors 41L-1. Since the contactable area Sest is smaller than the target value and the sensor values of all thedistance sensors 41 measuring the sensor values out of the effective distance range are within the abnormal range, it is determined that the failure condition is satisfied, that is, the task of translating the card C1 to the front side has failed. - In a case where it is determined in step S12 that an abnormality has occurred due to satisfaction of the above-described failure condition, the
task determination unit 102 sets, in step S17, a return operation task on the basis of the current sensor value. Thereafter, the processing returns to step S3, and the return operation task is executed by processing similar to the processing described above. - For example, the
task determination unit 102 determines a return operation task of performing the task of translating the card C1 again as a task to be executed next. The targetvalue calculation unit 106 calculates a target value of the movement amount of the return operation task by Equation (2) on the basis of the contactable area Sest. -
- In Equation (2), nref represents the number of the
distance sensors 41 that measure sensor values within the effective distance range necessary for the contactable area to reach the target value. n represents the current number of thedistance sensors 41 that measure sensor values within the effective distance range. Sref represents a target value of the contactable area. - Equation (2) indicates that the contactable area Sest becomes larger than the target value Sref if the region of the card C1 corresponding to the footprint area of the nref distance sensors 41 protrudes from the edge portion of the desk D1.
- The
positioning control unit 108 moves the hand portion 14-1 to a position where the number of the distance sensors 41 that measure distances to the desk D1 is nref. For example, as illustrated in B of FIG. 11 , the positioning control unit 108 moves the hand portion 14-1 forward (in the direction of the fingertip of the hand portion 14-1) indicated by the white arrow by a distance corresponding to two of the distance sensors 41 (the distance sensors 41L-1 and 41L-2). - As described above, the operation of moving the hand portion 14-1 by a distance corresponding to nref of the
distance sensors 41 is performed as the return operation task. After the return operation task is performed, for example, the task of increasing the force of pressing the card C1 with the right finger 22R to translate the card C1 is performed again. - In a case where the task of translating the card C1 to the front side fails, there is a possibility that the card C1 has moved from the position at the start of the task. In this case, even if the hand portion 14-1 is moved to the initial position of the task, the hand portion 14-1 cannot be brought into contact with the card C1 in some cases.
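- The failure condition and the return operation described above can be summarized as two simple tests: the task has failed when the contactable area Sest is below its target Sref while every sensor outside the effective distance range reads a value in the abnormal range, and the hand is then advanced until nref sensors see the desk. The sketch below follows that description; because Equation (2) itself is not reproduced here, converting the missing sensor count (nref minus the current count) into a movement amount via an assumed sensor pitch is only one possible reading.

```python
from typing import List


def task_failed(s_est: float, s_ref: float, classes: List[str]) -> bool:
    """Failure condition of the card-translation task: the contactable area is
    below the target and all sensors not in the effective range are in the
    abnormal range."""
    non_effective = [c for c in classes if c != "EFFECTIVE"]
    all_abnormal = len(non_effective) > 0 and all(c == "ABNORMAL" for c in non_effective)
    return s_est < s_ref and all_abnormal


def return_movement(n_ref: int, n_current: int, sensor_pitch: float) -> float:
    """Movement amount of the return operation task: advance the hand until the
    number of sensors measuring the desk reaches n_ref. Using the sensor pitch
    to convert the missing sensor count into a distance is an assumption."""
    missing = max(0, n_ref - n_current)
    return missing * sensor_pitch


if __name__ == "__main__":
    # Example corresponding to A of FIG. 11: one effective sensor, target of three.
    classes = ["EFFECTIVE", "ABNORMAL", "ABNORMAL", "ABNORMAL", "ABNORMAL"]
    print(task_failed(s_est=4e-4, s_ref=12e-4, classes=classes))      # True
    print(return_movement(n_ref=3, n_current=1, sensor_pitch=0.01))   # 0.02 m
```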
- In such a case, in the conventional methods, the hand portion is first moved to a position away from the desk, the environment is then measured again using a visual sensor, and the hand portion is moved again according to the recognized position of the card.
- With the above-described processing, it is possible to resume the task by moving the hand portion 14-1 by the minimum necessary movement amount without once moving the hand portion 14-1 away from the desk D1. That is, the
information processing device 51 can quickly perform the task again. - Returning to the description of
FIG. 7 , in a case where it is determined in step S15 that all the tasks have been completed, in step S18, the positioning control unit 108 releases the contact with the operation object and ends the processing. - Example of Control by
Positioning Control Unit 108 -
FIG. 12 is a diagram illustrating an example of control by the positioning control unit 108. - As described above, the control of the
hand portion 14 by the positioning control unit 108 is performed such that the hand portion 14 is moved while being decelerated as the contactable area Sest, which is the geometric information, increases. - The control by the
positioning control unit 108 is implemented by a subtractor 131, a converter 132, a subtractor 133, and a controller 134. The subtractor 131, the converter 132, the subtractor 133, and the controller 134, which are surrounded by a broken line in FIG. 12 , are provided in the positioning control unit 108. - The
subtractor 131 calculates a difference between the target value Sref of the contactable area and the contactable area Sest estimated by the geometric information estimation unit 107, and outputs the difference to the converter 132. - The
converter 132 applies the conversion coefficient K to the difference supplied from the subtractor 131 to calculate the target value vref of the moving speed. For example, as the difference between the target value Sref and the contactable area Sest is smaller, a smaller value is calculated as the target value vref of the moving speed. The converter 132 outputs the target value vref of the moving speed to the subtractor 133. - The
subtractor 133 calculates a difference between the target value vref supplied from the converter 132 and the actual moving speed v of the drive unit 121, and outputs the difference to the controller 134. - The
controller 134 controls the drive unit 121 so that the difference between the moving speeds supplied from the subtractor 133 becomes 0. - The actual moving speed v of the
drive unit 121 is measured by a sensor provided in each unit and supplied to the subtractor 133. Furthermore, the three-dimensional coordinates p of each measurement point of the distance sensor 41 are supplied to the geometric information estimation unit 107. For example, in a case where N distance sensors 41 are provided, the three-dimensional coordinates p are represented by a 3×N matrix. - The geometric
information estimation unit 107 estimates the contactable area Sest on the basis of the three-dimensional coordinates p, and outputs the contactable area Sest to the subtractor 131.
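- The structure in FIG. 12 is a cascade: the difference Sref − Sest is scaled by the conversion coefficient K into a speed target vref, and an inner loop drives the actuator so that the measured speed v follows vref. A minimal Python sketch of one control cycle is shown below; the inner-loop gain and the time step are illustrative assumptions, not values from the embodiment.

```python
class AreaToSpeedCascade:
    """Sketch of the subtractor 131 / converter 132 / subtractor 133 / controller 134
    chain: the smaller the remaining area error, the smaller the commanded speed."""

    def __init__(self, s_ref: float, k_conv: float, k_speed: float, dt: float):
        self.s_ref = s_ref      # target contactable area Sref
        self.k_conv = k_conv    # conversion coefficient K (area error -> speed target)
        self.k_speed = k_speed  # assumed proportional gain of the inner speed loop
        self.dt = dt            # assumed control period [s]
        self.v = 0.0            # current moving speed v of the drive unit

    def step(self, s_est: float) -> float:
        area_error = self.s_ref - s_est                 # subtractor 131
        v_ref = self.k_conv * max(0.0, area_error)      # converter 132
        speed_error = v_ref - self.v                    # subtractor 133
        self.v += self.k_speed * speed_error * self.dt  # controller 134 driving unit 121
        return self.v


if __name__ == "__main__":
    loop = AreaToSpeedCascade(s_ref=12e-4, k_conv=50.0, k_speed=20.0, dt=0.01)
    for s_est in (0.0, 4e-4, 8e-4, 12e-4):   # contactable area growing over time
        print(f"Sest={s_est:.4f} m^2 -> v={loop.step(s_est):.4f} m/s")
```

As the contactable area approaches its target, the commanded speed falls toward zero, which reproduces the deceleration behavior described for the hand portion 14.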
FIG. 13 is a diagram illustrating an example of the contactable area and the command value of the moving speed of thehand portion 14. - The upper part of
FIG. 13 indicates the contactable area Sest, and the lower part indicates the command value of the moving speed with respect to the hand portion 14. The horizontal axis in FIG. 13 represents time. - As described with reference to A of
FIG. 8 , the period from time t0 to time t1 is a period in which a sensor value within the effective distance range is not measured by any of the distance sensors 41. In this period, the contactable area Sest is 0. Furthermore, the command value of the moving speed is a predetermined speed. - As described with reference to B of
FIG. 8 , the period from time t1 to time t2 is a period in which a sensor value within the effective distance range is measured by the distance sensor 41L-3. In this period, the contactable area Sest increases more than before time t1. Furthermore, the command value of the moving speed is a value lower than that before time t1. - As described with reference to A of
FIG. 9 , the period after time t2 is a period in which sensor values within the effective distance range are measured by the distance sensors 41L-2 and 41L-3. In this period, the contactable area Sest increases more than before time t2. Furthermore, the command value of the moving speed is a value lower than that before time t2. - As described above, the control of the
hand portion 14 by the positioning control unit 108 is performed such that the moving speed of the hand portion 14 is adjusted according to the change in the contactable area Sest as the geometric information. - Effects
- As described above, in the
robot 1, the distance sensors 41 are positioned at positions where it is easy to measure the displacement between the operation object and the environment according to the content of the task to be executed. Therefore, it is possible to constantly monitor the progress status of a task in which a robot hand would shield the operation object from a camera at a fixed position to make measurement difficult. - The
robot 1 can improve the accuracy and success rate of a task of moving an operation object to a target position. - Furthermore, in the
robot 1, it is not necessary to measure the contact position between the card C1 and the hand portion 14-1, and it is possible to succeed in a task of translating the card C1 only by bringing the hand portion 14-1 into contact with an approximate target position. - Typically, in a case where a robot hand is controlled on the basis of information acquired by a camera at a fixed position, it is necessary to measure the positional relationship between a card and a desk at the start of the task and the positional relationship between the card and the robot hand.
- In the
robot 1, the visual sensors 12A are used only for measuring the positional relationship between the card C1 and the desk D1 at the start of the task. The control of the hand portion 14-1 and the success determination of the task are performed on the basis of the relative displacement between the card C1 and the desk D1 measured by the distance sensor 41 provided on the left finger 22L. Since the contact position between the card C1 and the hand portion 14-1 is not so important, the hand portion 14-1 does not need to contact the center of the card C1, and may contact the front side or the back side of the card C1. - Moreover, in the
robot 1, the geometric information is calculated on the basis of the distance distribution information measured by the distance sensors 41, and the control of the hand portion 14 and the success determination of the task are performed according to the change in the geometric information. In a case of executing a task for which success or failure is determined by the positional relationship between the operation object and the environment, the robot 1 can control the hand portion 14 and determine the success of the task with easy observation and a simple algorithm. - <4. Examples of Other Tasks>
- Examples in which the present technology is applied to other tasks will be described later. Note that the basic flow of the tasks is similar to the flow of the task of holding the card C1 described with reference to
FIGS. 8 to 11 . - Example of Wiping Task
-
FIGS. 14 and 15 are diagrams illustrating a state at the time of success of a task of wiping a window W1 using a cleaner C11. - In
FIGS. 14 and 15 , the cleaner C11 is pressed against the surface of the window W1 by the hand portion 14-1. A frame F1 thicker than the window W1 is provided at the end of the window W1 to surround the window W1. In this example, the operation object is the cleaner C11. Furthermore, objects included in the environment surrounding the operation object are the window W1 and the frame F1. -
FIG. 14 is a view of the hand portion 14-1 pressing the cleaner C11 against the surface of the window W1, which is a vertical surface, as viewed from below, and FIG. 15 is a view of the hand portion 14-1 as viewed from the front. The horizontal direction in FIG. 14 is the x-axis direction, and the vertical direction in FIG. 14 is the z-axis direction. The horizontal direction in FIG. 15 is the x-axis direction, and the vertical direction in FIG. 15 is the y-axis direction. - At the start of the task of wiping the window W1, as illustrated in A of
FIG. 14 , the left finger 22L and the right finger 22R are spread, and the cleaner C11 is pressed against the surface of the window W1 by the base portion 21 as the palm of the hand portion 14-1. In the example of FIG. 14 , the left finger 22L is positioned on the right side in the drawing, and the right finger 22R is positioned on the left side in the drawing. The initial position of the hand portion 14 illustrated in A of FIG. 14 is a position where the distance sensors 41-0 provided on the base portion 21 and the distance sensor 41L provided on the inner side of the left finger 22L are arranged in parallel to the window W1. - Here, the
robot 1 brings the hand portion 14-1 roughly into contact with the cleaner C11. Since the base portion 21, the left finger 22L, and the right finger 22R are provided with the distance sensors 41, the robot 1 can detect which part of the hand portion 14-1 the cleaner C11 is in contact with on the basis of the sensor values measured by the distance sensors 41 when the hand portion 14-1 is brought roughly into contact with the cleaner C11. - After the
left finger 22L and the right finger 22R are positioned to be spread, distances are measured by the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, which are use sensors. In the case of FIG. 14 , as indicated by the light beam L1 indicated by an alternate long and short dash line, a sensor value (a sensor value indicating the distance to the frame F1) within the effective distance range is measured by the distance sensor 41L-1. Here, the effective distance range is a range of a distance larger than 0 and smaller than the distance to the window W1, or a range of a distance larger than the distance to the window W1. - Furthermore, as indicated by the light beams L2 to L4 indicated by broken lines, sensor values out of the effective distance range are measured by the
distance sensors 41L-2, 41L-3, and 41L-7. - Since a portion of the
left finger 22L provided with the distance sensor 41L-8 is pressed against the cleaner C11 together with the base portion 21, a sensor value within a contact range is measured by the distance sensor 41L-8. The contact range indicates that the sensor value is 0. - In a case where the task of wiping is executed, the progress status of the task is monitored and the hand portion 14-1 is controlled using, as the geometric information, an interval δx that is the interval between the end of the cleaner C11 and the end of the window W1. The interval δx indicated by the bidirectional arrow in A of
FIG. 14 is expressed by Equation (3) below. -
[Equation 3] -
δx = min(xe) − max(xc) (3) - In Equation (3), xe represents a set of positions of the
distance sensors 41 that measure sensor values within the effective distance range. xc represents a set of positions of thedistance sensors 41 that acquire sensor values within the contact range. - In other words, min(xe) represents the position of the left end of the frame F1 (the end on the side of the surface in contact with the window W1), and max(xc) represents the right end position of the cleaner C11. The positioning control by the hand portion 14-1 is performed such that the interval δx decreases.
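- Equation (3) can be evaluated directly from the per-sensor classification: take the smallest position among the sensors reading within the effective distance range and subtract the largest position among the sensors reading within the contact range. The sketch below assumes the sensor positions are expressed as x coordinates along the finger and the base; the positions and readings used in the example are illustrative only.

```python
from typing import Dict


def interval_dx(positions: Dict[str, float], classes: Dict[str, str]) -> float:
    """delta_x = min(x_e) - max(x_c): the gap between the end of the window frame
    (seen by the effective-range sensors) and the end of the cleaner (seen by the
    contact-range sensors)."""
    x_e = [positions[s] for s, c in classes.items() if c == "EFFECTIVE"]
    x_c = [positions[s] for s, c in classes.items() if c == "CONTACT"]
    if not x_e or not x_c:
        raise ValueError("delta_x is undefined without both sensor groups")
    return min(x_e) - max(x_c)


if __name__ == "__main__":
    # Illustrative x positions [m] of sensors 41L-8 ... 41L-1 along the +x direction.
    positions = {"41L-8": 0.00, "41L-7": 0.02, "41L-3": 0.04, "41L-2": 0.06, "41L-1": 0.08}
    classes = {"41L-8": "CONTACT", "41L-7": "OUT", "41L-3": "OUT",
               "41L-2": "OUT", "41L-1": "EFFECTIVE"}   # roughly the state of A of FIG. 14
    print("delta_x =", interval_dx(positions, classes), "m")
```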
- Specifically, the
robot 1 moves the hand portion 14-1 in the state illustrated in A ofFIG. 14 in the +x direction (the direction toward the fingertip side of theleft finger 22L) indicated by the white arrow. By moving the hand portion 14-1, the right end of the cleaner C11 is brought close to the frame F1 as illustrated in B ofFIG. 14 . - In this case, as indicated by the light beams L1 to L4 indicated by alternate long and short dash lines, the
distance sensors 41L-1 to 41L-3 and 41L-7 measure sensor values within the effective distance range. In this state, it is assumed that min(xe) is a position corresponding to the position of thedistance sensor 41L-7 and max(xc) is a position corresponding to the position of thedistance sensor 41L-8, and an interval δx of a predetermined length is obtained. - In a case where the interval δx becomes sufficiently small, it is determined that the task has succeeded. At this time, as illustrated in
FIG. 15 , therobot 1 moves the hand portion 14-1 in the +y direction indicated by the white arrow. As the hand portion 14-1 moves, the cleaner C11 also moves in the same +y direction. - The sensor values measured by the
distance sensors 41 when it is determined that the task has succeeded are used for a task of moving the hand portion 14-1 in the +y direction as the next operation. In the task of moving the hand portion 14-1 in the +y direction, the movement of the hand portion 14-1 is controlled such that the interval δx, which is the distance between the cleaner C11 and the frame F1, is maintained at a constant distance. - Typically, wiping a window using the hand portion is often performed with the cleaner held by the left and right finger portions of the hand portion.
- In the
robot 1, theleft finger 22L and theright finger 22R are spread, the cleaner C11 is pressed against the surface of the window W1 by thebase portion 21 of the hand portion 14-1, and wiping work is performed while measuring the interval δx as the geometric information. Since wiping is performed while measuring the interval δx, it is possible to move the cleaner C11 to almost the end of the window W1. - Note that the control of the hand portion 14-1 is performed such that the hand portion 14-1 is moved while being decelerated as the interval δx, which is the geometric information, decreases.
-
FIG. 16 is a diagram illustrating an example of the interval δx and the command value of the moving speed of the hand portion 14-1. - The upper part of
FIG. 16 indicates the interval δx, and the lower part indicates the command value of the moving speed for the hand portion 14-1. The horizontal axis inFIG. 16 represents time. - As described with reference to A of
FIG. 14 , the period from time t0 to time t1 is a period in which sensor values within the effective distance range are measured by thedistance sensors 41L-1. In this period, the interval δx is a predetermined distance. Furthermore, the command value of the moving speed is a predetermined speed. - A period from time t1 to time t2 is a period in which the hand portion 14-1 moves in the +x direction. In this period, the interval δx gradually decreases from time t1. Furthermore, the command value of the moving speed gradually decreases from time t1.
- As described with reference to B of
FIG. 14 , the period after time t2 is a period in which sensor values within the effective distance range are measured by thedistance sensors 41L-1 to 41L-3 and 41L-7. In this period, the interval δx is a value lower than that before time t2. Furthermore, the command value of the moving speed is a value lower than that before time t2. -
FIG. 17 is a diagram illustrating a state at the time of failure of a task of wiping the window W1. - As illustrated in A of
FIG. 17 , in a case where the hand portion 14-1 is moved in a state where slipping occurs between the hand portion 14-1 and the cleaner C11, the movement amount of the cleaner C11 is smaller than the movement amount of the hand portion 14-1 as illustrated in B ofFIG. 17 . That is, the contact position between the cleaner C11 and the hand portion 14-1 is deviated. - In this case, as indicated by the light beam L5 indicated by an alternate long and short dash line, a state where the
distance sensor 41L-8, which has measured a sensor value within the contact range before the movement, measures a sensor value out of the effective distance range occurs. In a case where the contact position between the cleaner C11 and the hand portion 14-1, that is, the value of max(xc), changes, it is determined that slipping has occurred between the hand portion 14-1 and the cleaner C11 (the task has failed). - In a case where a sensor value of one of the
distance sensors 41 that has measured a sensor value within the contact range changes, as illustrated in C ofFIG. 17 , theinformation processing device 51 moves the hand portion 14-1 in the direction indicated by the white arrow (the direction of the fingertip of theright finger 22R) by a distance corresponding to one of the distance sensors 41 (distance sensor 41L-8) by the return operation task. Thereafter, therobot 1 brings the hand portion 14-1 into contact with the cleaner C11 again and performs the wiping task again. - Note that the control of the
hand portion 14 may be performed using not only the sensor values measured by the distance sensors 41 but also a sensor value measured by a vibration sensor or a tactile sensor provided on the hand portion 14. For example, slipping that has occurred between the hand portion 14 and the operation object is detected on the basis of a sensor value measured by a vibration sensor or a tactile sensor, and failure of the task is thereby determined. By using the measurement results by a plurality of types of sensors, the robot 1 can improve the accuracy of the failure determination of the task. - Example of Task of Cutting Object
-
FIG. 18 is a diagram illustrating a state at the time of success of a task of cutting an object Ob1 by operating a kitchen knife K1. - In the example of
FIG. 18 , the object Ob1 is placed on the desk D1. Therobot 1 holds the handle portion of the kitchen knife K1 with thefinger portions 22, and applies the blade portion of the kitchen knife K1 to the object Ob1 with the fingertip facing downward. For example, the object Ob1 is food such as a vegetable or a fruit. In this example, the operation object is the kitchen knife K1. Furthermore, objects included in the environment surrounding the operation object are the desk D1 (table) and the spherical object Ob1. - After the blade portion of the kitchen knife K1 is brought into contact with the object Ob1, the distance sensor 41-5 at the fingertip is determined as a use sensor, and the distance is measured by the distance sensor 41-5. In the example of
FIG. 18 , a sensor value indicating the distance to the desk D1 is measured by the distance sensor 41-5 as indicated by the light beam L11 indicated by a broken line. - In a case where the task of cutting the object Ob1 is executed, the progress status of the task is monitored and the hand portion 14-1 is controlled using, as the geometric information, the distance from the blade portion of the kitchen knife K1 to the desk D1 measured by the distance sensor 41-5. For example, in a case where the fingertip is at the same height as the blade portion of the kitchen knife K1, the distance measured by the distance sensor 41-5 is the same distance as the distance from the blade portion of the kitchen knife K1 to the desk D1 indicated by the bidirectional arrow in A of
FIG. 18 . - Furthermore, as indicated by the light beams L12 and L13 indicated by broken lines, a plurality of distances to the handle portion of the kitchen knife K1 are measured by the distance sensors 41-0 provided on the
base portion 21. As described with reference toFIG. 2 and the like, the plurality of distance sensors 41-0 is provided on the upper surface of thebase portion 21 corresponding to the palm. - In the task of cutting the object Ob1, the inclination (orientation) of the kitchen knife K1 obtained on the basis of the sensor values measured by the distance sensors 41-0 is used as the geometric information together with the distance to the desk D1. The inclination of the kitchen knife K1 is represented by, for example, a difference between the distance measured by the light beam L12 and the distance measured by the light beam L13.
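- Because the palm sensors 41-0 are laid out side by side, the difference between the two distances to the handle can be converted into an inclination angle once the spacing between the sensors is known. The conversion below is a sketch under that assumption; the spacing value and the angle tolerance are illustrative, not values from the embodiment.

```python
import math


def knife_inclination(d_l12: float, d_l13: float, sensor_spacing: float) -> float:
    """Approximate inclination [rad] of the handle from two palm-sensor distances
    (light beams L12 and L13). Zero means the handle is parallel to the palm."""
    return math.atan2(d_l12 - d_l13, sensor_spacing)


def needs_reorientation(d_l12: float, d_l13: float, sensor_spacing: float,
                        tolerance_rad: float = math.radians(3.0)) -> bool:
    """Trigger a return operation task (re-orient or re-hold the knife) when the
    measured inclination exceeds an assumed tolerance."""
    return abs(knife_inclination(d_l12, d_l13, sensor_spacing)) > tolerance_rad


if __name__ == "__main__":
    # Illustrative readings in which the handle has tilted, as in B of FIG. 19.
    print(math.degrees(knife_inclination(0.021, 0.014, sensor_spacing=0.03)))
    print(needs_reorientation(0.021, 0.014, sensor_spacing=0.03))   # True
```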
- The positioning control for lowering the kitchen knife K1 is performed such that the distance to the desk D1 decreases. Specifically, the positioning control is performed by moving the hand portion 14-1 in a downward direction indicated by the white arrow in A of
FIG. 18 in a state where the blade portion of the kitchen knife K1 is in contact with the object Ob1. By moving the hand portion 14-1, the blade portion of the kitchen knife K1 is pushed into the object Ob1 as illustrated in B of FIG. 18.
- In addition to the sensor value measured by the distance sensor 41-5, the sensor value measured by the force sensor provided at the wrist portion of the hand portion 14-1 may be used for the success determination of the task. The force sensor provided at the wrist portion of the hand portion 14-1 measures, for example, the reaction force when the blade portion of the kitchen knife K1 hits the desk D1. By using the measurement results by the plurality of types of sensors, the
robot 1 can improve the accuracy of the success determination of the task. - In a case where it is determined that the task of cutting the object Ob1 has succeeded, positioning control for separating the kitchen knife K1 from the object Ob1 is performed. Specifically, as illustrated in C of
FIG. 18 , the positioning control is performed by moving the hand portion 14-1 in an upward direction indicated by the white arrow until the sensor value measured by the distance sensor 41-5 becomes sufficiently large. - Note that the present technology can also be applied to a task of positioning an object using a tool, such as a task of placing an object held by a tong at a specific position, by regarding the end effector to include the tool.
-
FIG. 19 is a diagram illustrating a state at the time of failure of the task of cutting the object Ob1 by operating the kitchen knife K1. - As illustrated in A of
FIG. 19 , when the kitchen knife K1 is pushed into the object Ob1, a moment received by the kitchen knife K1 from the object Ob1 causes slipping between the kitchen knife K1 and the hand portion 14-1 as illustrated in B ofFIG. 19 , and the orientation of the kitchen knife K1 may change. - In this case, as indicated by the light beams L12 and L13 indicated by alternate long and short dash lines, different sensor values are measured by the plurality of distance sensors 41-0. The inclination of the kitchen knife K1 is calculated on the basis of the sensor values measured by the distance sensors 41-0.
- The
robot 1 adjusts the orientation of pushing the kitchen knife K1 into the object Ob1 on the basis of the inclination of the kitchen knife K1 as the geometric information. Specifically, as indicated by the white arrow in C ofFIG. 19 , the orientation is adjusted by changing the orientation of the hand portion 14-1 or re-holding the kitchen knife K1 so that the inclination of the kitchen knife K1 becomes 0. Such a task of adjusting the orientation is executed as a return operation task. - By adjusting the orientation of the kitchen knife K1, the
robot 1 can resume the task of cutting the object Ob1 without performing the task from the beginning again. - Example of Task of Placing Operation Object Between Objects
-
FIG. 20 is a diagram illustrating a state at the time of failure of a task of placing a book B1 between a book B11 and a book B12 other than the book B1. - In the example of
FIG. 20 , the book B11 and the book B12 are placed, for example, on a bookshelf with a predetermined gap.FIG. 20 is a top view of the positional relationship between the books. In the state of A ofFIG. 20 , the book B11 and the book B12 are provided in parallel to each other. Therobot 1 holds the book B1 with theright finger 22R and theleft finger 22L, and moves the book B1 so as to place the book B1 in the gap between the book B11 and the book B12. In this example, the operation object is the book B1. - Furthermore, objects included in the environment surrounding the operation object are the book B11 and the book B12.
- After the book B1 is held, the
distance sensor 41L-5 at the fingertip of theleft finger 22L and thedistance sensor 41R-5 at the fingertip of theright finger 22R are determined as use sensors, and thedistance sensor 41L-5 and thedistance sensor 41R-5 measure distances. In the example in B ofFIG. 20 , as indicated by the light beam L21 indicated by a broken line, the distance to the book B11 is measured by adistance sensor 41L-5 provided at the fingertip of theleft finger 22L. Furthermore, as indicated by the light beam L22 indicated by a broken line, the distance to the book B12 is measured by thedistance sensor 41R-5 provided at the fingertip of theright finger 22R. - In a case where the task of placing the book B1 between the book B11 and the book B12 is executed, an average value of sensor values measured by the
distance sensor 41L-5 and thedistance sensor 41R-5 is used as geometric information to monitor the progress status of the task and control the hand portion 14-1. - The positioning control for placing the book B1 is performed such that the book B1 is inserted into the gap between the book B11 and the book B12 until the average value of the sensor values measured by the
distance sensor 41L-5 and thedistance sensor 41R-5 becomes 0. - When the book B1 is inserted, the book B1 may come into contact with the book B11 in the surroundings as illustrated in B of
FIG. 20 , and the book B11 may move as illustrated in C ofFIG. 20 . - In this case, as indicated by the light beam L21 indicated by an alternate long and short dash line, the sensor value measured by the
distance sensor 41L-5 becomes a large value. Therefore, the average value of the sensor values measured by thedistance sensors 41L-4 and thedistance sensor 41R-5 also becomes a large value. - In a case where the average value of the sensor values measured by the
distance sensors 41L-4 and thedistance sensor 41R-5 becomes large, it is determined that the task has failed. In a case where it is determined that the task has failed, a recovery operation is performed. Specifically, as illustrated in D ofFIG. 20 , therobot 1 returns the book B1 to the initial position, and returns B11 that has moved to the original position. - After performing the recovery operation, the
robot 1 can perform a next task, such as the same task. - As described above, the
robot 1 can detect an abnormality in a case where an object in the surroundings of the operation object moves unexpectedly. - <5. Modifications>
- Regarding Sensor
- Although the geometric information is obtained on the basis of the sensor values measured by the
distance sensors 41, the geometric information may be obtained on the basis of a measurement result by a different sensor such as a time of flight (ToF) camera or a stereo camera. As described above, various sensors capable of acquiring distance distribution information (distance information of multiple points) can be used for measurement. - Furthermore, the geometric information may be obtained on the basis of a map surrounding the
hand portion 14 created by moving thehand portion 14 to perform scanning and merging time-series data pieces of measurement results by thedistance sensors 41. - Therefore, even in a case where the number of the
distance sensors 41 provided on thehand portion 14 is small, it is possible to acquire the distribution information of the distance necessary for obtaining the geometric information by moving the positions of thedistance sensors 41. - The
distance sensors 41 may be provided at portions other than thehand portion 14. For example, in a task of picking up the card C1, the other hand (the hand portion 14-2) is positioned under the desk D1, and positioning control is performed on the basis of sensor values measured by thedistance sensors 41 provided on the hand portion 14-2. - In this case, the
robot 1 can perform positioning control using the hand portion 14-2 at the same time as positioning the hand portion 14-1 at the initial position. Therefore, therobot 1 can shorten the time required for the task. - In a case where the target position to be the destination of movement of the operation object does not have any shape feature but a mark is provided instead, the operation object may be moved to the target position on the basis of information output from an RGB camera or a color sensor mounted on the
hand portion 14. -
FIG. 21 is a diagram illustrating a state of a task of positioning an operation object at a specific position of the desk D1. - In the example of
FIG. 21 , there is no shape feature or mark on the top plate of the desk D1. A position P1 on the top plate of the desk D1 indicated by the star is a target position where the operation object is to be positioned. The robot 1 holds the object Ob11 with the finger portions 22 and moves the object Ob11 so as to place the object Ob11 at the position P1. In this example, the operation object is the object Ob11. Furthermore, an object included in the environment surrounding the operation object is the desk D1. - In a case where the task of positioning the object Ob11 at the position P1 is executed, first, a target distance from the end of the desk D1 to the position P1 indicated by the bidirectional arrow in A of
FIG. 21 is calculated on the basis of the image information output by thevisual sensors 12A. - After the object Ob11 is held, the plurality of
distance sensors 41 provided on the arm portion 13-1 is determined as use sensors, and the distances are measured by thedistance sensors 41. In the example of A ofFIG. 21 , thedistance sensors 41 provided on the arm portion 13-1 are arranged in parallel to the top plate of the desk D1, and the distance to the desk D1 is measured by the light beam L31. - In the task of positioning the operation object at the position P1, the arm portion 13-1 is controlled using the distance from the end of the desk D1 to the object Ob11 as the geometric information. The distance from the end of the desk D1 to the object Ob11 is obtained on the basis of sensor values measured by the plurality of
distance sensors 41 provided on the arm portion 13-1. - The positioning for positioning the object Ob11 at the position P1 is performed so that the difference between the distance from the end of the desk D1 to the object Ob11 and the target distance becomes small. Specifically, by moving the arm portion 13-1 in the left direction indicated by the white arrow in A of
FIG. 21 , the distance from the end of the desk D1 to the object Ob11 becomes the same as the target distance indicated by the bidirectional arrow as illustrated in B ofFIG. 21 . - In this case, as indicated by the light beams L31 to L33 indicated by broken lines, the
distance sensors 41 provided on the arm portion 13-1 measure sensor values indicating the distance to the desk D1. When the difference between the distance from the end to the desk D1 to the object Ob11 and the target position becomes sufficiently small, the arm portion 13-1 is moved downward so as to place the object Ob11 on the desk D1, so that the object Ob11 is positioned at the position P1. - Regarding Actuator
- An actuator other than the electromagnetic motor may be mounted on the end effector. For example, a suction type end effector is mounted on the
robot 1. Therobot 1 can easily hold a light object such as a card using a suction type end effector. - In the case of holding a heavy object using a suction type end effector, similarly to the case of picking up the card C1, the
robot 1 can once move the object to the front side of the desk while sucking the object, and then suck and hold the back of the object with a suction mechanism of another finger portion. By once moving a heavy object to the front side of the desk and then holding the object, therobot 1 can increase the stability of holding. - Regarding Control
- The positioning control may be performed on the basis of the sensor values measured by the
distance sensors 41 and information measured at the start of the task by thevisual sensors 12A (a three-dimensional measuring instrument such as a Depth camera). Since thedistance sensors 41 are discretely arranged, the accuracy of the geometric information obtained on the basis of the sensor values measured bysuch distance sensors 41 may be low. - By complementing the distance information between the
distance sensors 41 on the basis of the information measured by thevisual sensors 12A, therobot 1 can obtain highly accurate geometric information, and the accuracy of positioning control of a small object or an object having a complicated shape can be improved. - The operation of moving the hand portion 14-1 to the initial position may be performed a plurality of times. For example, in a case where the quality of the sensor values measured by the
distance sensors 41 is poor (a case where noise is large, a case where there are many measurement omissions, and the like) after the hand portion 14-1 is moved to the initial position, therobot 1 changes the orientation of the hand portion 14-1 from the initial position or moves the hand portion 14-1 to an initial position that is another candidate. - Regarding System Configuration
-
FIG. 22 is a diagram illustrating a configuration example of a system. - The system illustrated in
FIG. 22 is configured by providing the information processing device 51 as an external apparatus of the robot 1. In this manner, the information processing device 51 may be provided outside the housing of the robot 1. - Wireless communication of a predetermined standard such as a wireless LAN or long term evolution (LTE) is performed between the
robot 1 and the information processing device 51 in FIG. 22 . - Various types of information such as information indicating the state of the
robot 1 and information indicating the detection result of the sensors are transmitted from the robot 1 to the information processing device 51. Information for controlling the operation of the robot 1 and the like are transmitted from the information processing device 51 to the robot 1. - The
robot 1 and theinformation processing device 51 may be directly connected as illustrated in A ofFIG. 22 , or may be connected via anetwork 61 such as the Internet as illustrated in B ofFIG. 22 . The operations of the plurality ofrobots 1 may be controlled by oneinformation processing device 51. - Regarding Computer
- The above-described series of processing can be performed by hardware or software. In a case where the series of processing is executed by software, a program implementing the software is installed from a program recording medium to a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
-
FIG. 23 is a block diagram illustrating a configuration example of hardware of a computer that performs the above-described series of processing using a program. - A central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are connected to each other by a
bus 1004. - An input/
output interface 1005 is further connected to the bus 1004. An input unit 1006 including a keyboard, a mouse, and the like, and an output unit 1007 including a display, a speaker, and the like are connected to the input/output interface 1005. Furthermore, a storage unit 1008 including a hard disk, a nonvolatile memory, and the like, a communication unit 1009 including a network interface and the like, and a drive 1010 that drives a removable medium 1011 are connected to the input/output interface 1005. - In the computer configured as described above, for example, the
CPU 1001 loads a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes the program, so that the above-described series of processing is performed. - The program executed by the
CPU 1001 is provided, for example, by being recorded in the removable medium 1011 or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and is installed in thestorage unit 1008. - Note that the program executed by the computer may be a program that causes pieces of processing to be performed in time series in the order described in the present specification, or may be a program that causes the pieces of processing to be performed in parallel or at necessary timing such as when a call is made.
- In the present specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device having one housing in which a plurality of modules is housed are both systems.
- Note that the effects described in the present specification are merely examples and are not limited, and other effects may be provided.
- Embodiments of the present technology are not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present technology.
- For example, the present technology can have a configuration of cloud computing in which one function is shared and processed in cooperation by a plurality of devices via a network.
- Furthermore, steps described in the above-described flowcharts can be performed by one device or can be shared and executed by a plurality of devices.
- Moreover, in a case where a plurality of pieces of processing is included in one step, the plurality of pieces of processing included in the one step can be executed by one device or can be shared and executed by a plurality of devices.
- Combination Examples of Configurations
- The present technology may also have the following configurations.
- (1)
- An information processing device including:
-
- a control unit configured to control a position of
- a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
- (2)
- The information processing device according to (1),
-
- in which the hand portion includes a plurality of finger portions and a support portion that supports the plurality of finger portions, and a plurality of the sensors is provided side by side at least on an inner surface of each of the finger portions that are contact surfaces with the target object.
- (3)
- The information processing device according to (2),
-
- in which the control unit controls a position of the hand portion by driving an arm portion that supports the hand portion and driving each of the finger portions.
- (4)
- The information processing device according to (3) further including
-
- an estimation unit that estimates the positional relationship on the basis of a distribution of distances measured by the plurality of sensors.
- (5)
- The information processing device according to (4),
-
- in which the estimation unit estimates a contactable area of the target object where the finger portions are contactable on the basis of the number of sensors that are included in the sensors and that have measured distances within a predetermined range.
- (6)
- The information processing device according to (5),
-
- in which the task is a task of moving the target object placed on a top plate as the object in surroundings to a position enabling the target object to be sandwiched from above and below by the finger portions, and
- the control unit moves the hand portion in a moving direction of the target object in a state where a finger portion above the target object among the finger portions is in contact with the target object and where sensors included in the sensors and provided on a finger portion below the top plate among the finger portions are positioned side by side in parallel to the moving direction.
- (7)
- The information processing device according to (6),
-
- in which the control unit controls a moving speed of the hand portion according to the contactable area.
- (8)
- The information processing device according to any one of (2) to (7) further including:
-
- an initial position determination unit that determines an initial position of the hand portion according to the task on the basis of a measurement result by a visual sensor provided at a position different from the sensors; and
- an initial position movement control unit that moves the hand portion to an initial position.
- (9)
- The information processing device according to (8),
-
- in which the initial position determination unit determines, according to the task, a use sensor that is at least one of the sensors to be used, from among the plurality of sensors provided on the hand portion, and determines an initial position of the use sensor.
- (10)
- The information processing device according to any one of (1) to (9) further including
-
- a determination unit that determines whether or not the task is successful according to whether or not the positional relationship satisfies a condition determined according to the task,
- in which in a case where it is determined that the task has failed, the control unit controls a position of the hand portion according to a return task determined on the basis of the positional relationship when the task has failed.
- (11)
- The information processing device according to (4),
-
- in which the estimation unit estimates the positional relationship between the target object and the object in surroundings on the basis of an interval between a sensor that is included in the sensors and that has measured a distance within a first range and a sensor that is included in the sensors and that has measured a distance within a second range.
- (12)
- The information processing device according to (11),
-
- in which the task is a task of bringing the target object being in contact with a plane close to the object provided at an end of the plane while the target object is pressed against the plane by the support portion on which at least two of the sensors are provided side by side, and
- the control unit moves the hand portion in a state where a sensor included in the sensors and provided on the finger portions and a sensor included in the sensors and provided on the support portion are positioned side by side in parallel to the plane.
- (13)
- The information processing device according to (4),
-
- in which the estimation unit estimates the positional relationship on the basis of a distance to an object in a moving direction, the distance being measured by sensors included in the sensors and provided at distal ends of the finger portions holding the target object.
- (14)
- The information processing device according to (13),
-
- in which the estimation unit further estimates an orientation of the target object on the basis of distances to the target object measured by sensors included in the sensors and provided side by side on the support portion.
- (15)
- The information processing device according to (4),
-
- in which the estimation unit estimates the positional relationship on the basis of an average of distances to the object in a moving direction measured by sensors included in the sensors and provided at respective distal ends of the plurality of finger portions holding the target object.
- (16)
- An information processing method performed by an information processing device, including:
-
- controlling a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
- (17)
- A program that causes a computer to perform processing of:
-
- controlling a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
-
-
- 1 Robot
- 13 Arm portion
- 14 Hand portion
- 21 Base portion
- 22 Finger portion
- 41 Distance sensor
- 51 Information processing device
- 61 Network
- 101 Environment measurement unit
- 102 Task determination unit
- 103 Hand and finger initial position determination unit
- 104 Initial position database
- 105 Initial position transfer control unit
- 106 Target value calculation unit
- 107 Geometric information estimation unit
- 108 Positioning control unit
- 109 Task success/failure condition calculation unit
- 110 Task success/failure determination unit
- 121 Drive unit
- 131 Subtractor
- 132 Converter
- 133 Subtractor
- 134 Controller
Claims (17)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020131437 | 2020-08-03 | ||
| JP2020-131437 | 2020-08-03 | ||
| PCT/JP2021/027079 WO2022030242A1 (en) | 2020-08-03 | 2021-07-20 | Information processing device, information processing method, and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230302632A1 true US20230302632A1 (en) | 2023-09-28 |
Family
ID=80117298
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/006,733 Abandoned US20230302632A1 (en) | 2020-08-03 | 2021-07-20 | Information processing device, information processing method, and program |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20230302632A1 (en) |
| EP (1) | EP4190514A4 (en) |
| JP (1) | JPWO2022030242A1 (en) |
| WO (1) | WO2022030242A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240198525A1 (en) * | 2022-12-16 | 2024-06-20 | Semes Co., Ltd. | Gas supply automation system |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2024142563A (en) | 2023-03-30 | 2024-10-11 | ソニーグループ株式会社 | ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD, AND ROBOT SYSTEM |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9120233B2 (en) * | 2012-05-31 | 2015-09-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Non-contact optical distance and tactile sensing system and method |
| JP6359756B2 (en) * | 2015-02-24 | 2018-07-18 | 株式会社日立製作所 | Manipulator, manipulator operation planning method, and manipulator control system |
| JP6541397B2 (en) | 2015-04-06 | 2019-07-10 | キヤノン株式会社 | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM |
| JP2020016446A (en) | 2018-07-23 | 2020-01-30 | キヤノン株式会社 | Information processing device, control method of information processing device, program, measurement device, and article production method |
-
2021
- 2021-07-20 JP: application JP2022541425A, publication JPWO2022030242A1, status: Pending
- 2021-07-20 US: application US18/006,733, publication US20230302632A1, status: Abandoned
- 2021-07-20 EP: application EP21852161.5A, publication EP4190514A4, status: Withdrawn
- 2021-07-20 WO: application PCT/JP2021/027079, publication WO2022030242A1, status: Ceased
Patent Citations (39)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4001556A (en) * | 1975-04-07 | 1977-01-04 | International Business Machines Corporation | Computer controlled pneumatic retractable search sensor |
| US4492949A (en) * | 1983-03-18 | 1985-01-08 | Barry Wright Corporation | Tactile sensors for robotic gripper and the like |
| US4541771A (en) * | 1983-03-31 | 1985-09-17 | At&T Bell Laboratories | Robot having magnetic proximity sensor and manufacturing method using same |
| US4588348A (en) * | 1983-05-27 | 1986-05-13 | At&T Bell Laboratories | Robotic system utilizing a tactile sensor array |
| US20070078564A1 (en) * | 2003-11-13 | 2007-04-05 | Japan Science And Technology Agency | Robot drive method |
| US20090302626A1 (en) * | 2006-11-03 | 2009-12-10 | Aaron Dollar | Robust Compliant Adaptive Grasper and Method of Manufacturing Same |
| US8260458B2 (en) * | 2008-05-13 | 2012-09-04 | Samsung Electronics Co., Ltd. | Robot, robot hand, and method of controlling robot hand |
| US20100292837A1 (en) * | 2009-05-14 | 2010-11-18 | Honda Motor Co., Ltd. | Robot hand and control system, control method and control program for the same |
| US9464642B2 (en) * | 2010-11-19 | 2016-10-11 | President And Fellows Of Harvard College | Soft robotic actuators |
| US10576626B2 (en) * | 2012-03-08 | 2020-03-03 | Quality Manufacturing Inc. | Touch sensitive robotic gripper |
| US20140025205A1 (en) * | 2012-07-20 | 2014-01-23 | Seiko Epson Corporation | Control system, program, and method of controlling mechanical equipment |
| US9744677B2 (en) * | 2015-11-05 | 2017-08-29 | Irobot Corporation | Robotic fingers and end effectors including same |
| US9827670B1 (en) * | 2016-05-24 | 2017-11-28 | X Development Llc | Coaxial finger face and base encoding |
| US20180157317A1 (en) * | 2016-08-18 | 2018-06-07 | Technische Universität Dresden | System and method for haptic interaction with virtual objects |
| US20190243451A1 (en) * | 2016-11-02 | 2019-08-08 | Daisuke Wakuda | Gesture input system and gesture input method |
| US20190176348A1 (en) * | 2017-12-12 | 2019-06-13 | X Development Llc | Sensorized Robotic Gripping Device |
| US10682774B2 (en) * | 2017-12-12 | 2020-06-16 | X Development Llc | Sensorized robotic gripping device |
| US10792809B2 (en) * | 2017-12-12 | 2020-10-06 | X Development Llc | Robot grip detection using non-contact sensors |
| US20200130193A1 (en) * | 2018-01-16 | 2020-04-30 | Preferred Networks, Inc. | Tactile information estimation apparatus, tactile information estimation method, and program |
| US20210293643A1 (en) * | 2018-07-05 | 2021-09-23 | The Regents Of The University Of Colorado, A Body Corporate | Multi-Modal Fingertip Sensor With Proximity, Contact, And Force Localization Capabilities |
| US11426880B2 (en) * | 2018-11-21 | 2022-08-30 | Kla Corporation | Soft gripper with multizone control to allow individual joint articulation |
| US20220317772A1 (en) * | 2018-12-03 | 2022-10-06 | Microsoft Technology Licensing, Llc | Augmenting the functionality of user input devices using a digital glove |
| US20200306986A1 (en) * | 2019-03-29 | 2020-10-01 | Robotik Innovations, Inc. | Tactile perception apparatus for robotic systems |
| US11413760B2 (en) * | 2019-03-29 | 2022-08-16 | RIOA Intelligent Machines, Inc. | Flex-rigid sensor array structure for robotic systems |
| US20210023713A1 (en) * | 2019-07-24 | 2021-01-28 | Abb Schweiz Ag | Method of Automated Calibration for In-Hand Object Location System |
| US20210086364A1 (en) * | 2019-09-20 | 2021-03-25 | Nvidia Corporation | Vision-based teleoperation of dexterous robotic system |
| US11389968B2 (en) * | 2019-10-02 | 2022-07-19 | Toyota Research Institute, Inc. | Systems and methods for determining pose of objects held by flexible end effectors |
| US20210122045A1 (en) * | 2019-10-24 | 2021-04-29 | Nvidia Corporation | In-hand object pose tracking |
| US20210122039A1 (en) * | 2019-10-25 | 2021-04-29 | Dexterity, Inc. | Detecting slippage from robotic grasp |
| US20210122056A1 (en) * | 2019-10-25 | 2021-04-29 | Dexterity, Inc. | Detecting robot grasp of very thin object or feature |
| US20210283785A1 (en) * | 2020-03-10 | 2021-09-16 | Samsung Electronics Co., Ltd. | Method and apparatus for manipulating a tool to control in-grasp sliding of an object held by the tool |
| US11396103B2 (en) * | 2020-03-10 | 2022-07-26 | Samsung Electronics Co., Ltd. | Method and apparatus for manipulating a tool to control in-grasp sliding of an object held by the tool |
| US11813749B2 (en) * | 2020-04-08 | 2023-11-14 | Fanuc Corporation | Robot teaching by human demonstration |
| US20210394360A1 (en) * | 2020-06-18 | 2021-12-23 | Korea Institute Of Science And Technology | Tactile sensor module for robot-hand and grasping method using the same |
| US11964400B2 (en) * | 2020-11-13 | 2024-04-23 | Robert Bosch Gmbh | Device and method for controlling a robot to pick up an object in various positions |
| US11565406B2 (en) * | 2020-11-23 | 2023-01-31 | Mitsubishi Electric Research Laboratories, Inc. | Multi-tentacular soft robotic grippers |
| US12214502B2 (en) * | 2021-09-10 | 2025-02-04 | Honda Motor Co., Ltd. | Object manipulation |
| US12122040B2 (en) * | 2022-06-10 | 2024-10-22 | Sanctuary Cognitive Systems Corporation | Haptic photogrammetry in robots and methods for operating the same |
| US12103182B1 (en) * | 2023-10-20 | 2024-10-01 | Tacta Systems Inc. | Tactile robotic training platform |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240198525A1 (en) * | 2022-12-16 | 2024-06-20 | Semes Co., Ltd. | Gas supply automation system |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2022030242A1 (en) | 2022-02-10 |
| EP4190514A1 (en) | 2023-06-07 |
| JPWO2022030242A1 (en) | 2022-02-10 |
| EP4190514A4 (en) | 2024-01-17 |
Similar Documents
| Publication | Title |
|---|---|
| US11090814B2 (en) | Robot control method |
| JP7154815B2 (en) | Information processing device, control method, robot system, computer program, and storage medium |
| US10894324B2 (en) | Information processing apparatus, measuring apparatus, system, interference determination method, and article manufacturing method |
| US10551821B2 (en) | Robot, robot control apparatus and robot system |
| US8244402B2 (en) | Visual perception system and method for a humanoid robot |
| Li et al. | A Control Framework for Tactile Servoing. |
| JP5778311B1 (en) | Picking apparatus and picking method |
| US9616568B1 (en) | Generating a grasp affordance for an object based on a thermal image of the object that is captured following human manipulation of the object |
| US10286557B2 (en) | Workpiece position/posture calculation system and handling system |
| JP2008049459A (en) | Manipulator control system, manipulator control method and program |
| CN111604942A (en) | Object detection device, control device, and computer program for object detection |
| JP2009107043A (en) | Gripping device and gripping device control method |
| Hoffmann et al. | Environment-aware proximity detection with capacitive sensors for human-robot-interaction |
| CN103221188A (en) | Work pick-up apparatus |
| JP2019089157A (en) | Holding method, holding system, and program |
| US20230302632A1 (en) | Information processing device, information processing method, and program |
| US12358132B2 (en) | Robot system and picking method |
| US12156708B2 (en) | Confidence-based robotically-assisted surgery system |
| US20240139962A1 (en) | Iterative control of robot for target object |
| CN117021084A (en) | Workpiece grabbing method, device, system, electronic equipment and storage medium |
| US20230286159A1 (en) | Remote control system |
| US20190321969A1 (en) | Method and robotic system for manipluating instruments |
| US12466078B2 (en) | Handling apparatus, handling method, and recording medium |
| EP4148374B1 (en) | Workpiece holding apparatus, workpiece holding method, program, and control apparatus |
| Nakhaeinia et al. | Adaptive robotic contour following from low accuracy RGB-D surface profiling and visual servoing |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SONY GROUP CORPORATION, JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: FURUYAMA, YOSHIKAZU; REEL/FRAME: 062485/0140; Effective date: 20230117 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |