
WO2024247135A1 - Control device, robot system, and robot control method - Google Patents


Info

Publication number
WO2024247135A1
WO2024247135A1 (Application No. PCT/JP2023/020184)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
candidates
place
target
workpiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/020184
Other languages
French (fr)
Japanese (ja)
Inventor
孝一 正岡
拓哉 志鷹
裕規 高山
真平 伊藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kawasaki Heavy Industries Ltd
Kawasaki Motors Ltd
Original Assignee
Kawasaki Heavy Industries Ltd
Kawasaki Jukogyo KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kawasaki Heavy Industries Ltd, Kawasaki Jukogyo KK
Priority to PCT/JP2023/020184
Publication of WO2024247135A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 Controls for manipulators
    • B25J 13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • the technology disclosed herein relates to a control device, a robot system, and a method for controlling a robot.
  • Control devices that control robots performing picking are conventionally known.
  • the control device described in Patent Document 1 controls the robot arm and hand to pick up a workpiece and transport and place it at another location.
  • This control device changes the gripping mode depending on the state of the workpiece to be picked, thereby realizing gripping of the workpiece without the robot arm and hand interfering with other objects.
  • the control device disclosed herein is a control device that controls a robot that has a robot arm and a hand attached to the robot arm and performs pick-and-place processing of a workpiece, and is equipped with a searcher that searches for multiple candidates for the relative position and orientation of the hand with respect to the workpiece to be placed in a candidate placement position, and a determiner that determines the relative position and orientation of the hand with respect to the workpiece to be picked from multiple candidates for the relative position and orientation of the hand at the time of placement.
  • the robot system disclosed herein comprises a robot having a robot arm and a hand attached to the robot arm, and the control device.
  • the robot control method disclosed herein is a method for controlling a robot that has a robot arm and a hand attached to the robot arm and performs pick-and-place processing of a workpiece, and includes searching for multiple candidates for the relative position and orientation of the hand with respect to a workpiece to be placed at a candidate placement position, and determining the relative position and orientation of the hand with respect to the workpiece to be picked from the multiple candidates for the relative position and orientation of the hand at the time of placement.
  • the control device allows the workpiece to be picked and placed appropriately.
  • the robot system can properly pick and place workpieces.
  • the robot control method described above allows the workpiece to be picked and placed appropriately.
  • FIG. 1 is a schematic diagram showing the configuration of a robot system.
  • FIG. 2 is a diagram showing a schematic hardware configuration of the robot control device 2 and the main control device 3.
  • FIG. 3 is a block diagram showing the configuration of a control system of the processor.
  • FIG. 4 is a view of the inside of the first container as seen from above.
  • FIG. 5 is an exemplary perspective view showing the inside of the second container.
  • FIG. 6 is a schematic plan view of the inside of the second container for explaining the search for position candidates.
  • FIG. 7 is an explanatory diagram showing the hand when picking up the target workpiece.
  • FIG. 8 is a schematic diagram showing the relative position and orientation of the hand when viewed in the direction of the reference axis.
  • FIG. 9 is a flowchart of position planning.
  • FIG. 10 is a flow chart of the place location planning subroutine.
  • FIG. 11 is a flow chart of the pickup position planning subroutine.
  • FIG. 12 is a timing chart of the placement position plan, the pickup position plan, and the robot operation.
  • FIG. 13 is a flowchart of a position plan according to a modified example.
  • FIG. 14 is a flow chart of a subroutine for correcting a place position.
  • FIG. 15 is a timing chart of the placement position plan, the pickup position plan, and the robot operation according to the modified example.
  • Figure 1 is a schematic diagram showing the configuration of a robot system 100.
  • the robot system 100 includes a robot 1 and a main control device 3 that controls the robot 1.
  • the main control device 3 controls the robot 1 to cause the robot 1 to execute a process.
  • the process performed by the robot 1 is a pick-and-place process.
  • the robot 1 picks up a target workpiece from a first container 91 in which multiple workpieces are piled up randomly, and transfers it into a second container 92.
  • the robot 1 arranges the target workpieces in a regular pattern in the second container 92.
  • the robot system 100 arranges the multiple workpieces in the first container 91 into the second container 92 by repeating the pick-and-place process.
  • the target workpiece has, for example, a roughly rectangular parallelepiped shape.
  • the robot system 100 may include a camera 51 that captures an image of the workpiece.
  • the camera 51 may include a first camera 51A that captures an image inside the first container 91 and a second camera 51B that captures an image inside the second container 92.
  • the first camera 51A is fixedly disposed above the first container 91.
  • the first camera 51A captures an image inside the first container 91 from above.
  • the first camera 51A captures an image including a plurality of workpieces inside the first container 91.
  • the second camera 51B is fixedly disposed above the second container 92.
  • the second camera 51B captures an image inside the second container 92 from above.
  • the second camera 51B captures an image including a plurality of workpieces inside the second container 92.
  • when the first camera 51A and the second camera 51B are not distinguished from each other, they will be simply referred to as the cameras 51.
  • the image is a two-dimensional image or a three-dimensional image.
  • the three-dimensional image can be point cloud data, an RGB-D image, an RGB image, a depth image, or a voxel, etc.
  • the camera 51 can be a three-dimensional camera, i.e., an RGB-D camera that outputs an RGB-D image, a stereo camera that acquires an RGB image, or a three-dimensional vision sensor that acquires point cloud data, etc.
  • the robot 1 has a robot arm 12 and a hand 14 attached to the robot arm 12.
  • the robot 1 is an industrial robot.
  • the hand 14 is a so-called end effector.
  • a base coordinate system of three orthogonal axes is defined in a space in which the robot 1 is disposed.
  • the robot arm 12 is configured to operate in a three-dimensional manner. Specifically, the robot arm 12 is configured to perform operations including translation with at least three degrees of freedom.
  • the robot arm 12 is a vertical multi-joint robot arm.
  • the robot arm 12 is supported by the base 10.
  • the robot arm 12 has multiple links, multiple joints that connect the multiple links, and a servo motor that rotates the multiple joints.
  • the robot arm 12 has a first link 12a connected to the base 10, a second link 12b connected to the first link 12a, a third link 12c connected to the second link 12b, a fourth link 12d connected to the third link 12c, and a fifth link 12e connected to the fourth link 12d.
  • the base 10 and the first link 12a are connected to each other via a first joint 13a that can rotate around an axis extending in the vertical direction.
  • the first link 12a and the second link 12b are connected to each other via a second joint 13b that can rotate around an axis extending in the horizontal direction.
  • the second link 12b and the third link 12c are connected to each other via a third joint 13c that can rotate around an axis extending in the horizontal direction.
  • the third link 12c and the fourth link 12d are connected to each other via a fourth joint 13d that can rotate around the axis of the fourth link 12d (i.e., the direction in which the fourth link 12d extends).
  • the fourth link 12d and the fifth link 12e are connected to each other via a fifth joint 13e that can rotate around an axis perpendicular to the axis of the fourth link 12d.
  • the robot arm 12 has servo motors 16 (see Figure 2) that rotate each joint.
  • Each servo motor 16 has an encoder 16a (see Figure 2).
  • the robot arm 12 thus configured is configured to perform translational movements in the directions of each of the three orthogonal axes, and rotational movements around each of the three orthogonal axes.
  • the hand 14 includes a suction device 15 that is positioned at a position offset in the radial direction around a predetermined reference axis T of the robot arm 12.
  • the suction device 15 is positioned eccentrically with respect to the reference axis T.
  • the hand 14 has a base 14a that is attached to the robot arm 12.
  • the base 14a extends in the radial direction around the reference axis T.
  • the suction device 15 is attached to the end of the base 14a that is farther from the reference axis T via a plate, block, or the like.
  • the suction device 15 is configured to be able to change its position relative to the robot arm 12.
  • the hand 14 is attached to the robot arm 12 so as to be rotatable around the reference axis T.
  • the hand 14 is connected to the tip of the robot arm 12, i.e., the fifth link 12e.
  • the fifth link 12e and the hand 14 are connected to each other via the sixth joint 13f so as to be rotatable around the reference axis T.
  • the suction device 15 changes its relative position with respect to the robot arm 12 by rotating the hand 14 around the reference axis T.
  • the hand 14 sucks the workpiece with a suction device 15.
  • the suction device 15 has a suction pad.
  • An air hose that is connected to a negative pressure generator is connected to the suction device 15.
  • the air hose is provided with a solenoid valve that serves as an actuator 15a (see Figure 2). By controlling the solenoid valve, the suction device 15 can switch between suction and release.
  • the suction device 15 sucks the surface of the target workpiece W.
  • the air hose is provided with a pressure sensor 15b (see Figure 2).
  • the pressure sensor 15b detects the magnitude of the negative pressure in the air hose.
  • FIG. 2 is a diagram showing the general hardware configuration of the robot control device 2 and the main control device 3.
  • the main control device 3 transmits and receives signals and information to and from the robot control device 2.
  • the main control device 3 controls the robot 1 via the robot control device 2.
  • the main control device 3 outputs commands to the robot control device 2.
  • the robot control device 2 controls the servo motor 16 of the robot arm 12 in response to commands from the main control device 3.
  • Images from the first camera 51A and the second camera 51B are input to the main control device 3.
  • the main control device 3 detects the target workpiece W based on the images.
  • the main control device 3 generates a target path for the robot 1 based on the detected target workpiece W, and outputs a command corresponding to the generated target path to the robot control device 2.
  • An actuator 15a and a pressure sensor 15b are connected to the main control device 3.
  • the main control device 3 controls suction by the hand 14 via the actuator 15a and the pressure sensor 15b.
  • the main control device 3 is an example of a control device.
  • the robot control device 2 has a processor 21, a storage device 22, and a memory 23.
  • the processor 21 controls the entire robot control device 2.
  • the processor 21 performs various types of arithmetic processing.
  • the processor 21 is formed of a processor such as a CPU (Central Processing Unit).
  • the processor 21 may also be formed of an MCU (Micro Controller Unit), an MPU (Micro Processor Unit), an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), a system LSI, etc.
  • the storage device 22 stores the programs executed by the processor 21 and various data.
  • the storage device 22 is formed of a non-volatile memory, a hard disk drive (HDD), a solid state drive (SSD), etc.
  • the memory 23 temporarily stores data, etc.
  • the memory 23 is formed of a volatile memory.
  • the processor 21 drives the servo motor 16 based on a command from the main control device 3. At this time, the robot control device 2 performs feedback control of the current supplied to the servo motor 16 based on the detection result of the encoder 16a.
  • the main control device 3 has a processor 31, a storage device 32, and a memory 33.
  • the processor 31 controls the entire main control device 3.
  • the processor 31 performs various types of arithmetic processing.
  • the processor 31 is formed of a processor such as a CPU (Central Processing Unit).
  • the processor 31 may also be formed of an MCU (Micro Controller Unit), an MPU (Micro Processor Unit), an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), a system LSI, etc.
  • the storage device 32 stores the programs executed by the processor 31 and various data.
  • the storage device 32 is formed of a non-volatile memory, an HDD (hard disk drive), an SSD (solid state drive), etc.
  • the storage device 32 stores an object detection program that detects objects, a position and orientation determination program that determines the position and orientation of the hand 14, and a path generation program that generates a target path for the robot 1.
  • the various programs cause the main control device 3 to realize various functions.
  • the memory 33 temporarily stores data, etc.
  • the memory 33 is formed of a volatile memory.
  • the memory 33 stores images from the camera 51.
  • the processor 31 causes the robot 1 to execute a picking operation, in which the hand 14 is moved to the target workpiece W and the suction device 15 is caused to suction the target workpiece W, and a placing operation, in which the target workpiece W held by the hand 14 is moved to a target placement position and the suction device 15 is caused to release the suction of the target workpiece W.
  • the processor 31 causes the robot 1 to execute a placing operation after the picking operation.
  • the processor 31 determines the relative position and orientation of the hand 14 with respect to the workpiece at the time of picking. Prior to a placing operation, the processor 31 determines the target placement position of the target workpiece W at the time of placement.
  • the relative position and orientation of the hand 14 with respect to the workpiece will be referred to simply as the "relative position and orientation of the hand 14."
  • FIG. 3 is a block diagram showing the configuration of the control system of the processor 31.
  • the processor 31 realizes various functions by reading out a program from the storage device 32 to the memory 33 and expanding it.
  • the processor 31 functions as a searcher 41 that searches for candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at the candidate placement position in the storage space, and a determiner 42 that determines the relative position and orientation of the hand 14 with respect to the target workpiece to be picked.
  • the processor 31 functions as an operation controller 43 that causes the robot arm 12 and the hand 14 to execute robot operations including a picking operation for picking the target workpiece W and a placing operation for placing the target workpiece W at the placement position.
  • the processor 31 can also function as a photography controller 44 that causes the camera 51 to acquire an image, a workpiece detector 45 that detects the workpiece based on the image, a height detector 46 that detects height information of the storage space based on the image, and a path generator 47 that generates a target path for the robot arm 12 and the hand 14.
  • the photography controller 44 controls the first camera 51A and the second camera 51B to capture images from each of the first camera 51A and the second camera 51B.
  • the photography controller 44 causes the first camera 51A to perform picking photography to capture images inside the first container 91.
  • the photography controller 44 causes the second camera 51B to perform place photography to capture images inside the second container 92.
  • the photography controller 44 stores the images from the cameras 51 in the memory 33.
  • the work detector 45 detects a work from within an image by image recognition.
  • the image may include multiple workpieces.
  • the work detector 45 performs plane detection, for example, by RANSAC (Random Sample Consensus) to detect the surface of the workpiece (hereinafter referred to as the "target surface S").
  • RANSAC Random Sample Consensus
  • the work detector 45 detects a workpiece by detecting a plane of a predetermined shape included in the image.
  • FIG. 4 is a view of the inside of the first container 91 from above.
  • the workpiece detector 45 can detect multiple target surfaces S, i.e., multiple workpieces.
  • the workpiece detector 45 determines at least the center position of the target surface S, the normal direction of the target surface S, the vertical and horizontal orientations of the target surface S, and the size of the target surface S (e.g., the vertical and horizontal sizes).
  • the size of the target surface S may be stored in advance in the storage device 32.
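  • The following is an illustrative sketch (not part of the original disclosure) of how such RANSAC-based plane detection could be performed on a captured point cloud, here using Open3D's segment_plane; the function name, thresholds, and point counts are assumptions made for illustration.
```python
import numpy as np
import open3d as o3d

def detect_target_surfaces(pcd, max_planes=5, dist_thresh=0.003, min_points=300):
    """Iteratively extract dominant planes from a point cloud and return their centers
    and normals as candidate target surfaces S."""
    surfaces = []
    rest = pcd
    for _ in range(max_planes):
        if len(rest.points) < min_points:
            break
        # RANSAC plane fit: plane_model = [a, b, c, d] for the plane ax + by + cz + d = 0.
        plane_model, inliers = rest.segment_plane(
            distance_threshold=dist_thresh, ransac_n=3, num_iterations=1000)
        if len(inliers) < min_points:
            break
        pts = np.asarray(rest.select_by_index(inliers).points)
        surfaces.append({
            "center": pts.mean(axis=0),               # center position of the target surface S
            "normal": np.asarray(plane_model[:3]),    # normal direction of the target surface S
        })
        rest = rest.select_by_index(inliers, invert=True)  # remove the plane and continue
    return surfaces
```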
  • the height detector 46 detects height information of the storage space in which the target workpiece W is to be stored.
  • the storage space is inside the second container 92.
  • the height information is information about the height of each position in the storage space, specifically, the height information of each position inside the second container 92. In areas inside the second container 92 where the bottom surface is exposed, the height information is the height of each position on the bottom surface. In areas inside the second container 92 where the workpiece is present, the height information is the height of each position on the top surface of the workpiece.
  • the height detector 46 detects the height information of the storage space based on an image inside the second container 92.
  • the height detector 46 obtains the height information of the storage space, for example, by performing preprocessing such as discretization and smoothing on the point cloud data of the three-dimensional image.
  • the height detector 46 outputs the height information of the storage space as a height map.
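  • As an illustrative sketch (not from the disclosure), a height map of the kind described above could be built by discretizing the point cloud into a grid; the function name, grid parameters, and the assumption that the container bottom is at height 0 are hypothetical.
```python
import numpy as np

def build_height_map(points, x_range, y_range, cell_size):
    """Discretize a point cloud (N x 3, in container coordinates) into a 2D height map in
    which each cell stores the highest z value of the points falling into it."""
    nx = int(np.ceil((x_range[1] - x_range[0]) / cell_size))
    ny = int(np.ceil((y_range[1] - y_range[0]) / cell_size))
    # Cells with no points keep the assumed bottom height of 0.
    height_map = np.zeros((ny, nx))
    ix = np.clip(((points[:, 0] - x_range[0]) / cell_size).astype(int), 0, nx - 1)
    iy = np.clip(((points[:, 1] - y_range[0]) / cell_size).astype(int), 0, ny - 1)
    np.maximum.at(height_map, (iy, ix), points[:, 2])   # keep the maximum height per cell
    return height_map
```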
  • the path generator 47 generates a target path for the robot arm 12 and the hand 14.
  • the target path is an ordered set of target positions and target postures of the robot arm 12 and the hand 14. That is, the target path is a change in the target position and target posture of the robot arm 12 and the hand 14 over time.
  • the target path may be an ordered set of target rotation angles of each joint of the robot arm 12 (including the sixth joint 13f) that realizes the change in the target position and target posture of the robot arm 12 and the hand 14 over time.
  • the target path uniquely defines the position and posture of the robot arm 12 and the hand 14.
  • the target path also defines the target rotation position of the suction device 15 around the reference axis T.
  • the path generator 47 generates a target path for the robot arm 12 and the hand 14 for a picking operation.
  • the path generator 47 generates a target path for the robot arm 12 and the hand 14 for a placing operation.
  • the path generator 47 executes path planning using the Probabilistic Roadmap Method (PRM) or the Rapidly-Exploring Random Tree (RRT), etc. In these methods, the path generator 47 also executes an interference check, i.e., checks whether the robot arm 12 and the hand 14 interfere with other objects.
  • the interference check may be, for example, an interference check using point cloud data or CAD data.
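  • The following is a minimal, illustrative RRT sketch of the kind of sampling-based path planning with an interference check mentioned above; it is not the patent's implementation, and collision_free stands in for whatever point-cloud- or CAD-based interference check is actually used.
```python
import numpy as np

def rrt(start, goal, collision_free, bounds, step=0.05, goal_tol=0.05, max_iter=5000, seed=0):
    """Minimal RRT: grow a tree from the start configuration toward random samples and
    return a list of configurations reaching the goal, or None if none is found."""
    rng = np.random.default_rng(seed)
    nodes = [np.asarray(start, dtype=float)]
    parents = [-1]
    goal = np.asarray(goal, dtype=float)
    for _ in range(max_iter):
        # Sample a random configuration, occasionally biased toward the goal.
        target = goal if rng.random() < 0.1 else rng.uniform(bounds[0], bounds[1])
        # Extend the nearest tree node a short step toward the sample.
        i = int(np.argmin([np.linalg.norm(n - target) for n in nodes]))
        direction = target - nodes[i]
        dist = np.linalg.norm(direction)
        if dist == 0.0:
            continue
        new = nodes[i] + direction / dist * min(step, dist)
        # Interference check: keep the new node only if the short segment is collision-free.
        if not collision_free(nodes[i], new):
            continue
        nodes.append(new)
        parents.append(i)
        if np.linalg.norm(new - goal) < goal_tol:
            # Reconstruct the path by walking back through the parent indices.
            path, j = [new], len(nodes) - 1
            while parents[j] != -1:
                j = parents[j]
                path.append(nodes[j])
            return path[::-1]
    return None
```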
  • the searcher 41 searches for candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at a candidate place position in the storage space.
  • the searcher 41 searches for multiple candidates for the relative position and orientation of the hand 14.
  • the place position candidates corresponding to the multiple candidates for the relative position and orientation of the hand 14 may be the same as each other, or may be different from each other.
  • the searcher 41 may search for multiple different candidates for the relative position and orientation of the hand 14 with respect to one candidate position.
  • the searcher 41 may search for multiple candidates for the relative position and orientation of the hand 14 with respect to different candidate positions.
  • the candidates for the relative position and orientation of the hand 14 may be different from each other, or may be the same as each other.
  • the searcher 41 searches for multiple place candidates, which are combinations of position candidates and candidates for the relative position and orientation of the hand 14 with respect to the workpiece placed at the position candidates.
  • Each of the multiple place candidates has a different combination of position candidates and candidates for the relative position and orientation of the hand 14. At least some of the position candidates included in the multiple place candidates may be the same as each other. At least some of the candidates for the relative position and orientation of the hand 14 included in the multiple place candidates may be the same as each other.
  • the searcher 41 may search not only for candidates for the relative position and orientation of the hand 14, but also for candidates for the placement position. For example, the searcher 41 detects a space in the storage space where a workpiece can be placed (hereinafter referred to as a "placeable space"). The searcher 41 detects a plane P in the storage space on which a workpiece can be placed (hereinafter referred to as a "placeable surface") based on the height map, and sets the space above the placeable surface P as the placeable space.
  • Figure 5 is an exemplary perspective view showing the inside of the second container 92. In Figure 5, the surface marked with dots is the placeable surface P.
  • the placeable surface P is a plane exposed upward and is an area of uniform height in the height map.
  • the placeable surface P is an exposed portion of the bottom surface of the second container 92. If the top surface of an existing workpiece W0 is exposed, it can also become a placeable surface P. In addition, when multiple existing workpieces W0 are lined up, their upper surfaces are exposed upward, and the heights of those upper surfaces are uniform, the upper surfaces of the existing workpieces W0 become a single placeable surface P. In the example of FIG. 5, the upper surfaces of the two existing workpieces W0 are each treated as a separate placeable surface P.
  • the searcher 41 determines whether or not the workpiece can be placed at each position while changing the position of the workpiece on the placeable surface P.
  • FIG. 6 is a schematic plan view of the inside of the second container 92 for explaining the search for position candidates. The positions determined to be placeable become position candidates. Whether or not the workpiece can be placed is determined based on whether or not the workpiece can be placed without interfering with other objects.
  • the searcher 41 sets the size of the workpiece used for the search to the actual size of the workpiece plus a predetermined margin (see the dashed lines in FIG. 6). Specifically, the searcher 41 enlarges the bottom surface of the workpiece by the amount of the predetermined margin.
  • the size of the workpiece may be stored in the storage device 32 in advance. Alternatively, the size of the workpiece may be identified when the workpiece detector 45 detects the workpiece.
  • the searcher 41 searches for positions in the storage space where the workpiece can be placed as candidate placement positions.
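  • As an illustrative sketch (not from the disclosure), such a search for position candidates could scan the height map with the margin-enlarged workpiece footprint and keep positions whose supporting surface is flat; the function name and tolerances are hypothetical, and the interference check described above would still be applied afterwards.
```python
import numpy as np

def find_position_candidates(height_map, footprint_wh, margin, cell_size, height_tol=0.003):
    """Scan the height map and return cells where the margin-enlarged workpiece footprint
    rests on an (almost) flat region, i.e. candidate placement positions."""
    # Enlarge the footprint by the search margin on each side and convert to grid cells.
    w = int(np.ceil((footprint_wh[0] + 2.0 * margin) / cell_size))
    h = int(np.ceil((footprint_wh[1] + 2.0 * margin) / cell_size))
    rows, cols = height_map.shape
    candidates = []
    for r in range(rows - h + 1):
        for c in range(cols - w + 1):
            patch = height_map[r:r + h, c:c + w]
            # The footprint fits if the supporting surface under it is uniform in height.
            if patch.max() - patch.min() < height_tol:
                candidates.append((r + h // 2, c + w // 2, float(patch.max())))
    return candidates
```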
  • the searcher 41 finds candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at the candidate position.
  • the relative position and orientation of the hand 14 is determined by the position of the suction device 15 with respect to the workpiece, the position of the reference axis T, and the orientation of the reference axis T.
  • FIG. 7 is an explanatory diagram showing the hand 14 when suctioning the target workpiece W. Since the suction device 15 is eccentric with respect to the reference axis T, even if the position of the suction device 15 with respect to the workpiece, i.e., the suction position, is determined, the reference axis T can be positioned at any position within 360 degrees around the suction device 15. In this example, the position of the suction device 15 and the orientation of the reference axis T among the relative position and orientation of the hand 14 are determined.
  • the suction position of the suction device 15 is the center of the target surface S.
  • the reference axis T is approximately perpendicular to the target surface S.
  • the position of the reference axis T, among the elements of the relative position and orientation of the hand 14, can be set to any angular position within 360 degrees around the suction device 15.
  • the hand 14 or robot arm 12 may interfere with other objects when placing the workpiece.
  • when the workpiece is placed in a space that includes a corner within the storage space, there is a risk that the hand 14 or the robot arm 12 may interfere with the side wall of the second container 92 that forms the corner or with the side surface of an existing workpiece W0.
  • in such a case, an appropriate relative position and orientation of the hand 14 is one in which the reference axis T is located away from the side wall of the second container 92, etc.
  • the storage device 32 stores multiple relative positions and orientations of the hand 14 that differ in the angular position of the reference axis T centered on the suction device 15.
  • Figure 8 is a schematic diagram showing the relative positions and orientations of the hand 14 when viewed in the direction of the reference axis T.
  • the storage device 32 may store, for example, 12 relative positions and orientations of the hand 14 in which the angular position of the reference axis T centered on the suction device 15 differs at 30-degree intervals.
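  • The following hypothetical sketch generates such a set of hand position/orientation candidates by stepping the angular position of the reference axis T around the suction point at even intervals; it assumes a vertical reference axis and a fixed radial offset of the suction device, and all names are illustrative.
```python
import numpy as np

def hand_orientation_candidates(suction_point, axis_offset, n_angles=12):
    """Generate candidate relative positions/orientations of the hand by stepping the angular
    position of the reference axis T around the suction point (30-degree steps for n_angles=12)."""
    candidates = []
    for k in range(n_angles):
        theta = 2.0 * np.pi * k / n_angles
        # The suction device is offset radially from axis T, so for a fixed suction point
        # the axis sits at the opposite end of that radial offset.
        axis_xy = np.asarray(suction_point[:2], float) - axis_offset * np.array(
            [np.cos(theta), np.sin(theta)])
        candidates.append({
            "suction_point": np.asarray(suction_point, float),
            "reference_axis_xy": axis_xy,
            "angle_rad": theta,
        })
    return candidates
```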
  • the searcher 41 creates a place candidate by combining one of the multiple relative positions and orientations of the hand 14 with one of multiple position candidates.
  • the searcher 41 creates, as a place candidate, a combination of a position candidate and a relative position and orientation of the hand 14, where the hand 14 and robot arm 12 can hold a workpiece without interfering with other objects. Specifically, the searcher 41 determines whether or not a target path for the hand 14 and robot arm 12 corresponding to the place candidate can be generated. In other words, the searcher 41 sets, as a place candidate, a combination of a position candidate and a candidate relative position and orientation of the hand 14, where a target path for the hand 14 and robot arm 12 that does not interfere with other objects can be generated.
  • the path generator 47 generates a target path for the hand 14 and robot arm 12 to move the hand 14 to a state corresponding to the position candidate and the candidate for the relative position and orientation of the hand 14.
  • the path generator 47 performs an interference check of the target path to confirm the presence or absence of interference.
  • the path generator 47 generates a target path without interference. If interference is present, the path generator 47 does not generate a target path.
  • when the target path is generated, the searcher 41 sets the combination of the position candidate at that time and the candidate for the relative position and orientation of the hand 14 as a place candidate.
  • the searcher 41 registers the place candidates. That is, the searcher 41 stores the combination of the position candidate and the candidate for the relative position and orientation of the hand 14 in the memory 33 or the storage device 32. The searcher 41 further associates the target path with the place candidate and stores it in the memory 33 or the storage device 32.
  • the searcher 41 searches for multiple place candidates.
  • the searcher 41 assigns priorities to the multiple place candidates. Specifically, the searcher 41 assigns priorities to the position candidates included in the place candidates. For example, the searcher 41 assigns priorities to the multiple position candidates from the viewpoint of workpiece storage efficiency. The searcher 41 may assign a high priority to a position candidate corresponding to a corner space in the storage space from among the multiple position candidates.
  • the searcher 41 may assign a high priority to a position candidate where the side of the workpiece comes into contact with another object from among the multiple position candidates.
  • the searcher 41 may assign priorities to the multiple position candidates from the viewpoint of storage stability. For example, the searcher 41 may assign a higher priority to a position candidate that is lower in height from the bottom surface of the second container 92. Placing the workpiece on the bottom surface of the second container 92 allows the workpiece to be placed more stably than placing it on top of an existing workpiece W0.
  • the searcher 41 searches for multiple place candidates, which are combinations of candidate placement positions and corresponding candidates for the relative position and orientation of the hand 14, and assigns priorities to the multiple place candidates.
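  • As an illustrative sketch (not the patent's scoring rule), priorities could be assigned to position candidates by combining placement height (storage stability) with distance to the nearest container corner (storage efficiency); the weighting and data layout are assumptions.
```python
def prioritize_position_candidates(candidates, container_size):
    """Order place position candidates so that low placement heights (storage stability)
    and positions near a container corner (storage efficiency) come first."""
    length, width = container_size

    def score(candidate):
        x, y, z = candidate["position"]
        # Distance to the nearest corner of the second container (smaller is better).
        corner_dist = min(
            (x ** 2 + y ** 2) ** 0.5,
            ((length - x) ** 2 + y ** 2) ** 0.5,
            (x ** 2 + (width - y) ** 2) ** 0.5,
            ((length - x) ** 2 + (width - y) ** 2) ** 0.5,
        )
        # Height dominates; corner distance breaks ties. The weights are arbitrary.
        return 10.0 * z + corner_dist

    return sorted(candidates, key=score)  # best candidate (lowest score) first
```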
  • the determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece to be picked from among the multiple candidates for the relative position and orientation of the hand 14 at the time of placing. Specifically, the determiner 42 determines a relative position and orientation of the hand 14 at which the hand 14 and the robot arm 12 can hold the target workpiece without interfering with other objects. To do so, the determiner 42 judges whether or not a target path for the hand 14 and the robot arm 12 to pick up the target workpiece in that relative position and orientation can be generated. In other words, the determiner 42 determines, from among the multiple candidates for the relative position and orientation of the hand 14 included in the multiple place candidates, a relative position and orientation of the hand 14 for which a target path in which the hand 14 and the robot arm 12 do not interfere with other objects can be generated.
  • the determiner 42 provisionally selects one relative position and posture of the hand 14 from among the multiple relative positions and postures of the hand 14.
  • the path generator 47 generates a target path for the hand 14 and the robot arm 12 to pick up the target workpiece at the selected relative position and posture of the hand 14.
  • when an intermediate position through which the robot arm 12 and the hand 14 pass is set, the path generator 47 generates the target path from the intermediate position and posture.
  • the path generator 47 performs an interference check of the target path to confirm the presence or absence of interference. In other words, the path generator 47 generates a target path without interference. If interference is present, the path generator 47 does not generate a target path.
  • if the target path is generated, the determiner 42 determines the provisionally selected relative position and posture of the hand 14 as the relative position and posture of the hand 14 at the time of picking. If the target path is not generated, the determiner 42 selects another relative position and posture of the hand 14, and the path generator 47 attempts to generate a target path for the newly selected relative position and posture of the hand 14.
  • when there are multiple options for the target workpiece, i.e., when a target workpiece is selected from multiple workpieces, the determiner 42 also determines the target workpiece from the multiple workpieces when determining the relative position and posture of the hand 14. When determining a relative position and posture of the hand 14 that can pick up the target workpiece from among the multiple relative positions and postures of the hand 14, the more options there are for the target workpiece, the wider the range of choices for the relative position and posture of the hand 14.
  • the operation controller 43 causes the robot arm 12 and the hand 14 to execute robot operations including a picking operation for picking up the target workpiece W and a placing operation for placing the target workpiece W at a target placement position.
  • the operation controller 43 outputs a command according to the target path to the robot control device 2.
  • the operation controller 43 outputs a command according to the target rotation angle of each joint of the robot arm 12 to the robot control device 2.
  • the robot control device 2 drives the servo motor 16 based on the command, causing the robot arm 12 and the hand 14 to move along the target path.
  • the operation controller 43 switches between suction by the suction device 15 and release of the suction by controlling the actuator 15a of the hand 14.
  • the operation controller 43 controls the actuator 15a to generate negative pressure in the suction device 15, thereby causing the suction device 15 to suction the workpiece.
  • the operation controller 43 controls the actuator 15a to release the negative pressure in the suction device 15, thereby causing the suction device 15 to release the suction of the workpiece.
  • the operation controller 43 determines whether suction of the target workpiece W by the suction device 15 has been completed based on the detection result of the pressure sensor 15b.
  • the operation controller 43 determines that suction has been completed when the negative pressure detected by the pressure sensor 15b is equal to or greater than a predetermined threshold value.
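  • A hypothetical sketch of this suction control and completion check follows; the valve and pressure_sensor objects and their methods merely stand in for the actuator 15a and the pressure sensor 15b, and the threshold and timeout values are assumptions.
```python
import time

def pick_with_suction(valve, pressure_sensor, threshold_kpa=40.0, timeout_s=2.0):
    """Open the solenoid valve to generate negative pressure at the suction pad and wait
    until the measured vacuum exceeds a threshold, i.e. suction is judged complete."""
    valve.open()                                  # start suction (corresponds to actuator 15a)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        # Corresponds to checking the pressure sensor 15b against a predetermined threshold.
        if pressure_sensor.read_vacuum_kpa() >= threshold_kpa:
            return True                           # suction completed, workpiece held
        time.sleep(0.01)
    valve.close()                                 # timed out: release and report failure
    return False
```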
  • Position planning includes a place position plan and a pickup position plan.
  • Robot operation includes a picking operation and a placing operation. Position planning will be described with reference to the flowchart in Figure 9.
  • Figure 9 is a flowchart of position planning.
  • In step S101, the photography controller 44 causes the second camera 51B to perform place photography.
  • the place photography is photography of the inside of the second container 92.
  • an image of the inside of the second container 92 is obtained.
  • In step S102, the searcher 41 executes a place position plan.
  • the searcher 41 searches for multiple place candidates.
  • the searcher 41 also generates a target path for the robot arm 12 and hand 14 for each of the place candidates.
  • In step S103, the photography controller 44 causes the first camera 51A to perform picking photography.
  • the picking photography is photography of the inside of the first container 91.
  • an image of the inside of the first container 91 is acquired.
  • Step S103 may be performed before or after step S101. However, if the place position planning is performed using the size of the workpiece W identified by the workpiece detector 45, steps S103 and S104 are performed before step S102.
  • In step S104, the workpiece detector 45 detects a workpiece from within the image of the first container 91.
  • the detected workpiece is simply referred to as the "detected workpiece."
  • In step S105, the determiner 42 executes a pickup position plan.
  • the determiner 42 determines the target workpiece W to be picked, and determines the relative position and orientation of the hand 14 with respect to the target workpiece W. By determining the relative position and orientation of the hand 14, the position candidate specified by the corresponding place candidate is determined as the place target position. Furthermore, in the pickup position plan, a target path of the robot arm 12 and the hand 14 for picking up the target workpiece W is also generated.
  • the position planning determines the target workpiece W, the relative position and orientation of the hand 14, the target placement position, the target path for picking, and the target path for placing.
  • the operation controller 43 performs the picking and placing operations by moving the robot arm 12 and the hand 14 along the target paths.
  • the operation controller 43 controls the robot arm 12 and the hand 14 so that the target workpiece W is placed at the target placement position. Specifically, the operation controller 43 moves the robot arm 12 and the hand 14 along the target path for placing, and moves the target workpiece W to the target placement position. When the target workpiece W reaches the target placement position, the operation controller 43 releases the suction by the suction device 15. In this way, the placement of the target workpiece W at the target placement position is completed. This completes the pick-and-place process.
  • when the main control device 3 completes the pick-and-place process for one target workpiece W, it executes the pick-and-place process for another target workpiece W. By repeating this type of control, the main control device 3 sequentially transfers the target workpieces W in the first container 91 into the second container 92.
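  • The overall flow of steps S101 to S105 followed by the robot operation could be summarized as in the following hypothetical sketch, in which system and its methods are illustrative wrappers rather than an actual API.
```python
def pick_and_place_loop(system):
    """Top-level flow corresponding to FIG. 9: place photography, place position planning,
    picking photography, workpiece detection, pickup position planning, then the robot
    operation (picking followed by placing)."""
    while True:
        place_image = system.capture_place_image()                    # S101: place photography
        place_candidates = system.plan_place_positions(place_image)   # S102: place position plan
        pick_image = system.capture_pick_image()                      # S103: picking photography
        detected = system.detect_workpieces(pick_image)               # S104: workpiece detection
        if not detected:
            break                                                     # nothing left to transfer
        plan = system.plan_pickup(detected, place_candidates)         # S105: pickup position plan
        if plan is None:
            break                                                     # no feasible pick found
        system.execute_picking(plan)                                  # picking operation
        system.execute_placing(plan)                                  # placing operation
```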
  • In step S201, the height detector 46 creates a height map of the storage space, i.e., the interior of the second container 92.
  • In step S202, the searcher 41 searches for position candidates for the place position.
  • the searcher 41 creates multiple position candidates for the workpiece within the second container 92.
  • In step S203, the searcher 41 assigns priorities to the position candidates. As described above, the searcher 41 assigns a priority to each of the multiple position candidates, for example, from the perspective of storage efficiency or storage stability.
  • In step S204, the searcher 41 reads out one relative position and orientation from the multiple relative positions and orientations of the hand 14 stored in the storage device 32.
  • In step S205, the searcher 41 reads out the position candidate with the highest priority from the multiple position candidates.
  • In step S206, the path generator 47 attempts to generate a target path for the robot arm 12 and the hand 14 to place the workpiece at the read-out position candidate in the read-out relative position and orientation. In step S207, the searcher 41 determines whether or not a target path has been generated. If a target path has been generated, the searcher 41 sets the current relative position and orientation as a candidate for the relative position and orientation of the hand 14. In step S208, the searcher 41 registers the combination of the position candidate and the candidate for the relative position and orientation of the hand 14 as a place candidate.
  • In step S209, the searcher 41 determines whether or not it has attempted to generate a target path with the current relative position and orientation for all position candidates. If it has not attempted to generate a target path for all position candidates, the searcher 41 returns to step S205 and reads out the position candidate with the next highest priority from the multiple position candidates.
  • the path generator 47 attempts to generate a target path for the new position candidate (step S206), and the searcher 41 determines whether or not a target path has been generated (step S207).
  • the searcher 41 searches for position candidates in which the workpiece can be appropriately placed in the current relative position and orientation, in descending order of priority.
  • when the searcher 41 finds a position candidate in which the workpiece can be appropriately placed, it associates the position candidate with the current relative position and orientation.
  • In step S210, the searcher 41 determines whether the search for place candidates has been completed for all relative positions and orientations. If the search for place candidates has not been completed for all relative positions and orientations, the searcher 41 returns to step S204 and searches for place candidates for another relative position and orientation.
  • if the search for place candidates has been completed for all relative positions and orientations, the searcher 41 ends the place position planning.
  • By this type of place position planning, multiple place candidates are created, which are combinations of position candidates and candidates for the relative position and orientation of the hand 14. In other words, a set of place candidates is created by the place position planning.
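  • As an illustrative sketch of this place position planning subroutine (steps S204 to S210), the nested search could look as follows; try_place_path stands in for the path generation and interference check, and registering one position per hand orientation follows the description above. All names are hypothetical.
```python
def plan_place_positions(hand_orientations, position_candidates, try_place_path):
    """Sketch of the place position planning subroutine (S204-S210): for each stored
    relative position/orientation of the hand, scan the prioritized position candidates
    and register the first combination for which a collision-free placing path exists."""
    place_candidates = []
    for orientation in hand_orientations:                 # S204: read out one orientation
        for position in position_candidates:              # S205: highest priority first
            path = try_place_path(position, orientation)  # S206: None if interference occurs
            if path is not None:                          # S207: target path generated?
                place_candidates.append({                 # S208: register the place candidate
                    "position": position,
                    "hand_orientation": orientation,
                    "path": path,
                })
                break   # one place candidate per orientation, as described above
    return place_candidates                               # S210: all orientations processed
```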
  • In step S301, the determiner 42 reads out one candidate for the relative position and posture of the hand 14 from among the multiple candidates for the relative position and posture of the hand 14 included in the multiple place candidates. Specifically, the determiner 42 reads out the candidate for the relative position and posture of the hand 14 that is defined by the place candidate with the highest priority among the multiple place candidates.
  • In step S302, the determiner 42 associates the candidate for the relative position and orientation with each detected workpiece. Specifically, the determiner 42 determines the position and orientation of the hand 14 when the hand 14 is positioned at the candidate relative position and orientation with respect to each detected workpiece. This hand position and orientation is not a relative position and orientation with respect to the workpiece, but the position and orientation of the hand 14 in a coordinate system (e.g., a robot coordinate system) used to control the robot 1.
  • In step S303, the determiner 42 assigns a priority to each combination of a detected workpiece and a position and orientation of the hand 14. For example, the determiner 42 assigns a high priority to a combination that includes a position and orientation of the hand 14 that requires only a small amount of movement from the current position of the hand 14.
  • In step S304, the determiner 42 reads out one combination from among the multiple combinations of a detected workpiece and the hand 14. At this time, the determiner 42 reads out the combination with the highest priority.
  • In step S305, the path generator 47 generates a target path for the robot arm 12 and the hand 14 from the current position of the hand 14 to the position and posture of the hand 14 defined by the read-out combination. At this time, the path generator 47 performs an interference check of the target path and generates a target path without interference. Depending on the state of surrounding objects, the path generator 47 may not be able to generate an interference-free target path.
  • In step S306, the determiner 42 determines whether an interference-free target path has been generated. If an interference-free target path has not been generated, in step S307, the determiner 42 determines whether an attempt has been made to generate target paths for all of the multiple combinations of a detected workpiece and the hand 14. If target path generation has not been attempted for all combinations, the determiner 42 returns to step S304 and reads out the combination with the next highest priority from the multiple combinations. The path generator 47 then attempts to generate a target path for the new combination (step S305), and the determiner 42 determines whether a target path has been generated (step S306).
  • the determiner 42 searches, in descending order of priority, for detected workpieces for which an appropriate target path can be generated with the candidate for the relative position and posture of the hand 14 read out in step S301.
  • if an interference-free target path has been generated, the determiner 42 determines that target path as the target path at the time of picking in step S308.
  • the determiner 42 determines the detected workpiece at the time the appropriate target path was generated as the target workpiece W, and determines the candidate for the relative position and posture of the hand 14 at that time as the relative position and posture of the hand 14 at the time of picking.
  • the determiner 42 then ends the pickup position planning. Furthermore, by determining the relative position and posture of the hand 14 for picking, the position candidate specified by the corresponding place candidate is determined as the place target position.
  • In step S309, the determiner 42 determines whether the search for detected workpieces that can be appropriately picked has been completed for the candidates for the relative position and orientation of the hand 14 defined in all place candidates. If the search for detected workpieces has not been completed for the candidates defined in all place candidates, the determiner 42 returns to step S301 and reads out the candidate for the relative position and orientation of the hand 14 defined in the place candidate with the next highest priority among the multiple place candidates. Then, the processing from step S302 onwards is repeated for the newly read candidate for the relative position and orientation. In other words, the determiner 42 searches for a detected workpiece for which an appropriate target path can be generated with the changed candidate for the relative position and orientation of the hand 14.
  • if the search has been completed for all place candidates, the determiner 42 ends the pick-and-place process in step S310.
  • the determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked from among the candidates for the relative position and orientation of the hand 14 included in the place candidates. Furthermore, the determiner 42 determines the target workpiece W from among the multiple workpieces.
  • the target placement position of the target workpiece W is also determined by determining the relative position and posture of the hand 14 during picking.
  • the target paths for each place candidate have already been created in the place position plan. Therefore, once the target placement position is determined, the target path for the placing operation is also determined accordingly.
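  • The pickup position planning subroutine (steps S301 to S310) could be sketched as follows; try_pick_path stands in for the path generation with interference check, the data layout is hypothetical, and the ordering of detected workpieces by hand movement is assumed to be handled by the caller.
```python
def plan_pickup(place_candidates, detected_workpieces, try_pick_path):
    """Sketch of the pickup position planning subroutine (S301-S310): go through the place
    candidates in priority order and, for each candidate hand orientation, try the detected
    workpieces until an interference-free picking path can be generated."""
    for place in place_candidates:                         # S301: highest-priority place candidate
        orientation = place["hand_orientation"]
        # S302-S304: pair the orientation with each detected workpiece, highest priority first.
        for workpiece in detected_workpieces:
            path = try_pick_path(workpiece, orientation)   # S305: None if interference occurs
            if path is not None:                           # S306: interference-free path found
                return {                                   # S308: pick pose and target determined
                    "target_workpiece": workpiece,
                    "hand_orientation": orientation,
                    "pick_path": path,
                    "place_position": place["position"],   # place target position follows from
                    "place_path": place["path"],           # the corresponding place candidate
                }
    return None                                            # S310: no feasible pick found
```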
  • FIG. 12 is a timing chart of the place position plan, the pick-up position plan, and the robot operation.
  • For the place position plan, first, place photography is performed (S101). This results in an image of the inside of the second container 92 being acquired. Next, the searcher 41 performs the place position plan based on the image of the inside of the second container 92 (S102). This results in multiple place candidates being obtained.
  • the operation controller 43 causes the robot 1 to pick up the target workpiece W in the relative position and posture of the hand 14 determined by the pickup position plan.
  • Next, a placing operation is executed.
  • the operation controller 43 operates the robot 1 according to the target paths determined by the place position plan and the pickup position plan. As a result, the target workpiece W is placed at the target placement position.
  • place photography can be performed in parallel with picking photography, workpiece detection, pickup position planning, or the picking operation.
  • Place position planning can be performed in parallel with picking photography, workpiece detection, pickup position planning, the picking operation, or the placing operation.
  • Picking photography can be performed in parallel with place photography, place position planning, or the placing operation.
  • Pickup position planning can be performed in parallel with place photography, the picking operation, or the placing operation.
  • the place position plan and the pickup position plan for the next pick-and-place process are executed in parallel with the robot operation.
  • the place photography is executed in parallel with the picking operation in the previous pick-and-place process. Since the situation inside the second container 92 does not change during the picking operation, the next place photography can be executed during the picking operation.
  • the picking photography is executed in parallel with the placing operation in the previous pick-and-place process. Since the situation inside the first container 91 does not change during the placing operation, the picking photography can be executed.
  • Each of the place position plan and the pickup position plan is executed in parallel with the picking operation and the placing operation in the previous pick-and-place process. Since each of the place position plan and the pickup position plan is executed based on an image, each of them can be executed during the robot operation.
  • the placement position planning is performed based on an image captured before the completion of the placement operation of the previous pick-and-place process.
  • the placement position planning is performed based on an image of the state in which the workpiece has not yet been placed in the previous placement operation. Therefore, in the placement position planning, the placement target position of the previous pick-and-place process is excluded and placement candidates are found.
  • FIG. 13 is a flowchart of position planning according to the modified example.
  • the place position planning is performed based on the image captured by the place photography before the completion of the previous placing operation, so in the place position planning, the place candidates are found excluding the previous place target position.
  • the workpiece actually placed in the previous placing operation may be displaced from the placing target position.
  • the existing workpiece may move due to the previous placing operation.
  • the nth place photography (n is a natural number of 3 or more) is performed after the (n-2)th placing operation and before the (n-1)th placing operation, and based on a comparison between the image captured by the nth place photography and the image captured by the (n-1)th place photography, it is determined whether or not the place target position of the (n-1)th placing operation needs to be corrected.
  • at that time, the (n-1)th placing operation has not yet been performed, so the place target position of the (n-1)th placing operation can be corrected.
  • steps S101, S102, S103, S104, and S105 in FIG. 13 are similar to the processes in steps S101, S102, S103, S104, and S105 in FIG. 9.
  • In step S101, the photography controller 44 causes the second camera 51B to perform place photography.
  • In step S401, the searcher 41 determines the difference between the image of the current, i.e., nth, place photography and the image of the previous, i.e., (n-1)th, place photography. Specifically, the image difference is determined based on the height maps created by the height detector 46.
  • the searcher 41 detects the difference between the image of the nth place photography and the image of the (n-1)th place photography as the difference between the height map of the nth place photography and the height map of the (n-1)th place photography.
  • the (n-1)th place photography is performed before the (n-2)th placing operation.
  • the nth place photography is performed after the (n-2)th placing operation and before the (n-1)th placing operation. Therefore, the difference in the height maps corresponds to the change in the situation inside the second container 92 due to the (n-2)th placing operation.
  • In step S402, the searcher 41 determines whether the place target position of the previous, i.e., (n-1)th, placing operation needs to be corrected based on the image difference.
  • at this point, the (n-1)th placing operation has not yet been performed.
  • the position of the workpiece placed in the (n-2)th placing operation can be determined from the difference between the image captured by the nth place photography and the image captured by the (n-1)th place photography.
  • the movement of other workpieces due to the influence of the (n-2)th placing operation can also be determined from the difference between the image captured by the nth place photography and the image captured by the (n-1)th place photography.
  • if the workpiece placed in the (n-2)th placing operation is displaced and overlaps the place target position of the (n-1)th placing operation, the searcher 41 determines that the place target position of the (n-1)th placing operation needs to be corrected. Alternatively, if another workpiece has entered the place target position of the (n-1)th placing operation, the searcher 41 determines that the place target position of the (n-1)th placing operation needs to be corrected.
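  • As an illustrative sketch (not from the disclosure), the need for correction could be judged by differencing the two height maps and checking whether anything now protrudes into the region of the planned place target position; the threshold, region format, and function name are assumptions.
```python
import numpy as np

def needs_place_correction(height_map_n, height_map_prev, planned_region, change_tol=0.005):
    """Compare the height map of the nth place photography with that of the (n-1)th one and
    report whether the planned (n-1)th place target position must be corrected because the
    (n-2)th placing operation changed the surface inside that region."""
    diff = np.abs(height_map_n - height_map_prev)   # change caused by the (n-2)th placing operation
    r0, r1, c0, c1 = planned_region                 # grid cells covered by the planned placement
    # If anything inside the planned region rose or fell noticeably, a displaced or shifted
    # workpiece now affects it, so the place target position should be corrected.
    return bool(diff[r0:r1, c0:c1].max() > change_tol)
```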
  • If the searcher 41 determines that the (n-1)th place target position does not need to be corrected, it proceeds to step S102 and executes the current, i.e., nth, place position plan.
  • the place position plan is as described above.
  • if the searcher 41 determines that the (n-1)th place target position needs to be corrected, then in step S403, it executes a place position correction to correct the (n-1)th place target position.
  • the operation controller 43 executes the (n-1)th placing operation based on the corrected place target position.
  • After the place position correction, the searcher 41 executes the nth place position plan in step S102.
  • Steps S401, S402, and S403 are not executed in the first and second position planning.
  • Step S401 is executed from the third position planning onwards.
  • Figure 14 is a flowchart of the place position correction subroutine.
  • In step S501, the searcher 41 searches for position candidates for the place position.
  • the searcher 41 creates multiple position candidates for the workpiece in the second container 92.
  • the searcher 41 searches for the position candidates using the height map created in step S401.
  • In step S502, the searcher 41 assigns priorities to the position candidates.
  • Steps S501 and S502 are the same processes as steps S202 and S203, respectively. However, these processes are for correcting the previous, i.e., (n-1)th, place target position, and are based on the image captured by the nth place photography.
  • In step S503, the searcher 41 reads out the position candidate with the highest priority from among the multiple position candidates. This process is the same as step S205.
  • In step S504, the path generator 47 generates a target path for placing the workpiece at the read-out position candidate.
  • the path generator 47 generates a target path for placing the workpiece at the position candidate at the already determined relative position and orientation of the hand 14.
  • the path generator 47 performs an interference check of the target path to confirm whether or not there is interference. In other words, the path generator 47 generates a target path without interference. If there is interference, the path generator 47 does not generate a target path.
  • In step S505, the searcher 41 determines whether or not a target path has been generated. This process is the same as step S207. If a target path has been generated, in step S506, the searcher 41 determines the position candidate as the previous, i.e., (n-1)th, place target position, and sets the generated target path as the target path of the previous, i.e., (n-1)th, placing operation.
  • In step S507, the searcher 41 determines whether or not an attempt has been made to generate a target path for all position candidates. This process is the same as step S209. If an attempt has not been made to generate a target path for all position candidates, the searcher 41 returns to step S503 and reads out the position candidate with the next highest priority from the multiple position candidates. The path generator 47 then attempts to generate a target path for the new position candidate (step S504), and the searcher 41 determines whether or not a target path has been generated (step S505).
  • searcher 41 ends the pick-and-place process in step S508.
  • In this way, the searcher 41 searches for a new place target position from among the position candidates in descending order of priority, without changing the relative position and orientation that has already been determined.
  • When the searcher 41 finds a position candidate where the workpiece can be appropriately placed, it corrects the place target position of the (n-1)th robot operation, i.e., the placing operation in the currently ongoing robot operation, to the new position candidate.
  • This place position correction is completed by the time the placing operation in the currently ongoing robot operation is started. After the place position correction, the place position plan (S102) for the robot operation following the currently ongoing robot operation is executed. A sketch of this priority-ordered correction loop is given below.
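The correction subroutine just described reduces to a priority-ordered retry loop: keep the already-decided relative position and orientation of the hand 14 fixed, walk the position candidates from highest to lowest priority, and accept the first candidate for which a collision-free target path can be generated. The following Python sketch illustrates that loop under assumed data structures; `PlaceCandidate`, `plan_path`, and the returned path object are hypothetical stand-ins, not part of the patent disclosure.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Position = Tuple[float, float, float]


@dataclass
class PlaceCandidate:
    position: Position   # candidate place position inside the second container
    priority: float      # larger value = tried earlier


def correct_place_position(
    candidates: List[PlaceCandidate],
    hand_pose: dict,
    plan_path: Callable[[Position, dict], Optional[list]],
) -> Optional[Tuple[PlaceCandidate, list]]:
    """Re-plan the (n-1)th place target position while keeping the hand pose fixed.

    plan_path(position, hand_pose) is assumed to return a collision-free
    target path (steps S504/S505) or None when no such path exists.
    """
    for cand in sorted(candidates, key=lambda c: c.priority, reverse=True):  # S503
        path = plan_path(cand.position, hand_pose)                           # S504
        if path is not None:                                                 # S505
            return cand, path  # S506: corrected target position and its target path
    return None  # all candidates tried without success (S508)
```

If the function returns None, every candidate has been tried without success and the caller would end the pick-and-place process, mirroring step S508.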
  • FIG. 15 is a timing chart of the place position planning, pickup position planning, and robot operation in the modified example.
  • As shown in FIG. 15, a third place photography is performed in parallel with the second picking operation.
  • The searcher 41 determines the difference between the image obtained by the third place photography and the image obtained by the second place photography (S401), and determines whether or not the place target position of the second place operation needs to be corrected (S402).
  • The second place photography is performed before the first place operation, whereas the third place photography is performed after the first place operation. Therefore, by comparing the image obtained by the third place photography with the image obtained by the second place photography, the change in the situation inside the second container 92 caused by the first place operation can be determined.
  • If the searcher 41 determines that the second place target position needs to be corrected, it executes the place position correction (S403).
  • The second place target position is corrected by the place position correction, and a target path is generated accordingly.
  • The place position correction is completed before the second place operation.
  • The operation controller 43 executes the second place operation according to the corrected place target position and target path.
  • The searcher 41 then executes the third place position plan.
  • In this way, the next place position plan can be executed based on an image of the storage space obtained before the placing operation is completed. That is, the nth place photography is performed before the (n-1)th placing operation, and the need to correct the (n-1)th place target position is determined using the image from the nth place photography. If correction of the place target position is necessary, the place target position is corrected to a position where the workpiece can be appropriately placed with the current relative position and orientation of the hand 14. This allows the workpiece to be placed appropriately while shortening the cycle time of the pick-and-place process.
  • As described above, priorities are assigned to the multiple position candidates, and it is determined whether or not the workpiece can be picked, starting from the relative position and posture of the hand 14 that corresponds to the position candidate with the highest priority. Therefore, the relative position and posture of the hand 14 at the time of picking can be made to correspond to the highest-priority position candidate as far as possible. As a result, the workpiece can be placed at a position candidate with a relatively high priority.
  • In addition, the target workpiece W to be picked is determined from among multiple workpieces at the same time. That is, multiple workpieces can be tested to see whether one of the multiple candidates for the relative position and orientation of the hand 14 at the time of placing allows a workpiece to be picked properly. This widens the options available when determining whether a workpiece can be picked using a candidate for the relative position and orientation of the hand 14; a sketch of this combined determination follows below.
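Because the target workpiece and the hand pose are decided together, the determination can be read as a nested, priority-ordered search: for each place candidate in priority order, every detected workpiece is tested to see whether the corresponding hand pose yields an interference-free pick. The sketch below illustrates this reading; the dictionary layout and the `pickable` predicate are assumptions for illustration, not the actual implementation.

```python
from typing import Callable, List, Optional, Tuple


def decide_pick(
    place_candidates: List[dict],           # each: {"position": ..., "hand_pose": ..., "priority": ...}
    workpieces: List[dict],                 # workpieces detected in the first container
    pickable: Callable[[dict, dict], bool], # True if this hand pose can pick this workpiece without interference
) -> Optional[Tuple[dict, dict]]:
    """Return the first feasible (place candidate, target workpiece) pairing."""
    ordered = sorted(place_candidates, key=lambda c: c["priority"], reverse=True)
    for cand in ordered:            # higher-priority place candidates are tried first
        for work in workpieces:     # the target workpiece is chosen at the same time
            if pickable(cand["hand_pose"], work):
                return cand, work
    return None                     # no workpiece can be picked with any candidate pose
```

In practice, the `pickable` predicate would encapsulate the target-path generation and interference check performed by the path generator 47.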
  • The robot system 100 incorporating the main control device 3 is not limited to one that transfers the target workpiece W from the first container 91 to the second container 92.
  • For example, the robot 1 may perform devanning.
  • The storage location of the target workpiece W is not limited to the second container 92.
  • For example, the storage location of the target workpiece W may be a conveyor, a shelf, a pallet, or the like.
  • The robot arm 12 is not limited to a vertical multi-joint type robot arm.
  • The robot arm 12 may be a horizontal multi-joint type, a parallel link type, a Cartesian coordinate type, or a polar coordinate type robot arm, for example.
  • The suction device 15 does not have to rotate around the reference axis T relative to the robot arm 12. Even if the suction device 15 cannot rotate around the reference axis T, it only needs to be positioned at a position offset in the radial direction from the reference axis T of the robot arm 12, that is, positioned eccentrically with respect to the reference axis T.
  • The suction device 15 may attract the workpiece by magnetic force or the like instead of negative pressure.
  • The holding of the workpiece by the hand 14 is not limited to suction, and may also be gripping or the like.
  • The camera 51 does not have to include a first camera 51A and a second camera 51B.
  • A single camera 51 may capture both an image of the inside of the first container 91 and an image of the inside of the second container 92.
  • The camera 51 does not have to be fixedly disposed; it may be attached to the robot arm 12 and moved by the robot arm 12.
  • The robot system 100 does not have to be equipped with a camera 51. In that case, an image may be input to the main control device 3 from outside. The method of acquiring the image does not matter. The image input from outside may be stored in the memory 33 or in the storage device 32.
  • The workpieces to be picked do not have to be piled up randomly; they may be aligned.
  • The target workpiece is not limited to a rectangular parallelepiped workpiece.
  • For example, the target workpiece may have a roughly triangular prism shape.
  • The target workpiece may be a bag filled with a powdery or granular material such as fertilizer, lime, or gravel.
  • The processing by the operation controller 43, the photography controller 44, and the work detector 45 is merely an example.
  • For example, the work detector 45 can detect a workpiece from an image using various methods.
  • The processing of the searcher 41 and the determiner 42 is also merely an example.
  • For example, the searcher 41 can search for place position candidates using any method.
  • The searcher 41 may also receive place position candidates from outside.
  • In the above example, the searcher 41 assigns a position candidate to each of the candidates for the relative position and orientation of the hand 14, but this is not a limitation.
  • For example, the searcher 41 may assign a candidate for the relative position and orientation of the hand 14 to each of the position candidates.
  • The searcher 41 may assign multiple candidates for the relative position and orientation of the hand 14 to a single position candidate to create multiple place candidates. Alternatively, the searcher 41 may assign multiple position candidates to a single candidate for the relative position and orientation of the hand 14 to create multiple place candidates. In either case, a place candidate is a combination of a position candidate and a hand pose candidate, as sketched below.
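Either assignment direction amounts to forming (position candidate, hand-pose candidate) pairs and keeping the feasible ones. The snippet below is a minimal sketch of that idea; the `feasible` predicate standing in for target-path generation with an interference check, and the dictionary layout, are assumptions.

```python
from itertools import product
from typing import Callable, List, Tuple

Position = Tuple[float, float, float]


def build_place_candidates(
    positions: List[Position],
    hand_poses: List[dict],
    feasible: Callable[[Position, dict], bool],
) -> List[dict]:
    """Pair every position candidate with every hand-pose candidate and keep
    only the combinations for which a collision-free target path exists."""
    candidates = []
    for pos, pose in product(positions, hand_poses):
        if feasible(pos, pose):  # e.g. target-path generation plus interference check
            candidates.append({"position": pos, "hand_pose": pose})
    return candidates
```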
  • In the above example, the searcher 41 determines whether the hand 14 and the robot arm 12 can hold the workpiece without interfering with other objects according to whether a target path can be generated, but this is not a limitation. For example, the searcher 41 may only determine whether the hand 14 and the robot arm 12 would interfere with other objects when the hand 14 holds the workpiece placed at the position candidate.
  • Similarly, the determiner 42 determines whether the hand 14 and the robot arm 12 can hold the workpiece without interfering with other objects according to whether a target path can be generated, but this is not a limitation. For example, the determiner 42 may only determine whether the hand 14 and the robot arm 12 would interfere with other objects when the target workpiece is held by the hand 14.
  • In the above example, the determiner 42 determines the target workpiece W at the same time as the relative position and orientation of the hand 14.
  • However, the target workpiece W may already be determined.
  • The control device does not have to be a single device such as the main control device 3.
  • The control device may include multiple separate devices.
  • For example, the work detector 45, the determiner 42, and the operation controller 43 may each be realized by separate devices.
  • The flowcharts are merely examples. Steps in the flowcharts may be changed, replaced, added, or omitted as appropriate. The order of steps may also be changed, and serial processing may be performed in parallel. For example, the place photography in step S101 in FIG. 9 may be performed consecutively with the picking photography in step S103 or the workpiece detection in step S104.
  • Conversely, position planning and robot operation may be performed sequentially rather than in parallel.
  • A processor includes transistors and other circuits, and is regarded as a circuit or a processing circuit.
  • A processor may be a programmable processor that executes a program stored in a memory.
  • A circuit, a unit, or a means is hardware that is programmed to realize the described functions, or hardware that executes the described functions.
  • The hardware may be any hardware disclosed in this specification, or any other known hardware, that is programmed to realize or that executes the described functions.
  • If the hardware is a processor that is regarded as a type of circuit, the circuit, means, or unit is a combination of hardware and software, and the software is used to configure the hardware and/or the processor.
  • the main control device 3 is a control device that controls the robot 1 that has a robot arm 12 and a hand 14 attached to the robot arm 12 and performs pick-and-place processing of a workpiece, and is equipped with a searcher 41 that searches for multiple candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at a candidate place position, and a determiner 42 that determines the relative position and orientation of the hand 14 with respect to the workpiece W to be picked from multiple candidates for the relative position and orientation of the hand 14 at the time of placement.
  • the searcher 41 searches for a plurality of the position candidates and also searches for a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the plurality of the position candidates, and the determiner 42 determines the position candidate corresponding to the candidate for the relative position and orientation of the hand 14 at the time of placement, which is determined as the relative position and orientation of the hand 14 with respect to the target workpiece W, as the placement target position of the workpiece.
  • The place target position is also determined by determining the relative position and orientation of the hand 14 at the time of picking.
  • That is, the relative position and orientation of the hand 14 at the time of picking and the place target position are determined in relation to each other.
  • The searcher 41 assigns priorities to the multiple position candidates, and the determiner 42 determines, in order of the priority of the corresponding position candidates, whether each of the multiple candidates for the relative position and orientation of the hand 14 at the time of placing can serve as the relative position and orientation of the hand 14 with respect to the target workpiece W.
  • The determiner 42 determines the target workpiece W from among a plurality of workpieces when determining the relative position and posture of the hand 14 with respect to the target workpiece W to be picked from among the plurality of candidates for the relative position and posture of the hand 14 at the time of placing.
  • According to this configuration, when there are multiple options for the target workpiece, the determiner 42 also determines the target workpiece from among the multiple workpieces when determining the relative position and orientation of the hand 14. Since the options for the target workpiece increase, the range of choices for finding a relative position and orientation of the hand 14 capable of picking the target workpiece is expanded.
  • the main control device 3 described in any one of [1] to [4] further includes an operation controller 43 that causes the robot arm 12 and the hand 14 to execute robot operations including a picking operation for picking the target work W and a placing operation for placing the target work W, and the searcher 41 and the determiner 42 search for candidates for the relative position and posture of the hand 14 at the time of placing for the next picking operation and the placing operation, and determine the relative position and posture of the hand 14 with respect to the target work W for picking, in parallel with the robot operation by the operation controller 43.
  • The robot operation including the picking operation and the placing operation, the search for candidates for the relative position and orientation of the hand 14 during placing, and the determination of the relative position and orientation of the hand 14 during picking are performed in parallel.
  • Therefore, the cycle time of the pick-and-place process can be shortened.
  • the main control device 3 described in any one of [1] to [5] further includes an imaging controller 44 that causes the first camera 51A to perform picking photography to obtain an image of the work before picking and causes the second camera 51B to perform place photography to obtain an image of the space in which the work is to be placed, the searcher 41 searches for the position candidates and the relative position and posture candidates of the hand 14 based on the image obtained by the place photography, the determiner 42 determines the relative position and posture of the hand 14 with respect to the target work W based on the image obtained by the pick photography, the imaging controller 44 causes the second camera 51B to perform the n-th place photography after the n-2th place operation (n is a natural number of 3 or more) and before the n-1th place operation, and the searcher 41 corrects the place target position of the n-1th place operation based on the image obtained by the nth place photography.
  • According to this configuration, the searcher 41 searches for the position candidates and the candidates for the relative position and orientation of the hand 14 based on the image obtained by the place photography.
  • The determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece W based on the image obtained by the picking photography.
  • The nth place photography is performed after the (n-2)th place operation and before the (n-1)th place operation.
  • The (n-1)th place photography is performed before the (n-2)th place operation.
  • The place target position of the (n-1)th place operation is therefore basically determined based on an image taken before the (n-2)th place operation, and the situation of the space where the placement is scheduled to take place may have changed by the completion of the (n-2)th place operation.
  • In contrast, the searcher 41 corrects the place target position of the (n-1)th place operation based on the image obtained by the nth place photography. This allows the (n-1)th place operation to be executed more appropriately.
  • the robot system 100 includes a robot 1 having a robot arm 12 and a hand 14 attached to the robot arm 12, and a main control device 3 described in any one of [1] to [6].
  • a control method for a robot 1 having a robot arm 12 and a hand 14 attached to the robot arm 12 and performing pick-and-place processing of a workpiece includes searching for multiple candidates for the relative position and orientation of the hand 14 with respect to a workpiece to be placed at a candidate placement position, and determining the relative position and orientation of the hand 14 with respect to a workpiece W to be picked from multiple candidates for the relative position and orientation of the hand 14 at the time of placement.
  • 100 Robot system, 1 Robot, 12 Robot arm, 14 Hand, 3 Main control device (control device), 41 Searcher, 42 Determiner, 43 Operation controller, 44 Photography controller, 51A First camera, 51B Second camera, W Target workpiece

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A main control device 3 controls a robot 1 that has a robot arm 12 and a hand 14 attached to the robot arm 12 and that performs pick-and-place processing of a workpiece. The main control device 3 comprises a searcher 41 that searches for a plurality of candidates for the relative position and posture of the hand 14 with respect to a workpiece to be placed at a placement position candidate, and a determiner 42 that determines the relative position and posture of the hand 14 with respect to a workpiece W to be picked from among a plurality of candidates for the relative position and posture of the hand at the time of placement.

Description

制御装置、ロボットシステム及びロボットの制御方法CONTROL DEVICE, ROBOT SYSTEM, AND ROBOT CONTROL METHOD

 ここに開示された技術は、制御装置、ロボットシステム及びロボットの制御方法に関する。 The technology disclosed herein relates to a control device, a robot system, and a method for controlling a robot.

 従来より、ピッキングを行うロボットを制御する制御装置が知られている。例えば、特許文献1に記載された制御装置は、ロボットアーム及びハンドを制御して、ワークをピッキングして、別の場所に搬送してプレースする。この制御装置は、ピッキング対象のワークの状態に応じて把持の態様を変更することによって、ロボットアーム及びハンドが他の物体と干渉しないようにワークの把持を実現する。 Control devices that control robots that perform picking are known from the past. For example, the control device described in Patent Document 1 controls the robot arm and hand to pick up a workpiece and transport and place it at another location. This control device changes the gripping mode depending on the state of the workpiece to be picked, thereby realizing gripping of the workpiece without the robot arm and hand interfering with other objects.

特開2004-230513号公報JP 2004-230513 A

 ワークのピッキングにおいては複数の候補の中から一の対象ワークが選択されることが多いため、一の対象ワークが選択される際にロボットアーム及びハンドと他の物体との干渉が考慮される傾向にある。 When picking a workpiece, a single target workpiece is often selected from multiple candidates, so interference between the robot arm and hand and other objects tends to be taken into consideration when selecting the target workpiece.

 しかしながら、ピッキング時の干渉だけを考慮すると、ワークの適切なプレースが困難となる虞がある。 However, if only interference during picking is considered, it may be difficult to place the workpiece appropriately.

 ここに開示された技術は、かかる点に鑑みてなされたものであり、その目的とするところは、ワークのピッキング及びプレースを適切に実行することにある。 The technology disclosed here has been developed in light of these points, and its purpose is to properly pick and place workpieces.

 本開示の制御装置は、ロボットアーム、及び、前記ロボットアームに取り付けられたハンドを有し、ワークのピックアンドプレース処理を行うロボットを制御する制御装置であって、プレースの位置候補にプレースされるワークに対する前記ハンドの相対的な位置及び姿勢候補を複数探索する探索器と、ピッキングの対象ワークに対する前記ハンドの相対的な位置及び姿勢を、プレース時の前記ハンドの相対的な位置及び姿勢の複数の候補の中から決定する決定器とを備える。 The control device disclosed herein is a control device that controls a robot that has a robot arm and a hand attached to the robot arm and performs pick-and-place processing of a workpiece, and is equipped with a searcher that searches for multiple candidates for the relative position and orientation of the hand with respect to the workpiece to be placed in a candidate placement position, and a determiner that determines the relative position and orientation of the hand with respect to the workpiece to be picked from multiple candidates for the relative position and orientation of the hand at the time of placement.

 本開示のロボットシステムは、ロボットアームと前記ロボットアームに取り付けられたハンドとを有するロボットと、前記制御装置とを備える。 The robot system disclosed herein comprises a robot having a robot arm and a hand attached to the robot arm, and the control device.

 本開示のロボットの制御方法は、ロボットアームと前記ロボットアームに取り付けられたハンドとを有し、ワークのピックアンドプレース処理を行うロボットの制御方法であって、プレースの位置候補にプレースされるワークに対する前記ハンドの相対的な位置及び姿勢の候補を複数探索することと、ピッキングの対象ワークに対する前記ハンドの相対的な位置及び姿勢を、プレース時の前記ハンドの相対的な位置及び姿勢の複数の候補の中から決定することとを含む。 The robot control method disclosed herein is a method for controlling a robot that has a robot arm and a hand attached to the robot arm and performs pick-and-place processing of a workpiece, and includes searching for multiple candidates for the relative position and orientation of the hand with respect to a workpiece to be placed at a candidate placement position, and determining the relative position and orientation of the hand with respect to the workpiece to be picked from the multiple candidates for the relative position and orientation of the hand at the time of placement.

 前記制御装置によれば、ワークのピッキング及びプレースを適切に実行することができる。 The control device allows the workpiece to be picked and placed appropriately.

 前記ロボットシステムによれば、ワークのピッキング及びプレースを適切に実行することができる。 The robot system can properly pick and place workpieces.

 前記ロボットの制御方法によれば、ワークのピッキング及びプレースを適切に実行することができる。 The robot control method described above allows the workpiece to be picked and placed appropriately.

図1は、ロボットシステムの構成を示す模式図である。FIG. 1 is a schematic diagram showing the configuration of a robot system.
図2は、ロボット制御装置及び主制御装置3の概略的なハードウェア構成を示す図である。FIG. 2 is a diagram showing a schematic hardware configuration of the robot control device and the main control device 3.
図3は、処理器の制御系統の構成を示すブロック図である。FIG. 3 is a block diagram showing the configuration of a control system of the processor.
図4は、第1コンテナ内を上方から視た図である。FIG. 4 is a view of the inside of the first container as seen from above.
図5は、第2コンテナ内を示す例示的な斜視図である。FIG. 5 is an exemplary perspective view showing the inside of the second container.
図6は、位置候補の探索を説明するための第2コンテナ内の模式的な平面図である。FIG. 6 is a schematic plan view of the inside of the second container for explaining the search for position candidates.
図7は、対象ワークを吸着するときのハンドを示す説明図である。FIG. 7 is an explanatory diagram showing the hand when picking up the target workpiece.
図8は、基準軸の方向に見た場合のハンドの相対位置姿勢を示す模式図である。FIG. 8 is a schematic diagram showing the relative position and orientation of the hand when viewed in the direction of the reference axis.
図9は、位置計画のフローチャートである。FIG. 9 is a flow chart of location planning.
図10は、プレース位置計画のサブルーチンのフローチャートである。FIG. 10 is a flow chart of the place location planning subroutine.
図11は、吸着位置計画のサブルーチンのフローチャートである。FIG. 11 is a flow chart of the pickup position planning subroutine.
図12は、プレース位置計画、吸着位置計画及びロボット動作のタイミングチャートである。FIG. 12 is a timing chart of the placement position plan, the pickup position plan, and the robot operation.
図13は、変形例に係る位置計画のフローチャートである。FIG. 13 is a flowchart of a position plan according to a modified example.
図14は、プレース位置修正のサブルーチンのフローチャートである。FIG. 14 is a flow chart of a subroutine for correcting a place position.
図15は、変形例に係るプレース位置計画、吸着位置計画及びロボット動作のタイミングチャートである。FIG. 15 is a timing chart of the placement position plan, the pickup position plan, and the robot operation according to the modified example.

 以下、例示的な実施形態を図面に基づいて詳細に説明する。図1は、ロボットシステム100の構成を示す模式図である。 Below, an exemplary embodiment will be described in detail with reference to the drawings. Figure 1 is a schematic diagram showing the configuration of a robot system 100.

 ロボットシステム100は、ロボット1と、ロボット1を制御する主制御装置3とを備えている。主制御装置3は、ロボット1を制御してロボット1に処理を実行させる。この例では、ロボット1による処理は、ピックアンドプレース処理である。例えば、ロボット1は、複数のワークがバラ積みされた第1コンテナ91内の対象ワークをピッキングして第2コンテナ92内に移送する。ロボット1は、対象ワークを第2コンテナ92内に規則的に配列する。ロボットシステム100は、ピックアンドプレース処理を繰り返すことによって、第1コンテナ91内の複数のワークを第2コンテナ92内に配列する。対象ワークは、例えば、略直方体の形状を有する。 The robot system 100 includes a robot 1 and a main control device 3 that controls the robot 1. The main control device 3 controls the robot 1 to cause the robot 1 to execute a process. In this example, the process performed by the robot 1 is a pick-and-place process. For example, the robot 1 picks up a target workpiece from a first container 91 in which multiple workpieces are piled up randomly, and transfers it into a second container 92. The robot 1 arranges the target workpieces in a regular pattern in the second container 92. The robot system 100 arranges the multiple workpieces in the first container 91 into the second container 92 by repeating the pick-and-place process. The target workpiece has, for example, a roughly rectangular parallelepiped shape.

 ロボットシステム100は、ワークの画像を取得するカメラ51を備えていてもよい。カメラ51は、第1コンテナ91内の画像を取得する第1カメラ51Aと、第2コンテナ92内の画像を取得する第2カメラ51Bとを含んでいてもよい。第1カメラ51Aは、第1コンテナ91の上方に固定的に配置されている。第1カメラ51Aは、第1コンテナ91内の画像を上方から撮影する。第1カメラ51Aは、第1コンテナ91内の複数のワークを含む画像を取得する。第2カメラ51Bは、第2コンテナ92の上方に固定的に配置されている。第2カメラ51Bは、第2コンテナ92内の画像を上方から撮影する。第2カメラ51Bは、第2コンテナ92内の複数のワークを含む画像を取得する。以下、第1カメラ51Aと第2カメラ51Bとを区別しない場合には、単にカメラ51と称する。 The robot system 100 may include a camera 51 that captures an image of the workpiece. The camera 51 may include a first camera 51A that captures an image inside the first container 91 and a second camera 51B that captures an image inside the second container 92. The first camera 51A is fixedly disposed above the first container 91. The first camera 51A captures an image inside the first container 91 from above. The first camera 51A captures an image including a plurality of workpieces inside the first container 91. The second camera 51B is fixedly disposed above the second container 92. The second camera 51B captures an image inside the second container 92 from above. The second camera 51B captures an image including a plurality of workpieces inside the second container 92. Hereinafter, when the first camera 51A and the second camera 51B are not distinguished from each other, they will be simply referred to as cameras 51.

 ここで、画像は、二次元画像又は三次元画像である。三次元画像は、点群データ、RGB-D画像、RGB画像、デプス画像又はボクセル等であり得る。つまり、カメラ51は、三次元カメラ、即ち、RGB-D画像を出力するRGB-Dカメラ、RGB画像を取得するステレオカメラ、又は、点群データを取得する三次元ビジョンセンサ等であり得る。 Here, the image is a two-dimensional image or a three-dimensional image. The three-dimensional image can be point cloud data, an RGB-D image, an RGB image, a depth image, or a voxel, etc. In other words, the camera 51 can be a three-dimensional camera, i.e., an RGB-D camera that outputs an RGB-D image, a stereo camera that acquires an RGB image, or a three-dimensional vision sensor that acquires point cloud data, etc.
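When the camera 51 is an RGB-D or stereo camera, the depth image can be back-projected into a point cloud using pinhole camera intrinsics. The snippet below is a generic, hedged illustration of that conversion; the intrinsic values and the random depth image are placeholders and not part of this disclosure.

```python
import numpy as np


def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (in metres) into an (N, 3) camera-frame point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels without a depth measurement


# Placeholder intrinsics and a synthetic depth image, for illustration only.
cloud = depth_to_point_cloud(np.random.uniform(0.5, 1.0, (480, 640)), 600.0, 600.0, 320.0, 240.0)
```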

 -ロボット-
 ロボット1は、ロボットアーム12及びロボットアーム12に取り付けられたハンド14を有する。ロボット1は、この例では、産業用ロボットである。ハンド14は、いわゆるエンドエフェクタである。ロボット1が配置される空間には、直交3軸のベース座標系が規定されている。
-robot-
The robot 1 has a robot arm 12 and a hand 14 attached to the robot arm 12. In this example, the robot 1 is an industrial robot. The hand 14 is a so-called end effector. A base coordinate system of three orthogonal axes is defined in a space in which the robot 1 is disposed.

 ロボットアーム12は、三次元状に動作するように構成されている。具体的には、ロボットアーム12は、少なくとも3自由度の並進を含む動作を行うように構成されている。この例では、ロボットアーム12は、垂直多関節型のロボットアームである。ロボットアーム12は、ベース10に支持されている。ロボットアーム12は、複数のリンクと、複数のリンクを接続する複数の関節と、複数の関節を回転駆動するサーボモータとを有している。 The robot arm 12 is configured to operate in a three-dimensional manner. Specifically, the robot arm 12 is configured to perform operations including translation with at least three degrees of freedom. In this example, the robot arm 12 is a vertical multi-joint robot arm. The robot arm 12 is supported by the base 10. The robot arm 12 has multiple links, multiple joints that connect the multiple links, and a servo motor that rotates the multiple joints.

 詳しくは、ロボットアーム12は、ベース10に連結された第1リンク12aと、第1リンク12aに連結された第2リンク12bと、第2リンク12bに連結された第3リンク12cと、第3リンク12cに連結された第4リンク12dと、第4リンク12dに連結された第5リンク12eとを有している。 In detail, the robot arm 12 has a first link 12a connected to the base 10, a second link 12b connected to the first link 12a, a third link 12c connected to the second link 12b, a fourth link 12d connected to the third link 12c, and a fifth link 12e connected to the fourth link 12d.

 詳しくは、ベース10と第1リンク12aとは、鉛直方向に延びる軸回りに回転可能な第1関節13aを介して互いに連結されている。第1リンク12aと第2リンク12bとは、水平方向に延びる軸回りに回転可能な第2関節13bを介して互いに連結されている。第2リンク12bと第3リンク12cとは、水平方向に延びる軸回りに回転可能な第3関節13cを介して互いに連結されている。第3リンク12cと第4リンク12dとは、第4リンク12dの軸心(即ち、第4リンク12dが延びる方向)回りに回転可能な第4関節13dを介して互いに連結されている。第4リンク12dと第5リンク12eとは、第4リンク12dの軸心と直交する軸回りに回転可能な第5関節13eを介して互いに連結されている。 In detail, the base 10 and the first link 12a are connected to each other via a first joint 13a that can rotate around an axis extending in the vertical direction. The first link 12a and the second link 12b are connected to each other via a second joint 13b that can rotate around an axis extending in the horizontal direction. The second link 12b and the third link 12c are connected to each other via a third joint 13c that can rotate around an axis extending in the horizontal direction. The third link 12c and the fourth link 12d are connected to each other via a fourth joint 13d that can rotate around the axis of the fourth link 12d (i.e., the direction in which the fourth link 12d extends). The fourth link 12d and the fifth link 12e are connected to each other via a fifth joint 13e that can rotate around an axis perpendicular to the axis of the fourth link 12d.

 ロボットアーム12は、各関節を回転駆動するサーボモータ16(図2参照)を有している。各サーボモータ16は、エンコーダ16a(図2参照)を有している。 The robot arm 12 has servo motors 16 (see Figure 2) that rotate each joint. Each servo motor 16 has an encoder 16a (see Figure 2).

 このように構成されたロボットアーム12は、直交三軸のそれぞれの方向への並進動作、及び、直交三軸のそれぞれの軸回りの回転動作を行うように構成されている。 The robot arm 12 thus configured is configured to perform translational movements in the directions of each of the three orthogonal axes, and rotational movements around each of the three orthogonal axes.

 ハンド14は、ロボットアーム12の所定の基準軸Tを中心とする半径方向へオフセットした位置に配置された吸着器15を含む。吸着器15は、基準軸Tに対して偏心した位置に配置されている。例えば、ハンド14は、ロボットアーム12に取り付けられるベース14aを有している。ベース14aは、基準軸Tを中心とする半径方向へ延びている。ベース14aのうち基準軸Tから遠い方の端部にプレート又はブロック等を介して吸着器15が取り付けられている。 The hand 14 includes a suction device 15 that is positioned at a position offset in the radial direction around a predetermined reference axis T of the robot arm 12. The suction device 15 is positioned eccentrically with respect to the reference axis T. For example, the hand 14 has a base 14a that is attached to the robot arm 12. The base 14a extends in the radial direction around the reference axis T. The suction device 15 is attached to the end of the base 14a that is farther from the reference axis T via a plate, block, or the like.

 さらに、吸着器15は、ロボットアーム12に対する位置を変更可能に構成されている。例えば、ハンド14は、基準軸T回りに回転可能にロボットアーム12に取り付けられる。具体的には、ハンド14は、ロボットアーム12の先端部、即ち、第5リンク12eに連結されている。第5リンク12eとハンド14とは、基準軸T回りに回転可能に第6関節13fを介して互いに連結されている。吸着器15は、基準軸T回りのハンド14の回転によってロボットアーム12に対する相対位置を変更する。 Furthermore, the suction device 15 is configured to be able to change its position relative to the robot arm 12. For example, the hand 14 is attached to the robot arm 12 so as to be rotatable around the reference axis T. Specifically, the hand 14 is connected to the tip of the robot arm 12, i.e., the fifth link 12e. The fifth link 12e and the hand 14 are connected to each other via the sixth joint 13f so as to be rotatable around the reference axis T. The suction device 15 changes its relative position with respect to the robot arm 12 by rotating the hand 14 around the reference axis T.

 ハンド14は、吸着器15によってワークを吸着する。吸着器15は、吸着パッドを有している。吸着器15には、負圧発生装置に接続されたエアホースが接続されている。エアホースには、アクチュエータ15a(図2参照)としての電磁弁が設けられている。電磁弁が制御されることによって、吸着器15による吸着及びその解除が切り替えられる。吸着器15は、対象ワークWの面を吸着する。エアホースには、圧力センサ15b(図2参照)が設けられている。圧力センサ15bは、エアホース内の負圧の大きさを検出する。 The hand 14 sucks the workpiece with a suction device 15. The suction device 15 has a suction pad. An air hose that is connected to a negative pressure generator is connected to the suction device 15. The air hose is provided with a solenoid valve that serves as an actuator 15a (see Figure 2). By controlling the solenoid valve, the suction device 15 can switch between suction and release. The suction device 15 sucks the surface of the target workpiece W. The air hose is provided with a pressure sensor 15b (see Figure 2). The pressure sensor 15b detects the magnitude of the negative pressure in the air hose.

 図2は、ロボット制御装置2及び主制御装置3の概略的なハードウェア構成を示す図である。主制御装置3は、ロボット制御装置2と信号及び情報等の送受信を行う。主制御装置3は、ロボット制御装置2を介してロボット1を制御する。主制御装置3は、指令をロボット制御装置2に出力する。ロボット制御装置2は、主制御装置3からの指令に応じて、ロボットアーム12のサーボモータ16を制御する。 FIG. 2 is a diagram showing the general hardware configuration of the robot controller 2 and the main controller 3. The main controller 3 transmits and receives signals and information to and from the robot controller 2. The main controller 3 controls the robot 1 via the robot controller 2. The main controller 3 outputs commands to the robot controller 2. The robot controller 2 controls the servo motor 16 of the robot arm 12 in response to commands from the main controller 3.

 主制御装置3には、第1カメラ51A及び第2カメラ51Bからの画像が入力される。主制御装置3は、画像に基づいて対象ワークWを検出する。それに加えて、主制御装置3は、検出された対象ワークWに基づいてロボット1の目標経路を生成し、生成された目標経路に対応する指令をロボット制御装置2に出力する。主制御装置3には、アクチュエータ15a及び圧力センサ15bが接続されている。主制御装置3は、アクチュエータ15a及び圧力センサ15bを介してハンド14による吸着を制御する。主制御装置3は、制御装置の一例である。 Images from the first camera 51A and the second camera 51B are input to the main control device 3. The main control device 3 detects the target workpiece W based on the images. In addition, the main control device 3 generates a target path for the robot 1 based on the detected target workpiece W, and outputs a command corresponding to the generated target path to the robot control device 2. An actuator 15a and a pressure sensor 15b are connected to the main control device 3. The main control device 3 controls suction by the hand 14 via the actuator 15a and the pressure sensor 15b. The main control device 3 is an example of a control device.

 ロボット制御装置2は、処理器21と記憶器22とメモリ23とを有している。 The robot control device 2 has a processor 21, a memory 22, and a memory 23.

 処理器21は、ロボット制御装置2の全体を制御する。処理器21は、各種の演算処理を行う。例えば、処理器21は、CPU(Central Processing Unit)等のプロセッサで形成されている。処理器21は、MCU(Micro Controller Unit)、MPU(Micro Processor Unit)、FPGA(Field Programmable Gate Array)、PLC(Programmable Logic Controller)、システムLSI等で形成されていてもよい。 The processor 21 controls the entire robot control device 2. The processor 21 performs various types of arithmetic processing. For example, the processor 21 is formed of a processor such as a CPU (Central Processing Unit). The processor 21 may also be formed of an MCU (Micro Controller Unit), an MPU (Micro Processor Unit), an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), a system LSI, etc.

 記憶器22は、処理器21で実行されるプログラム及び各種データを格納している。記憶器22は、不揮発性メモリ、HDD(Hard Disc Drive)又はSSD(Solid State Drive)等で形成される。メモリ23は、データ等を一時的に格納する。例えば、メモリ23は、揮発性メモリで形成される。 The memory 22 stores the programs executed by the processor 21 and various data. The memory 22 is formed of a non-volatile memory, a hard disc drive (HDD), a solid state drive (SSD), etc. The memory 23 temporarily stores data, etc. For example, the memory 23 is formed of a volatile memory.

 処理器21は、主制御装置3からの指令に基づいてサーボモータ16を駆動する。このとき、ロボット制御装置2は、エンコーダ16aの検出結果に基づいてサーボモータ16への供給電流をフィードバック制御する。 The processor 21 drives the servo motor 16 based on a command from the main control device 3. At this time, the robot control device 2 feedback controls the current supplied to the servo motor 16 based on the detection result of the encoder 16a.

 主制御装置3は、処理器31と記憶器32とメモリ33とを有している。 The main control device 3 has a processor 31, a memory 32, and a memory 33.

 処理器31は、主制御装置3の全体を制御する。処理器31は、各種の演算処理を行う。例えば、処理器31は、CPU(Central Processing Unit)等のプロセッサで形成されている。処理器31は、MCU(Micro Controller Unit)、MPU(Micro Processor Unit)、FPGA(Field Programmable Gate Array)、PLC(Programmable Logic Controller)、システムLSI等で形成されていてもよい。 The processor 31 controls the entire main control device 3. The processor 31 performs various types of arithmetic processing. For example, the processor 31 is formed of a processor such as a CPU (Central Processing Unit). The processor 31 may also be formed of an MCU (Micro Controller Unit), an MPU (Micro Processor Unit), an FPGA (Field Programmable Gate Array), a PLC (Programmable Logic Controller), a system LSI, etc.

 記憶器32は、処理器31で実行されるプログラム及び各種データを格納している。記憶器32は、不揮発性メモリ、HDD(Hard Disc Drive)又はSSD(Solid State Drive)等で形成される。 The memory 32 stores the programs and various data executed by the processor 31. The memory 32 is formed of a non-volatile memory, a HDD (Hard Disc Drive), an SSD (Solid State Drive), etc.

 例えば、記憶器32は、物体を検出する物体検出プログラム、ハンド14の位置姿勢を決定する位置姿勢決定プログラム及びロボット1の目標経路を生成する経路生成プログラム等を格納している。各種プログラムは、主制御装置3に各種機能を実現させる。 For example, the memory 32 stores an object detection program that detects objects, a position and orientation determination program that determines the position and orientation of the hand 14, and a path generation program that generates a target path for the robot 1. The various programs cause the main control device 3 to realize various functions.

 メモリ33は、データ等を一時的に格納する。例えば、メモリ33は、揮発性メモリで形成される。メモリ33は、カメラ51からの画像を記憶する。 Memory 33 temporarily stores data, etc. For example, memory 33 is formed of a volatile memory. Memory 33 stores images from camera 51.

 処理器31は、ロボット1に、ハンド14を対象ワークWまで移動させて吸着器15に対象ワークWを吸着させるピッキング動作及びハンド14に吸着された対象ワークWをプレース目標位置まで移動させて吸着器15に対象ワークWの吸着を解除させるプレース動作を実行させる。処理器31は、ピッキング動作の後にプレース動作をロボット1に実行させる。 The processor 31 causes the robot 1 to execute a picking operation in which the hand 14 moves to the target workpiece W and causes the suction device 15 to suction the target workpiece W, and a placing operation in which the target workpiece W adsorbed by the hand 14 moves to a target placement position and causes the suction device 15 to release the suction of the target workpiece W. The processor 31 causes the robot 1 to execute a placing operation after the picking operation.

 処理器31は、ピッキング動作に先立ち、ピッキング時のワークに対するハンド14の相対的な位置及び姿勢を決定する。処理器31は、プレース動作に先立ち、プレース時の対象ワークWのプレース目標位置を決定する。以下、ワークに対するハンド14の相対的な位置及び姿勢を単に「ハンド14の相対位置姿勢」と称する。 Prior to a picking operation, the processor 31 determines the relative position and orientation of the hand 14 with respect to the workpiece at the time of picking. Prior to a placing operation, the processor 31 determines the target placement position of the target workpiece W at the time of placement. Hereinafter, the relative position and orientation of the hand 14 with respect to the workpiece will be referred to simply as the "relative position and orientation of the hand 14."

 図3は、処理器31の制御系統の構成を示すブロック図である。処理器31は、記憶器32からプログラムをメモリ33に読み出して展開することによって、各種機能を実現する。詳しくは、処理器31は、収容スペースにおけるプレースの位置候補にプレースされるワークに対するハンド14の相対位置姿勢の候補を探索する探索器41と、ピッキングの対象ワークに対するハンド14の相対位置姿勢を決定する決定器42として機能する。さらに、処理器31は、対象ワークWをピッキングするピッキング動作、及び、対象ワークWをプレース位置へプレースするプレース動作を含むロボット動作をロボットアーム12及びハンド14に実行させる動作制御器43として機能する。また、処理器31は、カメラ51に画像を取得させる撮影制御器44と、画像に基づいてワークを検出するワーク検出器45と、画像に基づいて収容スペースの高さ情報を検出する高さ検出器46と、ロボットアーム12及びハンド14の目標経路を生成する経路生成器47としても機能し得る。 FIG. 3 is a block diagram showing the configuration of the control system of the processor 31. The processor 31 realizes various functions by reading out a program from the storage device 32 to the memory 33 and expanding it. In detail, the processor 31 functions as a searcher 41 that searches for candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at the candidate placement position in the storage space, and a determiner 42 that determines the relative position and orientation of the hand 14 with respect to the target workpiece to be picked. Furthermore, the processor 31 functions as an operation controller 43 that causes the robot arm 12 and the hand 14 to execute robot operations including a picking operation for picking the target workpiece W and a placing operation for placing the target workpiece W at the placement position. The processor 31 can also function as a photography controller 44 that causes the camera 51 to acquire an image, a workpiece detector 45 that detects the workpiece based on the image, a height detector 46 that detects height information of the storage space based on the image, and a path generator 47 that generates a target path for the robot arm 12 and the hand 14.

 撮影制御器44は、第1カメラ51A及び第2カメラ51Bを制御して、第1カメラ51A及び第2カメラ51Bのそれぞれに画像を取得させる。撮影制御器44は、第1コンテナ91内の画像を取得するピッキング撮影を第1カメラ51Aに実行させる。撮影制御器44は、第2コンテナ92内の画像を取得するプレース撮影を第2カメラ51Bに実行させる。撮影制御器44は、カメラ51からの画像をメモリ33に保存する。 The photography controller 44 controls the first camera 51A and the second camera 51B to capture images from each of the first camera 51A and the second camera 51B. The photography controller 44 causes the first camera 51A to perform picking photography to capture images inside the first container 91. The photography controller 44 causes the second camera 51B to perform place photography to capture images inside the second container 92. The photography controller 44 stores the images from the cameras 51 in the memory 33.

 ワーク検出器45は、画像の中からワークを画像認識によって検出する。画像には、複数のワークが含まれ得る。ワーク検出器45は、例えば、RANSAC(Random Sample Consensus)によって平面検出を行って、ワークの面(以下、「対象面S」という)を検出する。つまり、ワーク検出器45は、画像に含まれる所定の形状の平面を検出することによってワークを検出する。図4は、第1コンテナ91内を上方から視た図である。ワーク検出器45は、複数の対象面S、即ち、複数のワークを検出し得る。ワーク検出器45は、対象面Sを検出する際に、少なくとも対象面Sの中心位置、対象面Sの法線方向、対象面Sの縦横の向き及び対象面Sのサイズ(例えば、縦横サイズ)を求める。尚、対象面Sのサイズは、予め記憶器32に保存されていてもよい。 The work detector 45 detects a work from within an image by image recognition. The image may include multiple workpieces. The work detector 45 performs plane detection, for example, by RANSAC (Random Sample Consensus) to detect the surface of the workpiece (hereinafter referred to as the "target surface S"). In other words, the work detector 45 detects a workpiece by detecting a plane of a predetermined shape included in the image. FIG. 4 is a view of the inside of the first container 91 from above. The work detector 45 can detect multiple target surfaces S, i.e., multiple workpieces. When detecting the target surface S, the work detector 45 determines at least the center position of the target surface S, the normal direction of the target surface S, the vertical and horizontal orientations of the target surface S, and the size of the target surface S (e.g., the vertical and horizontal sizes). The size of the target surface S may be stored in advance in the memory 32.
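As a rough illustration of the RANSAC plane detection mentioned above, the following NumPy sketch repeatedly samples three points, fits a plane, and keeps the model with the most inliers. It is a generic version of the technique, not the work detector 45's actual implementation, and it omits the subsequent estimation of the surface centre, orientation, and size.

```python
import numpy as np


def ransac_plane(points: np.ndarray, n_iters: int = 200, tol: float = 0.003):
    """Fit a dominant plane to an (N, 3) point cloud with RANSAC.

    Returns (unit normal, d, inlier mask) for the plane n.x + d = 0.
    """
    rng = np.random.default_rng(0)
    best_mask, best_model = None, None
    for _ in range(n_iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample, try again
            continue
        normal = normal / norm
        d = -float(normal @ p1)
        dist = np.abs(points @ normal + d)
        mask = dist < tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (normal, d)
    if best_model is None:
        raise ValueError("no valid plane hypothesis found")
    return best_model[0], best_model[1], best_mask
```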

 高さ検出器46は、対象ワークWの収容先となる収容スペースの高さ情報を検出する。この例では、収容スペースは、第2コンテナ92の内部である。高さ情報は、収容スペースの各位置の高さに関する情報であり、具体的には、第2コンテナ92内の各位置の高さ情報である。第2コンテナ92内のうち底面が露出しているエリアでは、底面の各位置の高さが高さ情報となる。第2コンテナ92内のうちワークが存在するエリアでは、ワークの上面の各位置の高さが高さ情報となる。高さ検出器46は、第2コンテナ92内の画像に基づいて、収容スペースの高さ情報を検出する。高さ検出器46は、例えば、三次元画像の点群データから、離散化又は平滑化などの前処理を行った上で、収容スペースの高さ情報を求める。高さ検出器46は、収容スペースの高さ情報を高さマップとして出力する。 The height detector 46 detects height information of the storage space in which the target workpiece W is to be stored. In this example, the storage space is inside the second container 92. The height information is information about the height of each position in the storage space, specifically, the height information of each position inside the second container 92. In areas inside the second container 92 where the bottom surface is exposed, the height information is the height of each position on the bottom surface. In areas inside the second container 92 where the workpiece is present, the height information is the height of each position on the top surface of the workpiece. The height detector 46 detects the height information of the storage space based on an image inside the second container 92. The height detector 46 obtains the height information of the storage space, for example, by performing preprocessing such as discretization or smoothing from the point cloud data of the three-dimensional image. The height detector 46 outputs the height information of the storage space as a height map.
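The height map described above can be obtained by discretising the container area into a grid and keeping, for each cell, the highest measured point; smoothing or interpolation of empty cells would follow. The sketch below assumes a metric point cloud expressed in the container frame and an arbitrary 5 mm cell size; both are illustrative assumptions.

```python
import numpy as np


def height_map(points: np.ndarray, x_range, y_range, cell: float = 0.005) -> np.ndarray:
    """Build a top-view height map (maximum z per cell) from an (N, 3) point cloud."""
    nx = int(np.ceil((x_range[1] - x_range[0]) / cell))
    ny = int(np.ceil((y_range[1] - y_range[0]) / cell))
    hmap = np.full((ny, nx), -np.inf)
    ix = np.clip(((points[:, 0] - x_range[0]) / cell).astype(int), 0, nx - 1)
    iy = np.clip(((points[:, 1] - y_range[0]) / cell).astype(int), 0, ny - 1)
    np.maximum.at(hmap, (iy, ix), points[:, 2])  # keep the highest point in each cell
    hmap[np.isneginf(hmap)] = np.nan             # cells with no measurement
    return hmap
```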

 経路生成器47は、ロボットアーム12及びハンド14の目標経路を生成する。目標経路は、ロボットアーム12及びハンド14の目標位置及び目標姿勢の順序集合である。すなわち、目標経路は、ロボットアーム12及びハンド14の目標位置及び目標姿勢の経時的な変化である。目標経路は、ロボットアーム12及びハンド14の目標位置及び目標姿勢の経時的な変化を実現する、ロボットアーム12の各関節(第6関節13fを含む)の目標回転角度の順序集合であってもよい。目標経路は、ロボットアーム12及びハンド14の位置及び姿勢を一義的に規定する。目標経路は、基準軸T回りの吸着器15の目標回転位置も規定する。例えば、経路生成器47は、ピッキング動作のためのロボットアーム12及びハンド14の目標経路を生成する。経路生成器47は、プレース動作のためのロボットアーム12及びハンド14の目標経路を生成する。 The path generator 47 generates a target path for the robot arm 12 and the hand 14. The target path is an ordered set of target positions and target postures of the robot arm 12 and the hand 14. That is, the target path is a change in the target position and target posture of the robot arm 12 and the hand 14 over time. The target path may be an ordered set of target rotation angles of each joint of the robot arm 12 (including the sixth joint 13f) that realizes the change in the target position and target posture of the robot arm 12 and the hand 14 over time. The target path uniquely defines the position and posture of the robot arm 12 and the hand 14. The target path also defines the target rotation position of the suction device 15 around the reference axis T. For example, the path generator 47 generates a target path for the robot arm 12 and the hand 14 for a picking operation. The path generator 47 generates a target path for the robot arm 12 and the hand 14 for a placing operation.

 例えば、経路生成器47は、PRM(Probabilistic Roadmap Method)又はRRT(Rapidly-Exploring Random Tree)等によって経路計画を実行する。経路生成器47は、これらの手法において干渉チェック、即ち、ロボットアーム12及びハンド14が他の物体と干渉しないかの確認も実行する。干渉チェックとしては、例えば、点群データ又はCADデータを用いた干渉チェック等が行われる。 For example, the path generator 47 executes path planning using the Probabilistic Roadmap Method (PRM) or the Rapidly-Exploring Random Tree (RRT), etc. In these methods, the path generator 47 also executes an interference check, i.e., checks whether the robot arm 12 and the hand 14 interfere with other objects. The interference check may be, for example, an interference check using point cloud data or CAD data.
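Whatever planner is used (PRM, RRT, or otherwise), the contract implied here is that a target path is returned only if every configuration along it passes the interference check. The following sketch shows that contract in its simplest form, with straight-line joint interpolation and a caller-supplied collision checker; it is an illustration of the idea rather than the planner actually employed.

```python
from typing import Callable, Optional
import numpy as np


def straight_line_path(q_start: np.ndarray, q_goal: np.ndarray,
                       in_collision: Callable[[np.ndarray], bool],
                       steps: int = 50) -> Optional[np.ndarray]:
    """Interpolate joint angles and reject the path if any waypoint collides."""
    path = np.linspace(q_start, q_goal, steps)   # (steps, dof) joint-space waypoints
    for q in path:
        if in_collision(q):   # e.g. a point-cloud or CAD based interference check
            return None       # no collision-free path: the candidate is discarded
    return path
```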

 探索器41は、収容スペースにおけるプレースの位置候補にプレースされるワークに対するハンド14の相対位置姿勢の候補を探索する。探索器41は、ハンド14の相対位置姿勢の複数の候補を探索する。ハンド14の相対位置姿勢の複数の候補のそれぞれに対応するプレースの位置候補は、互いに同じであってもよいし、互いに異なっていてもよい。つまり、探索器41は、一の位置候補に対してそれぞれ異なる複数のハンド14の相対位置姿勢の候補を探索してもよい。あるいは、探索器41は、それぞれ異なる位置候補に対して複数のハンド14の相対位置姿勢の候補を探索してもよい。このとき、複数のハンド14の相対位置姿勢の候補は、互いに異なっていてもよいし、互いに同じであってもよい。 The searcher 41 searches for candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at a candidate place position in the storage space. The searcher 41 searches for multiple candidates for the relative position and orientation of the hand 14. The place position candidates corresponding to the multiple candidates for the relative position and orientation of the hand 14 may be the same as each other, or may be different from each other. In other words, the searcher 41 may search for multiple different candidates for the relative position and orientation of the hand 14 with respect to one candidate position. Alternatively, the searcher 41 may search for multiple candidates for the relative position and orientation of the hand 14 with respect to different candidate positions. In this case, the candidates for the relative position and orientation of the hand 14 may be different from each other, or may be the same as each other.

 具体的には、探索器41は、位置候補と、その位置候補にプレースされたワークに対するハンド14の相対位置姿勢の候補との組み合わせであるプレース候補を複数探索する。複数のプレース候補のそれぞれは、位置候補とハンド14の相対位置姿勢の候補との組み合わせが異なる。複数のプレース候補に含まれる少なくとも一部の位置候補は、互いに同じであってもよい。複数のプレース候補に含まれる少なくとも一部のハンド14の相対位置姿勢の候補は、互いに同じであってもよい。 Specifically, the searcher 41 searches for multiple place candidates, which are combinations of position candidates and candidates for the relative position and orientation of the hand 14 with respect to the workpiece placed at the position candidates. Each of the multiple place candidates has a different combination of position candidates and candidates for the relative position and orientation of the hand 14. At least some of the position candidates included in the multiple place candidates may be the same as each other. At least some of the candidates for the relative position and orientation of the hand 14 included in the multiple place candidates may be the same as each other.

 探索器41は、ハンド14の相対位置姿勢の候補だけでなく、プレースの位置候補も探索してもよい。例えば、探索器41は、収容スペースのうちワークをプレース可能なスペース(以下、「プレース可能スペース」という)を検出する。探索器41は、収容スペースにおいてワークをプレースできる平面(以下、「プレース可能面」という)Pを高さマップに基づいて検出し、プレース可能面Pの上方のスペースをプレース可能スペースとする。図5は、第2コンテナ92内を示す例示的な斜視図である。図5において、ドットを付された面がプレース可能面Pである。プレース可能面Pは、上方に露出する平面であり、高さマップにおいて高さが一様な領域である。プレース可能面Pは、第2コンテナ92の底面のうち露出する部分である。既存ワークW0の上面も露出している場合にはプレース可能面Pとなり得る。尚、複数の既存ワークW0が並び、それらの既存ワークW0の上面が上方へ露出し、且つ、それらの既存ワークW0の上面の高さが揃っている場合には、それらの既存ワークW0の上面が単一のプレース可能面Pとなる。図5の例では、2つの既存ワークW0の上面は、それぞれ別のプレース可能面Pとして扱われる。 The searcher 41 may search not only for candidates for the relative position and orientation of the hand 14, but also for candidates for the placement position. For example, the searcher 41 detects a space in the storage space where a work can be placed (hereinafter referred to as a "placeable space"). The searcher 41 detects a plane P in the storage space where a work can be placed (hereinafter referred to as a "placeable surface") based on the height map, and sets the space above the placeable surface P as the placeable space. Figure 5 is an exemplary perspective view showing the inside of the second container 92. In Figure 5, the surface marked with dots is the placeable surface P. The placeable surface P is a plane exposed upward and is an area of uniform height in the height map. The placeable surface P is an exposed portion of the bottom surface of the second container 92. If the top surface of the existing work W0 is also exposed, it can become the placeable surface P. In addition, when multiple existing workpieces W0 are lined up, the upper surfaces of the existing workpieces W0 are exposed upward, and the heights of the upper surfaces of the existing workpieces W0 are uniform, the upper surfaces of the existing workpieces W0 become a single placeable surface P. In the example of FIG. 5, the upper surfaces of the two existing workpieces W0 are each treated as a separate placeable surface P.

 探索器41は、プレース可能面Pにおいてワークの位置を変更しならが、各位置においてワークをプレースできるか否かを判定する。図6は、位置候補の探索を説明するための第2コンテナ内の模式的な平面図である。プレース可能と判定された位置は、位置候補となる。プレースできるか否かは、ワークが他の物体に干渉することなくプレースできるか否かによって判定される。このとき、探索器41は、ワークの実際の大きさ(一点鎖線参照)に所定のマージンを加えた大きさを探索用のワークの大きさ(破線参照)とする。具体的には、探索器41は、ワークの底面を所定のマージンの分だけ拡大する。ワークの大きさは、予め記憶器32に保存されていてもよい。あるいは、ワークの大きさは、ワーク検出器45がワークを検出する際に特定されてもよい。 The searcher 41 determines whether or not the work can be placed at each position while changing the position of the work on the placement surface P. FIG. 6 is a schematic plan view of the inside of the second container to explain the search for position candidates. The positions determined to be possible for placement become position candidates. Whether or not the work can be placed is determined based on whether or not the work can be placed without interfering with other objects. At this time, the searcher 41 sets the size of the work for search (see dashed line) to the actual size of the work (see dashed line) plus a predetermined margin. Specifically, the searcher 41 enlarges the bottom surface of the work by the amount of the predetermined margin. The size of the work may be stored in the memory 32 in advance. Alternatively, the size of the work may be identified when the work detector 45 detects the work.
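The feasibility test described above can be phrased directly on the height map: enlarge the workpiece footprint by the margin and require that every covered cell lies at the height of the placeable surface. The sketch below makes the simplifying assumptions that the map origin coincides with the container corner and that the footprint is axis-aligned; both are illustrative only.

```python
import numpy as np


def can_place(hmap: np.ndarray, cell: float, x: float, y: float,
              length: float, width: float, margin: float,
              surface_z: float, tol: float = 0.005) -> bool:
    """True if the margin-expanded footprint centred at (x, y) lies flat on the surface."""
    half_l = (length + 2.0 * margin) / 2.0
    half_w = (width + 2.0 * margin) / 2.0
    i0 = int(np.floor((y - half_w) / cell))
    i1 = int(np.ceil((y + half_w) / cell))
    j0 = int(np.floor((x - half_l) / cell))
    j1 = int(np.ceil((x + half_l) / cell))
    if i0 < 0 or j0 < 0 or i1 > hmap.shape[0] or j1 > hmap.shape[1]:
        return False                      # footprint sticks out of the mapped area
    patch = hmap[i0:i1, j0:j1]
    if np.isnan(patch).any():
        return False                      # unmeasured cells: be conservative
    return bool(np.all(np.abs(patch - surface_z) < tol))
```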

 こうして、探索器41は、収容スペースにおいてワークをプレースできる位置をプレースの位置候補として探索する。 In this way, the searcher 41 searches for positions in the storage space where the work can be placed as candidate placement positions.

 探索器41は、位置候補にプレースされるワークに対するハンド14の相対位置姿勢の候補を求める。 The searcher 41 finds candidates for the relative position and orientation of the hand 14 with respect to the workpiece to be placed at the candidate position.

 この例では、ハンド14の相対位置姿勢は、ワークに対する吸着器15の位置、基準軸Tの位置、基準軸Tの姿勢によって規定される。図7は、対象ワークWを吸着するときのハンド14を示す説明図である。吸着器15が基準軸Tに対して偏心しているので、ワークに対する吸着器15の位置、即ち、吸着位置が決まっても、基準軸Tは、吸着器15を中心として360度の任意の位置に配置され得る。この例では、ハンド14の相対位置姿勢のうち吸着器15の位置及び基準軸Tの姿勢は決まっている。吸着器15の吸着位置は、対象面Sの中央である。基準軸Tは、対象面Sに対して略直交する。ハンド14の相対位置姿勢のうち基準軸Tの位置は、吸着器15を中心として360度の任意の角度位置に設定され得る。 In this example, the relative position and orientation of the hand 14 is determined by the position of the suction device 15 with respect to the workpiece, the position of the reference axis T, and the orientation of the reference axis T. FIG. 7 is an explanatory diagram showing the hand 14 when suctioning the target workpiece W. Since the suction device 15 is eccentric with respect to the reference axis T, even if the position of the suction device 15 with respect to the workpiece, i.e., the suction position, is determined, the reference axis T can be positioned at any position within 360 degrees around the suction device 15. In this example, the position of the suction device 15 and the orientation of the reference axis T among the relative position and orientation of the hand 14 are determined. The suction position of the suction device 15 is the center of the target surface S. The reference axis T is approximately perpendicular to the target surface S. The position of the reference axis T among the relative position and orientation of the hand 14 can be set to any angle position within 360 degrees around the suction device 15.

 このように、基準軸Tの位置が任意に設定され得るために、ワークをプレースするときにハンド14又はロボットアーム12が他の物体に干渉する場合がある。特に、収容スペースのうち角を含むスペースにワークがプレースされる場合には、角を形成する第2コンテナ92の側壁又は既存ワークW0の側面等にハンド14又はロボットアーム12が干渉する虞がある。そのような位置候補にワークがプレースされる場合には、基準軸Tが第2コンテナ92の側壁等から離れるようなハンド14の相対位置姿勢が適切である。 In this way, since the position of the reference axis T can be set arbitrarily, the hand 14 or robot arm 12 may interfere with other objects when placing the workpiece. In particular, when the workpiece is placed in a space that includes a corner within the storage space, there is a risk that the hand 14 or robot arm 12 may interfere with the side wall of the second container 92 that forms the corner or the side surface of the existing workpiece W0. When the workpiece is placed in such a candidate position, the appropriate relative position and orientation of the hand 14 is one in which the reference axis T is away from the side wall of the second container 92, etc.

 例えば、記憶器32には、吸着器15を中心とする基準軸Tの角度位置が異なる複数のハンド14の相対位置姿勢が保存されている。図8は、基準軸Tの方向に見た場合のハンド14の相対位置姿勢を示す模式図である。例えば、図8に示すように、吸着器15を中心とする基準軸Tの角度位置が30度間隔で異なる12種類のハンド14の相対位置姿勢が記憶器32に保存されていてもよい。探索器41は、ハンド14の複数の相対位置姿勢の1つを複数の位置候補の1つに組み合わせてプレース候補を作成する。 For example, the memory 32 stores the relative positions and orientations of multiple hands 14 with different angular positions of the reference axis T centered on the suction device 15. Figure 8 is a schematic diagram showing the relative positions and orientations of the hand 14 when viewed in the direction of the reference axis T. For example, as shown in Figure 8, the memory 32 may store the relative positions and orientations of 12 types of hands 14 with different angular positions of the reference axis T centered on the suction device 15 at 30 degree intervals. The searcher 41 creates a place candidate by combining one of the multiple relative positions and orientations of the hand 14 with one of multiple position candidates.
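The stored hand poses can be thought of as parameterised by the angular position of the reference axis T around the suction point. The sketch below generates such a family of candidates at 30-degree intervals; the offset distance and the dictionary layout are assumptions made for illustration.

```python
import numpy as np


def hand_pose_candidates(offset: float = 0.08, step_deg: float = 30.0):
    """Generate hand poses whose reference axis T sits at different angular
    positions around the suction point (suction point at the origin, axis
    assumed perpendicular to the workpiece surface)."""
    poses = []
    for deg in np.arange(0.0, 360.0, step_deg):
        rad = np.deg2rad(deg)
        axis_xy = (offset * np.cos(rad), offset * np.sin(rad))  # T relative to the pad
        poses.append({"axis_angle_deg": float(deg), "axis_offset_xy": axis_xy})
    return poses


candidates = hand_pose_candidates()  # 12 poses at 30-degree intervals
```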

 探索器41は、ハンド14及びロボットアーム12が他の物体に干渉することなくワークを保持できる、位置候補とハンド14の相対位置姿勢との組み合わせをプレース候補として作成する。具体的には、探索器41は、プレース候補に対応するハンド14及びロボットアーム12の目標経路を生成できるか否かを判定する。つまり、探索器41は、ハンド14及びロボットアーム12が他の物体に干渉しない目標経路の生成の可能な位置候補とハンド14の相対位置姿勢の候補との組み合わせをプレース候補とする。 The searcher 41 creates, as a place candidate, a combination of a position candidate and a relative position and orientation of the hand 14, where the hand 14 and robot arm 12 can hold a workpiece without interfering with other objects. Specifically, the searcher 41 determines whether or not a target path for the hand 14 and robot arm 12 corresponding to the place candidate can be generated. In other words, the searcher 41 sets, as a place candidate, a combination of a position candidate and a candidate relative position and orientation of the hand 14, where a target path for the hand 14 and robot arm 12 that does not interfere with other objects can be generated.

 Specifically, the path generator 47 generates a target path of the hand 14 and the robot arm 12 for moving the hand 14 to a state corresponding to the position candidate and the candidate relative position and orientation of the hand 14. Since an intermediate position through which the robot arm 12 and the hand 14 pass is set for the placing operation, the path generator 47 generates the target path from that intermediate position. At this time, the path generator 47 performs an interference check on the target path to confirm the presence or absence of interference. In other words, the path generator 47 generates an interference-free target path. If there is interference, the path generator 47 does not generate a target path. When a target path is generated, the searcher 41 adopts the combination of the position candidate and the candidate relative position and orientation of the hand 14 at that time as a place candidate.

 The searcher 41 registers the place candidate. That is, the searcher 41 stores the combination of the position candidate and the relative position and orientation of the hand 14 in the memory 33 or the storage device 32. The searcher 41 further associates the target path with the place candidate and stores it in the memory 33 or the storage device 32.

 In this way, the searcher 41 searches for a plurality of place candidates.

 Furthermore, the searcher 41 assigns priorities to the plurality of place candidates. Specifically, the searcher 41 assigns priorities to the position candidates included in the place candidates. For example, the searcher 41 assigns priorities to the plurality of position candidates from the viewpoint of workpiece storage efficiency. The searcher 41 may assign a high priority to a position candidate corresponding to a corner space in the storage space. When workpieces are stored in the second container 92, packing them into spaces that include corners is preferable from the viewpoint of storage efficiency. It is also preferable, from the same viewpoint, to bring a side portion of the workpiece, for example a side surface, into contact with the side wall of the second container 92 or the side surface of an existing workpiece W0. Therefore, the searcher 41 may also assign a high priority to a position candidate at which a side portion of the workpiece contacts another object.

 The searcher 41 may also assign priorities to the plurality of position candidates from the viewpoint of storage stability. For example, the searcher 41 may assign a higher priority to a position candidate whose height from the bottom surface of the second container 92 is lower. Placing the workpiece on the bottom surface of the second container 92 allows the workpiece to be placed more stably than placing it on top of an existing workpiece W0.

 In this way, the searcher 41 searches for a plurality of place candidates, each of which is a combination of a position candidate for the place position and a corresponding candidate relative position and orientation of the hand 14, and assigns priorities to the plurality of place candidates.
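
 One way such prioritization could be realized is a simple scoring function over the position candidates; the weights, field names, and data structures below are illustrative assumptions rather than values taken from the embodiment.

```python
def position_priority(candidate, container):
    """Score a place position candidate; higher scores mean higher priority.

    `candidate` is assumed to carry whether it touches a corner, how many of
    its sides contact walls or existing workpieces, and its height above the
    container bottom; `container` carries the inner height of the container.
    """
    score = 0.0
    if candidate["touches_corner"]:           # corner spaces pack the container densely
        score += 2.0
    score += candidate["contact_side_count"]  # sides touching walls or existing workpieces
    # Storage stability: prefer candidates closer to the container bottom.
    score += 1.0 - candidate["height"] / container["inner_height"]
    return score

def sort_by_priority(candidates, container):
    """Return the position candidates in descending order of priority."""
    return sorted(candidates, key=lambda c: position_priority(c, container), reverse=True)
```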

 The determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece to be picked from among the plurality of candidate relative positions and orientations of the hand 14 at the time of placing. Specifically, the determiner 42 determines a relative position and orientation of the hand 14 with which the hand 14 and the robot arm 12 can hold the target workpiece without interfering with another object. More specifically, the determiner 42 determines whether a target path of the hand 14 and the robot arm 12 for picking the target workpiece with the hand 14 in that relative position and orientation can be generated. That is, the determiner 42 determines, from among the plurality of candidate relative positions and orientations of the hand 14 included in the plurality of place candidates, a relative position and orientation of the hand 14 for which a target path that does not cause the hand 14 and the robot arm 12 to interfere with another object can be generated.

 In detail, the determiner 42 provisionally selects one relative position and orientation of the hand 14 from among the plurality of relative positions and orientations. The path generator 47 generates a target path of the hand 14 and the robot arm 12 for picking the target workpiece with the hand 14 in the selected relative position and orientation. Since an intermediate position through which the robot arm 12 and the hand 14 pass is set for the picking operation, the path generator 47 generates the target path from that intermediate position. At this time, the path generator 47 performs an interference check on the target path to confirm the presence or absence of interference. In other words, the path generator 47 generates an interference-free target path. If there is interference, the path generator 47 does not generate a target path. When a target path is generated, the determiner 42 determines the provisional relative position and orientation of the hand 14 as the relative position and orientation of the hand 14 at the time of picking. When a target path is not generated, the determiner 42 selects another relative position and orientation of the hand 14, and the path generator 47 attempts to generate a target path for the newly selected relative position and orientation.

 When there are a plurality of candidates for the target workpiece, that is, when the target workpiece is selected from a plurality of workpieces, the determiner 42 also determines the target workpiece from among the plurality of workpieces when determining the relative position and orientation of the hand 14. When determining, from among the plurality of relative positions and orientations of the hand 14, one with which the target workpiece can be picked, the more candidates there are for the target workpiece, the wider the range of choices for the relative position and orientation of the hand 14 becomes.

 When the relative position and orientation of the hand 14 at the time of picking is determined, the position candidate defined by the place candidate corresponding to the determined relative position and orientation is determined as the place target position. In other words, the determiner 42 determines the relative position and orientation of the hand 14 at the time of picking and, at the same time, determines the place target position. Furthermore, as described above, the determiner 42 also generates the target path for picking.

 The motion controller 43 causes the robot arm 12 and the hand 14 to execute a robot operation including a picking operation of picking the target workpiece W and a placing operation of placing the target workpiece W at the place target position. The motion controller 43 outputs commands according to the target path to the robot control device 2. Specifically, the motion controller 43 outputs commands according to the target rotation angles of the joints of the robot arm 12. As described above, the robot control device 2 drives the servo motors 16 based on the commands, so that the robot arm 12 and the hand 14 move along the target path.

 The motion controller 43 switches between suction by the suction device 15 and its release by controlling the actuator 15a of the hand 14. The motion controller 43 controls the actuator 15a to generate negative pressure in the suction device 15, thereby causing the suction device 15 to suck the workpiece. The motion controller 43 controls the actuator 15a to release the negative pressure of the suction device 15, thereby releasing the suction of the workpiece. The motion controller 43 determines completion of suction of the target workpiece W by the suction device 15 based on the detection result of the pressure sensor 15b. The motion controller 43 determines that suction is complete when the negative pressure detected by the pressure sensor 15b reaches or exceeds a predetermined threshold value.
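
 As an illustration only, the completion check described above reduces to a threshold comparison on the sensed negative pressure; the sensor interface, the threshold value, and the polling scheme below are assumptions, not details of the embodiment.

```python
import time

SUCTION_PRESSURE_THRESHOLD = -40.0  # assumed threshold [kPa]; more negative = stronger suction

def wait_for_suction(read_pressure, timeout_s=2.0, poll_s=0.01):
    """Poll an assumed pressure-sensor read function until the negative pressure
    reaches the threshold, or give up after `timeout_s` seconds."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_pressure() <= SUCTION_PRESSURE_THRESHOLD:
            return True  # suction of the target workpiece is judged complete
        time.sleep(poll_s)
    return False
```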

 Next, the operation of the robot system 100 configured as described above will be described. In the pick-and-place process of the robot system 100, position planning and the robot operation are performed in parallel. The position planning includes place position planning and suction position planning. The robot operation includes a picking operation and a placing operation. The position planning will be described with reference to the flowchart of FIG. 9. FIG. 9 is a flowchart of the position planning.

 First, in step S101, the photography controller 44 causes the second camera 51B to execute place photography. The place photography is photography of the inside of the second container 92. An image of the inside of the second container 92 is thereby acquired.

 Next, in step S102, the searcher 41 executes the place position planning. The searcher 41 searches for a plurality of place candidates. At this time, target paths of the robot arm 12 and the hand 14 for the respective place candidates are also generated.

 Meanwhile, in parallel with steps S101 and S102, in step S103 the photography controller 44 causes the first camera 51A to execute picking photography. The picking photography is photography of the inside of the first container 91. An image of the inside of the first container 91 is thereby acquired. Note that step S103 may be executed before or after step S101. However, when the place position planning uses the size of the workpiece W identified by the workpiece detector 45, steps S103 and S104 are executed before step S102.

 Next, in step S104, the workpiece detector 45 detects workpieces from the image of the inside of the first container 91. A detected workpiece is simply referred to as a "detected workpiece."

 Subsequently, in step S105, the determiner 42 executes the suction position planning. The determiner 42 determines the target workpiece W to be picked and determines the relative position and orientation of the hand 14 with respect to the target workpiece W. When the relative position and orientation of the hand 14 is determined, the position candidate defined by the corresponding place candidate is determined as the place target position. Furthermore, in the suction position planning, a target path of the robot arm 12 and the hand 14 for picking the target workpiece W is also generated.

 In this way, the position planning determines the target workpiece W, the relative position and orientation of the hand 14, the place target position, the target path for picking, and the target path for placing.

 The motion controller 43 causes the picking operation and the placing operation to be executed by operating the robot arm 12 and the hand 14 along the target paths.

 Specifically, the motion controller 43 controls the robot arm 12 and the hand 14 so that the target workpiece W is sucked with the hand 14 in the determined relative position and orientation. More specifically, the motion controller 43 moves the robot arm 12 and the hand 14 along the target path for picking, moves the suction device 15 to the target workpiece W, and causes the suction device 15 to suck the target workpiece W.

 After that, the motion controller 43 controls the robot arm 12 and the hand 14 so that the target workpiece W is placed at the place target position. Specifically, the motion controller 43 moves the robot arm 12 and the hand 14 along the target path for placing and moves the target workpiece W to the place target position. When the target workpiece W reaches the place target position, the motion controller 43 releases the suction by the suction device 15. Placement of the target workpiece W at the place target position is thus completed, and the pick-and-place process is completed.

 When the pick-and-place process for one target workpiece W is completed, the main control device 3 executes the pick-and-place process for another target workpiece W. By repeating such control, the main control device 3 sequentially transfers the target workpieces W in the first container 91 into the second container 92.

 =Place position planning=
 Next, the place position planning will be described in detail. FIG. 10 is a flowchart of a subroutine of the place position planning.

 In step S201, the height detector 46 creates a height map of the storage space, that is, of the inside of the second container 92.
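
 For illustration, such a height map could be derived from a top-down depth image of the second container; the orthographic camera model and the handling of invalid pixels below are simplifying assumptions, not the embodiment's actual method.

```python
import numpy as np

def height_map_from_depth(depth_image, camera_height, bottom_height=0.0):
    """Convert a top-down depth image [m] into a height map over the container.

    Assumes an overhead camera at `camera_height` above the container bottom,
    so surface height = camera_height - measured depth. Invalid pixels (zero
    or non-finite depth) are treated as the container bottom.
    """
    depth = np.asarray(depth_image, dtype=float)
    heights = camera_height - depth
    heights[~np.isfinite(depth) | (depth <= 0.0)] = bottom_height
    return np.clip(heights, bottom_height, None)
```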

 Next, in step S202, the searcher 41 searches for position candidates for the place position. In detail, the searcher 41 creates a plurality of position candidates for the workpiece within the second container 92.

 Next, in step S203, the searcher 41 assigns priorities to the position candidates. As described above, the searcher 41 assigns a priority to each of the plurality of position candidates, for example from the viewpoint of storage efficiency or storage stability.

 In step S204, the searcher 41 reads out one relative position and orientation from the plurality of relative positions and orientations of the hand 14 stored in the storage device 32. In step S205, the searcher 41 reads out the position candidate with the highest priority from the plurality of position candidates.

 In step S206, the path generator 47 generates a target path for placing the workpiece at the read position candidate with the hand 14 holding the workpiece in the read relative position and orientation. At this time, the path generator 47 performs an interference check on the target path and generates an interference-free target path. If there is interference, the path generator 47 does not generate a target path.

 In step S207, the searcher 41 determines whether a target path has been generated. If a target path has been generated, the searcher 41 adopts the current relative position and orientation as a candidate relative position and orientation of the hand 14. In step S208, the searcher 41 registers the combination of the position candidate and the candidate relative position and orientation of the hand 14 as a place candidate.

 On the other hand, the fact that no target path has been generated means that the workpiece cannot be appropriately placed with the current combination of the position candidate and the relative position and orientation. Therefore, when no target path has been generated, the searcher 41 determines in step S209 whether generation of a target path with the current relative position and orientation has been attempted for all of the position candidates. If generation of a target path has not been attempted for all of the position candidates, the searcher 41 returns to step S205 and reads out the position candidate with the next highest priority. The path generator 47 then attempts to generate a target path for the new position candidate (step S206), and the searcher 41 determines whether a target path has been generated (step S207).

 In this way, the searcher 41 searches, in descending order of priority, for a position candidate at which the workpiece can be appropriately placed with the current relative position and orientation. When the searcher 41 finds such a position candidate, it associates that position candidate with the current relative position and orientation.

 However, when generation of a target path has already been attempted for all of the position candidates in step S209, the plurality of position candidates include no position at which the workpiece can be appropriately placed with the current relative position and orientation. In that case, no place candidate is registered for the current relative position and orientation. The searcher 41 returns to step S204 and searches for a place candidate for another relative position and orientation. Specifically, the searcher 41 reads out another relative position and orientation from the plurality of relative positions and orientations of the hand 14 stored in the storage device 32 and repeats the processing from step S205 onward.

 After registering a place candidate in step S208, the searcher 41 determines in step S210 whether the search for place candidates has been completed for all of the relative positions and orientations. If the search has not been completed for all of the relative positions and orientations, the searcher 41 returns to step S204 and searches for a place candidate for another relative position and orientation.

 When the search for place candidates has been completed for all of the relative positions and orientations, the searcher 41 ends the place position planning.

 By such place position planning, a plurality of place candidates, each of which is a combination of a position candidate and a candidate relative position and orientation of the hand 14, are created. In other words, a set of place candidates is created by the place position planning.
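
 Summarizing steps S204 to S210, the search can be read as the nested loop sketched below. This is an illustration only: `plan_place_path` is a hypothetical stand-in for the path generator's interference-checked planning, and the dictionary fields are assumed names.

```python
def plan_place_candidates(hand_poses, position_candidates, plan_place_path):
    """Build the set of place candidates (FIG. 10, steps S204-S210).

    For each stored relative hand pose, try the position candidates in
    descending order of priority and register the first combination for
    which an interference-free target path can be generated.
    """
    place_candidates = []
    ordered_positions = sorted(position_candidates,
                               key=lambda c: c["priority"], reverse=True)
    for pose in hand_poses:                          # S204
        for position in ordered_positions:           # S205
            path = plan_place_path(pose, position)   # S206: returns None on interference
            if path is not None:                     # S207
                place_candidates.append(             # S208
                    {"position": position, "hand_pose": pose, "path": path})
                break                                # move on to the next pose (S210)
    return place_candidates
```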

 =Suction position planning=
 Next, the suction position planning will be described in detail. FIG. 11 is a flowchart of a subroutine of the suction position planning.

 In step S301, the determiner 42 reads out one candidate relative position and orientation from among the plurality of candidate relative positions and orientations of the hand 14 included in the plurality of place candidates. Specifically, the determiner 42 reads out the candidate relative position and orientation of the hand 14 defined by the place candidate with the highest priority.

 Next, in step S302, the determiner 42 associates the candidate relative position and orientation with each detected workpiece. Specifically, the determiner 42 obtains the absolute, or control-system, position and orientation of the hand 14 when the hand 14 is arranged with the candidate relative position and orientation with respect to each detected workpiece. For example, this position and orientation of the hand 14 is not relative to the workpiece but is the position and orientation of the hand 14 in a coordinate system used for controlling the robot 1 (for example, a robot coordinate system). In step S302, combinations of a detected workpiece and a position and orientation of the hand 14 (each such combination is simply referred to as a "combination of a detected workpiece and the hand 14") are created according to the number of detected workpieces.

 Subsequently, in step S303, the determiner 42 assigns a priority to each of the combinations of a detected workpiece and the hand 14. For example, the determiner 42 assigns a high priority to a combination that includes a position and orientation of the hand 14 requiring a small amount of movement from the current position of the hand 14.

 In step S304, the determiner 42 reads out one combination from among the plurality of combinations of a detected workpiece and the hand 14. At this time, the determiner 42 reads out the combination with the highest priority.

 Then, in step S305, the path generator 47 generates a target path of the robot arm 12 from the current position of the hand 14 to the position and orientation of the hand 14 defined by the read combination. At this time, the path generator 47 performs an interference check on the target path and generates an interference-free target path. Depending on the state of surrounding objects, the path generator 47 may be unable to generate an interference-free target path.

 In step S306, the determiner 42 determines whether an interference-free target path has been generated. If not, the determiner 42 determines in step S307 whether generation of a target path has been attempted for all of the combinations of a detected workpiece and the hand 14. If generation of a target path has not been attempted for all of the combinations, the determiner 42 returns to step S304 and reads out the combination with the next highest priority. The path generator 47 then attempts to generate a target path for the new combination (step S305), and the determiner 42 determines whether a target path has been generated (step S306).

 In this way, the determiner 42 searches, in descending order of priority, for a detected workpiece for which an appropriate target path can be generated with the candidate relative position and orientation of the hand 14 set in step S301. When an appropriate target path is generated, the determiner 42 determines, in step S308, that target path as the target path for picking. In other words, the determiner 42 determines the detected workpiece for which the appropriate target path was generated as the target workpiece W, and determines the candidate relative position and orientation of the hand 14 at that time as the relative position and orientation of the hand 14 at the time of picking. When the target workpiece W and the relative position and orientation of the hand 14 for picking are determined, the determiner 42 ends the suction position planning. Furthermore, when the relative position and orientation of the hand 14 for picking is determined, the position candidate defined by the corresponding place candidate is determined as the place target position.

 However, when generation of a target path has already been attempted for all of the combinations in step S307, there is no detected workpiece that can be appropriately picked with the currently set candidate relative position and orientation of the hand 14. In that case, in step S309, the determiner 42 determines whether the search for an appropriately pickable detected workpiece has been completed for the candidate relative positions and orientations of the hand 14 defined by all of the place candidates. If the search has not been completed for all of them, the determiner 42 returns to step S301 and reads out the candidate relative position and orientation of the hand 14 defined by the place candidate with the next highest priority. The processing from step S302 onward is then repeated for the newly read candidate. In other words, the determiner 42 searches for a detected workpiece for which an appropriate target path can be generated with the changed candidate relative position and orientation of the hand 14.

 On the other hand, when the search for a detected workpiece has been completed for the relative positions and orientations of the hand 14 defined by all of the place candidates, the determiner 42 ends the pick-and-place process in step S310.

 In this way, the determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked from among the candidate relative positions and orientations of the hand 14 included in the place candidates. Furthermore, the determiner 42 determines the target workpiece W from among the plurality of workpieces.

 In addition, when the relative position and orientation of the hand 14 at the time of picking is determined, the place target position of the target workpiece W is also determined. Since a target path has already been created for each place candidate in the place position planning, determining the place target position also determines the target path for the placing operation.
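
 Steps S301 to S310 can likewise be summarized as a nested search, sketched below for illustration. The helper functions `to_robot_frame`, `motion_cost`, and `plan_pick_path` are hypothetical stand-ins for frame conversion, motion-cost estimation, and interference-checked path generation, and the data structures are assumed.

```python
def plan_pick(place_candidates, detected_workpieces, current_hand_pose,
              to_robot_frame, motion_cost, plan_pick_path):
    """Determine the picking pose, the target workpiece, and the place target
    position (FIG. 11, steps S301-S310). Returns None when no combination
    admits an interference-free path."""
    ordered = sorted(place_candidates,
                     key=lambda c: c["position"]["priority"], reverse=True)
    for candidate in ordered:                                     # S301
        pose = candidate["hand_pose"]
        # S302/S303: absolute hand goal for every detected workpiece, ordered by
        # how little the hand has to move from its current position.
        pairs = sorted(((w, to_robot_frame(w, pose)) for w in detected_workpieces),
                       key=lambda p: motion_cost(current_hand_pose, p[1]))
        for workpiece, hand_goal in pairs:                        # S304
            path = plan_pick_path(current_hand_pose, hand_goal)   # S305: None if blocked
            if path is not None:                                  # S306
                return {"target_workpiece": workpiece,            # S308
                        "pick_pose": pose, "pick_path": path,
                        "place_target": candidate}
    return None                                                   # S310
```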

 Next, the parallel processing of the place position planning, the suction position planning, and the robot operation will be described with reference to FIG. 12. FIG. 12 is a timing chart of the place position planning, the suction position planning, and the robot operation.

 Regarding the place position planning, first, the place photography is executed (S101), whereby an image of the inside of the second container 92 is acquired. Subsequently, the searcher 41 executes the place position planning based on the image of the inside of the second container 92 (S102), whereby a plurality of place candidates are obtained.

 After completion of the place position planning, the gripping position planning is executed (S105). Prior to the gripping position planning, the picking photography (S103) and the workpiece detection (S104) are executed. In detail, an image of the inside of the first container 91 is acquired by the picking photography, and workpieces are detected based on that image. The gripping position planning is executed based on the image of the inside of the first container 91, the detected workpieces, and the place candidates. The gripping position planning determines the target workpiece W and the relative position and orientation of the hand 14 with respect to the target workpiece W.

 After completion of the gripping position planning, the picking operation is executed. The motion controller 43 causes the robot 1 to pick the target workpiece W with the hand 14 in the relative position and orientation determined by the gripping position planning.

 After the picking operation, the placing operation is executed. The motion controller 43 operates the robot 1 according to the target path determined by the place position planning and the gripping position planning. The target workpiece W is thereby placed at the place target position.

 In this way, one cycle of the place position planning, the gripping position planning, the picking operation, and the placing operation is completed.

 In such a pick-and-place process, some of the processes are executed in parallel with other processes. For example, the place photography can be performed in parallel with the picking photography, the workpiece detection, the gripping position planning, or the picking operation. The place position planning can be performed in parallel with the picking photography, the workpiece detection, the gripping position planning, the picking operation, or the placing operation. The picking photography can be performed in parallel with the place photography, the place position planning, or the placing operation. The gripping position planning can be performed in parallel with the place photography, the picking operation, or the placing operation.

 In this example, the place position planning and the gripping position planning for the next pick-and-place process are executed in parallel with the robot operation. Specifically, the place photography is executed in parallel with the picking operation of the immediately preceding pick-and-place process. Since the situation inside the second container 92 does not change during the picking operation, the next place photography can be executed during the picking operation. The picking photography is executed in parallel with the placing operation of the immediately preceding pick-and-place process. Since the situation inside the first container 91 does not change during the placing operation, the picking photography can be executed during it.

 Each of the place position planning and the gripping position planning is executed in parallel with the picking operation and the placing operation of the immediately preceding pick-and-place process. Since each of them is executed based on images, each can be executed while the robot is in motion.

 By executing some of the processes in parallel in this manner, the cycle time of the pick-and-place process can be shortened.
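
 As a rough illustration of this overlap (not the embodiment's actual scheduling), the next cycle's place photography and place position planning can be launched in the background while the current picking motion executes. All function arguments below are hypothetical placeholders for the controllers and planners described above.

```python
from concurrent.futures import ThreadPoolExecutor

def run_cycles(num_cycles, capture_place_image, plan_place, capture_pick_image,
               detect_workpieces, plan_pick, execute_pick, execute_place):
    """Overlap the planning of cycle n+1 with the robot motion of cycle n."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        place_candidates = plan_place(capture_place_image())  # first cycle: no overlap yet
        for _ in range(num_cycles):
            picking = plan_pick(place_candidates,
                                detect_workpieces(capture_pick_image()))
            # Next cycle's place photography/planning runs while the robot picks.
            next_place = pool.submit(lambda: plan_place(capture_place_image()))
            execute_pick(picking)
            # Next cycle's picking photography could likewise run during placing.
            execute_place(picking)
            place_candidates = next_place.result()
```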

 However, the place position planning is executed based on an image obtained by place photography performed before the placing operation of the immediately preceding pick-and-place process is completed. In other words, the place position planning is executed based on an image in which the workpiece of the preceding placing operation has not yet been placed. Therefore, in the place position planning, the place candidates are obtained with the place target position of the preceding pick-and-place process excluded.

 <Modification>
 Next, a modification of the pick-and-place process will be described. FIG. 13 is a flowchart of position planning according to the modification.

 As described above, since the place position planning is executed based on an image obtained by place photography performed before completion of the preceding placing operation, the place candidates are obtained with the preceding place target position excluded. However, the workpiece actually placed by the preceding placing operation may deviate from its place target position. Alternatively, an existing workpiece may move as a result of the preceding placing operation. Therefore, the n-th place photography (where n is a natural number of 3 or more) is executed after the (n-2)-th placing operation and before the (n-1)-th placing operation, and whether the place target position of the (n-1)-th placing operation needs to be corrected is determined based on a comparison between the image of the n-th place photography and the image of the (n-1)-th place photography. At the time of the n-th place photography, the (n-1)-th placing operation has not yet been executed, so the place target position of the (n-1)-th placing operation can still be corrected.

 Note that the processing of steps S101, S102, S103, S104, and S105 in FIG. 13 is the same as that of steps S101, S102, S103, S104, and S105 in FIG. 9.

 In detail, first, in step S101, the photography controller 44 causes the second camera 51B to execute the place photography.

 Next, in step S401, the searcher 41 determines the difference between the image of the current, that is, the n-th place photography and the image of the previous, that is, the (n-1)-th place photography. Specifically, the difference between the images is determined based on the height maps created by the height detector 46. The searcher 41 detects the difference between the two images as the difference between the height map of the n-th place photography image and the height map of the (n-1)-th place photography image. The (n-1)-th place photography is executed before the (n-2)-th placing operation, and the n-th place photography is executed after the (n-2)-th placing operation and before the (n-1)-th placing operation. Therefore, the difference between the height maps corresponds to the change in the situation inside the second container 92 caused by the (n-2)-th placing operation.

 Subsequently, in step S402, the searcher 41 determines, based on the image difference, whether the place target position of the previous, that is, the (n-1)-th placing operation needs to be corrected. Note that the (n-1)-th placing operation has not yet been executed. From the difference between the image of the n-th place photography and the image of the (n-1)-th place photography, the position of the workpiece placed by the (n-2)-th placing operation can be determined. The same difference also reveals movement of other workpieces caused by the (n-2)-th placing operation.

 For example, when the deviation between the place target position of the (n-2)-th placing operation and the position of the workpiece actually placed by that operation exceeds a predetermined range, the searcher 41 determines that the (n-1)-th place target position needs to be corrected. Alternatively, when another workpiece has entered the (n-1)-th place target position, the searcher 41 determines that the (n-1)-th place target position needs to be corrected.
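
 For illustration, the two correction criteria above could be checked against the height-map difference roughly as follows; the thresholds, the cell size, and the region representation are assumptions, not values taken from the embodiment.

```python
import numpy as np

DISPLACEMENT_TOLERANCE = 0.02  # assumed tolerated offset [m] of the placed workpiece
OCCUPANCY_HEIGHT_EPS = 0.01    # assumed height change [m] regarded as "occupied"
CELL_SIZE = 0.001              # assumed metric size of one height-map cell [m]

def needs_place_correction(height_map_prev, height_map_now,
                           last_place_region, next_place_region):
    """Decide whether the (n-1)-th place target position must be corrected.

    `height_map_prev`/`height_map_now` are the (n-1)-th and n-th height maps;
    the two regions are boolean masks for the (n-2)-th and (n-1)-th place
    target footprints.
    """
    diff = np.asarray(height_map_now) - np.asarray(height_map_prev)
    raised = diff > OCCUPANCY_HEIGHT_EPS

    # Criterion 1: where did the (n-2)-th workpiece actually land?
    if raised.any() and last_place_region.any():
        ys, xs = np.nonzero(raised)
        actual_center = np.array([ys.mean(), xs.mean()])
        ys0, xs0 = np.nonzero(last_place_region)
        planned_center = np.array([ys0.mean(), xs0.mean()])
        if np.linalg.norm(actual_center - planned_center) * CELL_SIZE > DISPLACEMENT_TOLERANCE:
            return True

    # Criterion 2: has anything intruded into the (n-1)-th place target position?
    return bool((raised & next_place_region).any())
```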

 When the searcher 41 determines that the (n-1)-th place target position does not need to be corrected, it proceeds to step S102 and executes the current, that is, the n-th place position planning. The place position planning is as described above.

 On the other hand, when the searcher 41 determines that the (n-1)-th place target position needs to be corrected, it executes place position correction in step S403 to correct the (n-1)-th place target position. The motion controller 43 then executes the (n-1)-th placing operation based on the corrected place target position. After executing the place position correction, the searcher 41 executes the n-th place position planning in step S102.

 Note that, since the place position correction corrects the (n-1)-th place target position according to the change in the second container 92 caused by the (n-2)-th placing operation, the processing of steps S401, S402, and S403 is not executed up to the second position planning. Step S401 is executed from the third position planning onward.

 Next, the place position correction will be described in detail. FIG. 14 is a flowchart of a subroutine of the place position correction.

 First, in step S501, the searcher 41 searches for position candidates for the place position. In detail, the searcher 41 creates a plurality of position candidates for the workpiece within the second container 92, using the height map created in step S401. Next, in step S502, the searcher 41 assigns priorities to the position candidates.

 Steps S501 and S502 are the same processing as steps S202 and S203, respectively. However, although these steps serve to correct the previous, that is, the (n-1)-th place target position, they are based on the image of the n-th place photography.

 After that, in step S503, the searcher 41 reads out the position candidate with the highest priority from the plurality of position candidates. This processing is the same as step S205.

 Then, in step S504, the path generator 47 generates a target path for placing the workpiece at the read position candidate. While the place position correction is being executed, the picking operation is in progress, so the relative position and orientation of the hand 14 cannot be changed. Therefore, the path generator 47 generates a target path for placing the workpiece at the position candidate with the already determined relative position and orientation of the hand 14. At this time, the path generator 47 performs an interference check on the target path to confirm the presence or absence of interference. In other words, the path generator 47 generates an interference-free target path. If there is interference, the path generator 47 does not generate a target path.

 In step S505, the searcher 41 determines whether a target path has been generated. This processing is the same as step S207. If a target path has been generated, the searcher 41, in step S506, determines the position candidate as the previous, that is, the (n-1)-th place target position, and sets the generated target path as the target path of the (n-1)-th placing operation.

 On the other hand, if no target path has been generated, the searcher 41 determines in step S507 whether generation of a target path has been attempted for all of the position candidates. This processing is the same as step S209. If generation of a target path has not been attempted for all of the position candidates, the searcher 41 returns to step S503 and reads out the position candidate with the next highest priority. The path generator 47 then attempts to generate a target path for the new position candidate (step S504), and the searcher 41 determines whether a target path has been generated (step S505).

 Note that when the searcher 41 has attempted generation of a target path for all of the position candidates, the searcher 41 ends the pick-and-place process in step S508.

 In this way, the searcher 41 searches for a new place target position from among the position candidates in descending order of priority, without changing the already determined relative position and orientation. When the searcher 41 finds a position candidate at which the workpiece can be appropriately placed, it corrects the place target position of the placing operation in the (n-1)-th robot operation, that is, the currently ongoing robot operation, to the new position candidate.
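
 Steps S503 to S508 thus reduce to the single-pose variant of the search of FIG. 10, sketched below for illustration; `plan_place_path` is again a hypothetical interference-checked planner and the field names are assumptions.

```python
def correct_place_target(position_candidates, fixed_hand_pose, plan_place_path):
    """Re-select the (n-1)-th place target position without changing the
    already decided relative hand pose (FIG. 14, steps S503-S508).

    Returns the new place target and its path, or None when no position
    candidate admits an interference-free path (the process is aborted)."""
    for position in sorted(position_candidates,
                           key=lambda c: c["priority"], reverse=True):  # S503
        path = plan_place_path(fixed_hand_pose, position)               # S504
        if path is not None:                                            # S505
            return {"place_target": position, "place_path": path}       # S506
    return None                                                         # S508
```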

 The place position correction is completed before the placing operation in the currently ongoing robot operation is started. After the place position correction, the place position planning (S102) for the robot operation following the currently ongoing one is executed.

 Such parallel processing of the position planning will be described with reference to FIG. 15. FIG. 15 is a timing chart of the place position planning, the suction position planning, and the robot operation according to the modification.

 In the example of FIG. 15, the third place photography is executed in parallel with the second picking operation. After the third place photography, the searcher 41 obtains the difference between the image of the third place photography and the image of the second place photography (S401) and determines whether the place target position of the second placing operation needs to be corrected (S402). The second place photography is executed before the first placing operation, and the third place photography is executed after the first placing operation. Therefore, comparing the image of the third place photography with that of the second place photography reveals the change in the situation inside the second container 92 caused by the first placing operation.

 When the searcher 41 determines that the second place target position needs to be corrected, it executes the place position correction (S403). The second place target position is corrected and a corresponding target path is generated. The place position correction is completed before the second placing operation. The motion controller 43 executes the second placing operation according to the corrected place target position and target path.

 After the place position correction is completed, the searcher 41 executes the third place position planning.

 In this way, when the position planning and the robot operation are performed in parallel, the next place position planning may be executed based on an image of the storage space taken before the placing operation is completed. Therefore, the n-th place photography is executed before the (n-1)-th placing operation, and whether the (n-1)-th place target position needs to be corrected is determined using the image of the n-th place photography. When correction of the place target position is necessary, the place target position is corrected to a position at which the workpiece can be appropriately placed with the current relative position and orientation of the hand 14. This makes it possible to place the workpiece appropriately while shortening the cycle time of the pick-and-place process.

 In such a pick-and-place process, a plurality of candidate relative positions and orientations of the hand 14 corresponding to the position candidates for placing the workpiece are created, and the relative position and orientation of the hand 14 at the time of picking is determined from among these candidates. As a result, the workpiece is picked with a relative position and orientation of the hand 14 that allows the workpiece to be placed at a position candidate, so the manner of picking can be decided with the placement of the workpiece taken into account in advance. Furthermore, although a plurality of candidate relative positions and orientations of the hand 14 corresponding to the place position candidates are created, the relative position and orientation at the time of picking is determined from among them, so there is no need to compute numerous picking poses from scratch. In other words, the amount of computation required to obtain the relative position and orientation of the hand 14 at the time of picking can be reduced.

 In addition, when there are a plurality of position candidates, priorities are assigned to them, and whether the workpiece can be picked is determined in order, starting from the relative position and orientation of the hand 14 corresponding to the position candidate with the highest priority. Therefore, the relative position and orientation of the hand 14 at the time of picking corresponds to a position candidate with as high a priority as possible. As a result, the workpiece can be placed at a position candidate with a relatively high priority.

 Furthermore, when the relative position and orientation of the hand 14 at the time of picking is determined, the target workpiece W is determined from among the plurality of workpieces at the same time. In other words, whether appropriate picking is possible with one of the candidate relative positions and orientations of the hand 14 at the time of placing can be tried against a plurality of workpieces. This increases the number of options when determining whether picking is possible with a given candidate relative position and orientation of the hand 14.

 《Other Embodiments》
 As described above, the foregoing embodiment has been described as an example of the technology disclosed in the present application. However, the technology of the present disclosure is not limited thereto and is also applicable to embodiments in which modifications, substitutions, additions, omissions, and the like are made as appropriate. The components described in the foregoing embodiment may also be combined to form a new embodiment. Furthermore, the components described in the accompanying drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential for solving the problem and are included merely to illustrate the technology. Therefore, the mere fact that such non-essential components appear in the accompanying drawings or the detailed description should not be taken to mean that they are essential.

 For example, the robot system 100 incorporating the main control device 3 is not limited to one that transfers the target workpiece W from the first container 91 to the second container 92. For example, the robot 1 may perform devanning. The destination of the target workpiece W is not limited to the second container 92; it may be, for example, a conveyor, a shelf, or a pallet.

 The robot arm 12 is not limited to a vertical articulated robot arm. The robot arm 12 may be, for example, a horizontal articulated, parallel-link, Cartesian-coordinate, or polar-coordinate robot arm.

 The suction device 15 does not have to be rotatable about the reference axis T relative to the robot arm 12. Even if the suction device 15 cannot rotate about the reference axis T, it suffices that it is arranged at a position offset radially from the reference axis T of the robot arm 12, that is, at a position eccentric with respect to the reference axis T. The suction device 15 may also hold the workpiece by magnetic force or the like instead of negative pressure.

 The holding of the workpiece by the hand 14 is not limited to suction and may be gripping or the like.

 The camera 51 does not have to include the first camera 51A and the second camera 51B; a single camera 51 may acquire both the image of the inside of the first container 91 and the image of the inside of the second container 92. The camera 51 also does not have to be fixedly arranged; it may be attached to the robot arm 12 and moved by the robot arm 12.

 The robot system 100 does not have to include the camera 51. In that case, images may be input to the main control device 3 from the outside, and the images may be acquired by any method. Images input from the outside may be stored in the memory 33 or in the storage device 32.

 The workpieces to be picked need not be piled up randomly; they may be aligned. The target workpiece is not limited to a rectangular parallelepiped. For example, the target workpiece may have a roughly triangular prism shape, or it may be a bag filled with powder or granular material such as fertilizer, lime, or gravel.

 The processing of the operation controller 43, the imaging controller 44, and the workpiece detector 45 is merely an example. For example, the workpiece detector 45 may detect workpieces from an image by various methods.

 The processing of the searcher 41 and the determiner 42 is also merely an example. For example, the searcher 41 may search for place position candidates by any method, or it may receive place position candidates from the outside.

 The searcher 41 assigns a position candidate to each of the candidates for the relative position and orientation of the hand 14, but the configuration is not limited to this. For example, the searcher 41 may assign a candidate for the relative position and orientation of the hand 14 to each of the position candidates.

 The searcher 41 may assign multiple candidates for the relative position and orientation of the hand 14 to a single position candidate to create multiple place candidates. Alternatively, the searcher 41 may assign multiple position candidates to a single candidate for the relative position and orientation of the hand 14 to create multiple place candidates.
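 As a minimal sketch of these two pairing strategies (the lists `position_candidates` and `hand_pose_candidates` are hypothetical inputs; how the candidates themselves are generated is not shown here), place candidates can be formed by attaching several hand poses to one position, several positions to one pose, or all combinations:

```python
from itertools import product


def pair_many_poses_to_one_position(position, hand_pose_candidates):
    # One place position candidate combined with multiple hand-pose candidates.
    return [(position, pose) for pose in hand_pose_candidates]


def pair_many_positions_to_one_pose(position_candidates, hand_pose):
    # One hand-pose candidate combined with multiple place position candidates.
    return [(position, hand_pose) for position in position_candidates]


def pair_all(position_candidates, hand_pose_candidates):
    # Exhaustive pairing; each (position, pose) pair is one place candidate.
    return list(product(position_candidates, hand_pose_candidates))
```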

 When creating the place candidates, the searcher 41 judges whether the hand 14 and the robot arm 12 can hold the workpiece without interfering with other objects based on whether a target path can be generated, but the judgment is not limited to this. For example, the searcher 41 may judge only whether the hand 14 and the robot arm 12 interfere with other objects when the hand 14 holds a workpiece placed at the position candidate.

 Likewise, when determining the relative position and orientation of the hand 14 with respect to the target workpiece W, the determiner 42 judges whether the hand 14 and the robot arm 12 can hold the workpiece without interfering with other objects based on whether a target path can be generated, but the judgment is not limited to this. For example, the determiner 42 may judge only whether the hand 14 and the robot arm 12 interfere with other objects when the hand 14 holds the target workpiece.
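 The two levels of feasibility checking mentioned above can be sketched as follows, assuming hypothetical helpers `in_collision` and `plan_path` (they stand in for the actual interference check and path generation, which are not specified here):

```python
def pick_is_feasible(workpiece, hand_pose, obstacles, in_collision, plan_path=None):
    """Feasibility check for holding a workpiece with a given relative hand pose.

    in_collision(hand_pose, workpiece, obstacles) -> bool   (assumed helper)
    plan_path(hand_pose, workpiece) -> path or None          (assumed helper)
    """
    # Simpler variant: only check interference at the holding pose.
    if in_collision(hand_pose, workpiece, obstacles):
        return False
    # Stricter variant: additionally require that a target path can be generated.
    if plan_path is not None:
        return plan_path(hand_pose, workpiece) is not None
    return True
```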

 The target workpiece W may already be determined when the determiner 42 determines the relative position and orientation of the hand 14.

 The control device does not have to be a single device such as the main control device 3; it may include a plurality of separate devices. For example, the workpiece detector 45, the determiner 42, and the operation controller 43 may each be realized by separate devices.

 The flowcharts are merely examples. Steps in the flowcharts may be changed, replaced, added, or omitted as appropriate, the order of steps may be changed, and serial processing may be performed in parallel. For example, the place imaging in step S101 of FIG. 9 may be executed consecutively with the picking imaging in step S103 or the workpiece detection in step S104.

 The position planning and the robot operation may also be executed sequentially rather than in parallel.

 The functions realized by the components described in this specification may be implemented in circuitry or processing circuitry programmed to realize those functions, including general-purpose processors, application-specific processors, integrated circuits, ASICs (Application Specific Integrated Circuits), a CPU (Central Processing Unit), conventional circuits, and/or combinations thereof. A processor includes transistors and other circuits and is regarded as circuitry or processing circuitry. The processor may be a programmed processor that executes a program stored in a memory.

 In this specification, a circuit, unit, or means is hardware that is programmed to realize, or that executes, the described functions. The hardware may be any hardware disclosed in this specification or any hardware known to be programmed to realize, or to execute, the described functions.

 When the hardware is a processor regarded as a type of circuitry, the circuit, means, or unit is a combination of the hardware and the software used to configure the hardware and/or the processor.

 The technology of the present disclosure can be summarized as follows.

 [1] The main control device 3 (control device) is a control device that controls the robot 1, which has the robot arm 12 and the hand 14 attached to the robot arm 12 and performs a pick-and-place process on workpieces, and includes the searcher 41, which searches for a plurality of candidates for the relative position and orientation of the hand 14 with respect to a workpiece to be placed at a place position candidate, and the determiner 42, which determines the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked from among the plurality of candidates for the relative position and orientation of the hand 14 at the time of placement.

 With this configuration, a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the place position candidates are first created, and the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked is then determined from among them. Since a plurality of candidates are available when the relative position and orientation of the hand 14 at the time of picking is determined, appropriate holding of the workpiece during picking is easier to achieve. Furthermore, since these candidates correspond to the place position candidates, a workpiece picked with the relative position and orientation of the hand 14 determined from among them can be placed appropriately at the position candidate. As a result, both picking and placing of the workpiece can be performed appropriately.

 [2] In the main control device 3 described in [1], the searcher 41 searches for a plurality of the position candidates and searches for a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the plurality of position candidates, and the determiner 42 determines, as the placement target position of the workpiece, the position candidate corresponding to the candidate for the relative position and orientation of the hand 14 at the time of placement that has been determined as the relative position and orientation of the hand 14 with respect to the target workpiece W.

 With this configuration, determining the relative position and orientation of the hand 14 at the time of picking also determines the placement target position; that is, the two are determined in relation to each other. As a result, both picking and placing of the workpiece can be performed appropriately.

 [3] In the main control device 3 described in [1] or [2], the searcher 41 assigns priorities to the plurality of position candidates, and the determiner 42 judges whether each of the plurality of candidates for the relative position and orientation of the hand 14 at the time of placement can serve as the relative position and orientation of the hand 14 with respect to the target workpiece W, in order of the priority of the corresponding position candidates.

 With this configuration, a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the plurality of position candidates are created, and priorities are assigned to the position candidates. Whether each candidate for the relative position and orientation of the hand 14 can serve as the relative position and orientation of the hand 14 with respect to the target workpiece W is then judged in order of the priority of the corresponding position candidates. This makes it more likely that a candidate corresponding to a relatively high-priority position candidate is selected as the relative position and orientation of the hand 14 at the time of picking. As a result, placement of the workpiece at a relatively high-priority position candidate is realized.
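 One possible way to assign such priorities, sketched below under the assumption that positions nearer a container corner should be filled first (the actual priority rule is not specified in this summary), is to rank the position candidates by their distance from a reference corner:

```python
import math


def assign_priorities(position_candidates, corner=(0.0, 0.0)):
    """Hypothetical priority rule: candidates nearer the container corner are
    filled first, so they receive higher priority values."""
    ranked = sorted(
        position_candidates,
        key=lambda p: math.hypot(p[0] - corner[0], p[1] - corner[1]),
    )
    # Highest priority = number of candidates, lowest priority = 1.
    return {pos: len(ranked) - i for i, pos in enumerate(ranked)}
```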

 [4] In the main control device 3 described in any one of [1] to [3], when determining the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked from among the plurality of candidates for the relative position and orientation of the hand 14 at the time of placement, the determiner 42 determines the target workpiece W from among a plurality of workpieces.

 With this configuration, when there are multiple candidate workpieces, the determiner 42 also determines the target workpiece from among them while determining the relative position and orientation of the hand 14. Since the options for the target workpiece increase, the range of choices for finding a relative position and orientation of the hand 14 with which the target workpiece can be picked is widened.

 [5] The main control device 3 described in any one of [1] to [4] further includes the operation controller 43, which causes the robot arm 12 and the hand 14 to execute a robot operation including a picking operation of picking the target workpiece W and a placing operation of placing the target workpiece W. In parallel with the robot operation by the operation controller 43, the searcher 41 and the determiner 42 search for candidates for the relative position and orientation of the hand 14 at the time of placement for the next picking operation and placing operation and determine the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked.

 With this configuration, the robot operation including the picking operation and the placing operation is performed in parallel with the search for candidates for the relative position and orientation of the hand 14 at the time of placement and the determination of the relative position and orientation of the hand 14 at the time of picking. As a result, the cycle time of the pick-and-place process can be shortened.
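 A minimal sketch of this overlap, assuming hypothetical `plan_next_cycle` and `execute_robot_motion` functions (they are stand-ins, not the actual interfaces of the searcher 41, determiner 42, or operation controller 43), runs the planning for cycle i+1 on a worker thread while the robot executes cycle i:

```python
from concurrent.futures import ThreadPoolExecutor


def run_pick_and_place(num_cycles, plan_next_cycle, execute_robot_motion):
    """Overlap planning of the next cycle with execution of the current one."""
    with ThreadPoolExecutor(max_workers=1) as planner:
        plan = plan_next_cycle(0)                 # plan the first cycle up front
        for i in range(num_cycles):
            future = None
            if i + 1 < num_cycles:
                # Plan cycle i+1 while the robot executes cycle i.
                future = planner.submit(plan_next_cycle, i + 1)
            execute_robot_motion(plan)            # pick-and-place motion for cycle i
            if future is not None:
                plan = future.result()            # planning result for the next cycle
```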

 [6] The main control device 3 described in any one of [1] to [5] further includes the imaging controller 44, which causes the first camera 51A to execute picking imaging for acquiring an image of the workpiece before picking and causes the second camera 51B to execute place imaging for acquiring an image of the space in which the workpiece is to be placed. The searcher 41 searches for the position candidates and the candidates for the relative position and orientation of the hand 14 based on the image obtained by the place imaging, and the determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece W based on the image obtained by the picking imaging. The imaging controller 44 causes the second camera 51B to execute the n-th place imaging (n is a natural number of 3 or more) after the (n-2)-th placing operation and before the (n-1)-th placing operation, and the searcher 41 corrects the placement target position of the (n-1)-th placing operation based on the image obtained by the n-th place imaging.

 With this configuration, the searcher 41 searches for the position candidates and the candidates for the relative position and orientation of the hand 14 based on the place image, and the determiner 42 determines the relative position and orientation of the hand 14 with respect to the target workpiece W based on the picking image. The n-th place imaging is executed after the (n-2)-th placing operation and before the (n-1)-th placing operation; in other words, the (n-1)-th place imaging is executed before the (n-2)-th placing operation. The placement target position of the (n-1)-th placing operation is therefore basically decided based on an image taken before the (n-2)-th placing operation, and the state of the space in which the workpiece is to be placed may have changed once the (n-2)-th placing operation is completed. Because the n-th place imaging is performed after the (n-2)-th placing operation, it captures the space after that operation is completed, and it is performed before the (n-1)-th placing operation starts. The searcher 41 accordingly corrects the placement target position of the (n-1)-th placing operation based on the image obtained by the n-th place imaging, which allows the (n-1)-th placing operation to be executed more appropriately.
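 The ordering described above can be sketched as the simple schedule below. This is an illustration of the timing only; `place_imaging`, `correct_place_target`, and `place_operation` are hypothetical stand-ins for the imaging controller 44, the searcher 41, and the operation controller 43, and the handling of the first two shots is an assumption.

```python
def run_schedule(total_places, place_imaging, correct_place_target, place_operation):
    """Illustrative ordering of place imaging and placing operations.

    place_imaging(n)             -> image of the place space (n-th shot)
    correct_place_target(n, im)  -> refine the target of the n-th placing operation
    place_operation(n)           -> execute the n-th placing operation
    """
    assert total_places >= 2
    place_imaging(1)                   # taken before any placing operation
    place_imaging(2)                   # still before the 1st placing operation
    for n in range(3, total_places + 1):
        place_operation(n - 2)         # (n-2)-th placing operation completes
        image = place_imaging(n)       # n-th shot: after (n-2)-th, before (n-1)-th place
        correct_place_target(n - 1, image)
    # Finish the remaining placing operations.
    place_operation(total_places - 1)
    place_operation(total_places)
```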

 [7] The robot system 100 includes the robot 1, which has the robot arm 12 and the hand 14 attached to the robot arm 12, and the main control device 3 described in any one of [1] to [6].

 With this configuration, a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the place position candidates are first created, and the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked is then determined from among them. Since a plurality of candidates are available when the relative position and orientation of the hand 14 at the time of picking is determined, appropriate holding of the workpiece during picking is easier to achieve. Furthermore, since these candidates correspond to the place position candidates, a workpiece picked with the relative position and orientation of the hand 14 determined from among them can be placed appropriately at the position candidate. As a result, both picking and placing of the workpiece can be performed appropriately.

 [8] A method for controlling the robot 1, which has the robot arm 12 and the hand 14 attached to the robot arm 12 and performs a pick-and-place process on workpieces, includes searching for a plurality of candidates for the relative position and orientation of the hand 14 with respect to a workpiece to be placed at a place position candidate, and determining the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked from among the plurality of candidates for the relative position and orientation of the hand 14 at the time of placement.

 With this configuration, a plurality of candidates for the relative position and orientation of the hand 14 corresponding to the place position candidates are first created, and the relative position and orientation of the hand 14 with respect to the target workpiece W to be picked is then determined from among them. Since a plurality of candidates are available when the relative position and orientation of the hand 14 at the time of picking is determined, appropriate holding of the workpiece during picking is easier to achieve. Furthermore, since these candidates correspond to the place position candidates, a workpiece picked with the relative position and orientation of the hand 14 determined from among them can be placed appropriately at the position candidate. As a result, both picking and placing of the workpiece can be performed appropriately.

100  Robot system
1    Robot
12   Robot arm
14   Hand
3    Main control device (control device)
41   Searcher
42   Determiner
43   Operation controller
44   Imaging controller
51A  First camera
51B  Second camera
W    Target workpiece

Claims (8)

1. A control device for controlling a robot that has a robot arm and a hand attached to the robot arm and performs a pick-and-place process on a workpiece, the control device comprising:
a searcher that searches for a plurality of candidates for a relative position and orientation of the hand with respect to a workpiece to be placed at a candidate placement position; and
a determiner that determines a relative position and orientation of the hand with respect to a target workpiece to be picked from among the plurality of candidates for the relative position and orientation of the hand at the time of placement.
2. The control device according to claim 1, wherein
the searcher searches for a plurality of the position candidates and searches for a plurality of candidates for the relative position and orientation of the hand corresponding to the plurality of position candidates, and
the determiner determines, as a placement target position of the workpiece, the position candidate corresponding to the candidate for the relative position and orientation of the hand at the time of placement that has been determined as the relative position and orientation of the hand with respect to the target workpiece.
3. The control device according to claim 2, wherein
the searcher assigns priorities to the plurality of position candidates, and
the determiner determines whether each of the plurality of candidates for the relative position and orientation of the hand at the time of placement can serve as the relative position and orientation of the hand with respect to the target workpiece, in order of the priority of the corresponding position candidates.
4. The control device according to claim 1, wherein, when determining the relative position and orientation of the hand with respect to the target workpiece to be picked from among the plurality of candidates for the relative position and orientation of the hand at the time of placement, the determiner determines the target workpiece from among a plurality of workpieces.
5. The control device according to claim 1, further comprising an operation controller that causes the robot arm and the hand to execute a robot operation including a picking operation of picking the target workpiece and a placing operation of placing the target workpiece, wherein
the searcher and the determiner, in parallel with the robot operation by the operation controller, search for candidates for the relative position and orientation of the hand at the time of placement for the next picking operation and placing operation and determine the relative position and orientation of the hand with respect to the target workpiece to be picked.
6. The control device according to claim 5, further comprising an imaging controller that causes a first camera to execute picking imaging for acquiring an image of the workpiece before picking and causes a second camera to execute place imaging for acquiring an image of a space in which the workpiece is to be placed, wherein
the searcher searches for the position candidates and the candidates for the relative position and orientation of the hand based on the image obtained by the place imaging,
the determiner determines the relative position and orientation of the hand with respect to the target workpiece based on the image obtained by the picking imaging,
the imaging controller causes the second camera to execute the n-th place imaging (n is a natural number of 3 or more) after the (n-2)-th placing operation and before the (n-1)-th placing operation, and
the searcher corrects a placement target position of the (n-1)-th placing operation based on the image obtained by the n-th place imaging.
7. A robot system comprising:
a robot having a robot arm and a hand attached to the robot arm; and
the control device according to any one of claims 1 to 6.
8. A method for controlling a robot that has a robot arm and a hand attached to the robot arm and performs a pick-and-place process on a workpiece, the method comprising:
searching for a plurality of candidates for a relative position and orientation of the hand with respect to a workpiece to be placed at a candidate placement position; and
determining a relative position and orientation of the hand with respect to a target workpiece to be picked from among the plurality of candidates for the relative position and orientation of the hand at the time of placement.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/020184 WO2024247135A1 (en) 2023-05-30 2023-05-30 Control device, robot system, and robot control method

Publications (1)

Publication Number Publication Date
WO2024247135A1 (en) 2024-12-05

Family

ID=93657092

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/020184 Pending WO2024247135A1 (en) 2023-05-30 2023-05-30 Control device, robot system, and robot control method

Country Status (1)

Country Link
WO (1) WO2024247135A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018144159A (en) * 2017-03-03 2018-09-20 株式会社キーエンス ROBOT SETTING DEVICE, ROBOT SYSTEM, ROBOT SETTING METHOD, ROBOT SETTING PROGRAM, COMPUTER READABLE RECORDING MEDIUM, AND RECORDED DEVICE
JP2019042828A (en) * 2017-08-30 2019-03-22 株式会社ダイフク Picking equipment
WO2022153373A1 (en) * 2021-01-12 2022-07-21 川崎重工業株式会社 Action generation device, robot system, action generation method, and action generation program
WO2022162969A1 (en) * 2021-01-27 2022-08-04 川崎重工業株式会社 Robot system, movement path generation device and movement path generation method
JP2023072410A (en) * 2021-11-12 2023-05-24 株式会社東芝 Picking system, control device, picking method, program and storage medium

Similar Documents

Publication Publication Date Title
CN111504328B (en) Robot motion planning method, path planning method, grasping method and device thereof
JP5685027B2 (en) Information processing apparatus, object gripping system, robot system, information processing method, object gripping method, and program
JP7191569B2 (en) gripping device
JP5304469B2 (en) Bin picking system
CN110587592B (en) Robot control device, robot control method, and computer-readable recording medium
KR20140008262A (en) Robot system, robot, robot control device, robot control method, and robot control program
CN112292235B (en) Robot control device, robot control method and recording medium
JP2012030320A (en) Work system, working robot controller, and work program
US10656097B2 (en) Apparatus and method for generating operation program of inspection system
WO2020179416A1 (en) Robot control device, robot control method, and robot control program
CN113165187B (en) Image information processing device, control system and image information processing method
US20190030722A1 (en) Control device, robot system, and control method
US20240157567A1 (en) Picking system
WO2024247135A1 (en) Control device, robot system, and robot control method
JP7708840B2 (en) Robot device for detecting interference between robot components
JP7466003B2 (en) ROBOT SYSTEM, PICKING METHOD, AND COMPUTER PROGRAM
JP2017056528A (en) Transfer system and transfer method
JP2001088073A (en) Appearance inspection device
JP6708142B2 (en) Robot controller
WO2024247134A1 (en) Control device, robot system, and robot control method
WO2024247136A1 (en) Control device, robot system, and robot control method
TW202333915A (en) Control device of substrate transfer robot and control method of joint motor
WO2020050405A1 (en) Work device
WO2024166293A1 (en) Control device, robot system, and robot control method
JP7583942B2 (en) ROBOT CONTROL DEVICE, ROBOT CONTROL SYSTEM, AND ROBOT CONTROL METHOD

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23939613

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2025523766

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2025523766

Country of ref document: JP