
WO2020022041A1 - Control system, control method, and program - Google Patents

Control system, control method, and program

Info

Publication number
WO2020022041A1
WO2020022041A1 (PCT/JP2019/026959)
Authority
WO
WIPO (PCT)
Prior art keywords
state
robot
control
frame
target frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/026959
Other languages
English (en)
Japanese (ja)
Inventor
加藤 豊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omron Corp
Original Assignee
Omron Corp
Omron Tateisi Electronics Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Corp, Omron Tateisi Electronics Co
Publication of WO2020022041A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01RELECTRICALLY-CONDUCTIVE CONNECTIONS; STRUCTURAL ASSOCIATIONS OF A PLURALITY OF MUTUALLY-INSULATED ELECTRICAL CONNECTING ELEMENTS; COUPLING DEVICES; CURRENT COLLECTORS
    • H01R43/00Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors
    • H01R43/26Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors for engaging or disengaging the two parts of a coupling device

Definitions

  • the present technology relates to a control system, a control method, and a program for controlling a robot.
  • The technique described in Non-Patent Document 1 is based on the premise that the relative position between the terminal supported by the robot hand and the camera is constant. Therefore, if the relative positional relationship between the terminal supported by the robot hand and the camera deviates, the correspondence between the state of the terminal and the state of the connector may be lost, and the terminal may not be inserted into the connector.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a control system, a control method, and a program in which the states of a plurality of objects can be changed in a coordinated manner.
  • The control system includes first to Nth robots, an imaging device for imaging first to Nth objects, and a control device for controlling the first to Nth robots. N is an integer of 2 or more.
  • the i-th robot changes the state of the i-th object.
  • i is an integer of 1 to N-1.
  • The Nth robot changes the state of one of the Nth object and the imaging device.
  • the other of the N-th object and the imaging device is installed at a fixed position.
  • the control device acquires change information for each of the first to Nth objects.
  • The change information corresponding to the j-th object (j is an integer of 1 to N) indicates the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on the image of the imaging device.
  • The control device is configured to perform a first process of acquiring a real image captured by the imaging device, a second process of selecting a target frame from a reference moving image showing a sample of the first to Nth objects, and a third process of controlling each of the first to Nth robots based on the real image and the target frame.
  • In the third process, the control device calculates, based on the change information corresponding to the j-th object, the control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot according to the calculated control amount.
  • According to this disclosure, the state of the target object on the real image can be changed to the state of the target object on the target frame based on the real image, the target frame, and the change information.
  • the states of the first to Nth objects change in a coordinated manner according to the reference moving image.
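To make the first to third processes concrete, here is a minimal Python sketch of the loop described above. The helpers grab_image, detect_state, compute_control, and send_command are hypothetical stand-ins for the imaging device, the image processing, and the robot interfaces, none of which the text specifies.

```python
import numpy as np

def control_loop(reference_frames, change_info, robots,
                 grab_image, detect_state, compute_control, send_command,
                 threshold=2.0):
    """Drive N robots so that the objects track the reference moving image."""
    k = 0                                              # index of the target frame
    while k < len(reference_frames):
        real_image = grab_image()                      # first process
        target_frame = reference_frames[k]             # second process
        all_close = True
        for j, robot in enumerate(robots):             # third process
            actual = detect_state(real_image, j)       # state on the real image
            desired = detect_state(target_frame, j)    # state on the target frame
            if np.linalg.norm(actual - desired) >= threshold:
                all_close = False
                # change_info[j] relates the j-th robot's control amount to
                # the change of the j-th object's state on the image
                u = compute_control(actual, desired, change_info[j])
                send_command(robot, u)
        if all_close:
            k += 1   # every deviation is below the threshold: advance the frame
```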
  • In the above disclosure, the control device may update the target frame when the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame becomes less than a threshold. According to this disclosure, the state of the first object can be reliably changed according to the state on the target frame.
  • In the above disclosure, the control device selects a first target frame and a second target frame from the plurality of frames.
  • The control device controls the first robot and the second robot such that a first time, at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a specified condition.
  • According to this disclosure, the states of the first object and the second object can be changed such that the time difference between the time at which the first object reaches the state of the first target frame and the time at which the second object reaches the state of the second target frame becomes a desired time.
  • the imaging apparatus images the (N + 1) th object together with the first to Nth objects.
  • the reference moving image includes the (N + 1) th object.
  • The control device selects a first target frame and a second target frame from the plurality of frames.
  • The control device controls the first robot such that a first time, at which the deviation between the state of the (N + 1)th object on the real image and the state of the (N + 1)th object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the first object on the real image and the state of the first object on the second target frame becomes less than a second threshold, satisfy a specified condition.
  • According to this disclosure, the state of the first object can be changed such that the time difference between the time at which the (N + 1)th object reaches the state of the first target frame and the time at which the first object reaches the state of the second target frame becomes a desired time.
  • In the above disclosure, the control device may determine that an abnormality has occurred in the control system if, during a period from when the target frame is selected until a predetermined time has elapsed, the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame does not become smaller than the threshold. According to this disclosure, a countermeasure against an abnormality of the control system can be started quickly.
  • In the above disclosure, the Nth robot may change the state of the imaging device.
  • The control device may control only the Nth robot in the third process, or may control each of the first to Nth robots in the third process.
  • In the latter case, the control of the first to (N-1)th robots allows the states of the first to (N-1)th objects and the Nth object to be changed in conjunction with the reference moving image.
  • In the above disclosure, the control device repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series while still performing the third process.
  • According to this disclosure, the target robot is continuously controlled according to the latest real image without stopping its operation. As a result, the state of the object can be changed quickly.
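A minimal sketch of this pipelining, assuming a hypothetical blocking grab_image() and a control_step() that performs the second and third processes; a one-worker thread pool simply lets the next image acquisition overlap the current control computation.

```python
from concurrent.futures import ThreadPoolExecutor

def pipelined_cycles(grab_image, control_step, n_cycles):
    """Start the first process of the next series while the third process runs."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(grab_image)          # first process, cycle 1
        for _ in range(n_cycles):
            real_image = pending.result()          # latest real image
            pending = pool.submit(grab_image)      # prefetch the next image
            control_step(real_image)               # second and third processes
```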
  • In the above disclosure, the control device selects frames included in a prediction horizon of the reference moving image as target frames.
  • The control device calculates the control amounts of the j-th robot during a control horizon so as to minimize the deviation between the state of the j-th object on the frames included in the prediction horizon of the reference moving image and the state of the j-th object on the images of the imaging device during the prediction horizon. According to this disclosure, the change in the state of the target object can be made to follow the reference moving image more closely.
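The text does not give the optimization itself. The following least-squares sketch shows one natural reading, under the assumption (made here for illustration) that the change information yields a linear image-space model s[t+1] = s[t] + J·u[t] with a constant matrix J over the horizon.

```python
import numpy as np

def horizon_control(s0, ref_states, J, n_c):
    """Controls over a control horizon of n_c steps that make the predicted
    on-image states track ref_states (the frames in the prediction horizon)."""
    s0 = np.asarray(s0, float)
    n_p, dim, m = len(ref_states), s0.size, J.shape[1]
    A = np.zeros((n_p * dim, n_c * m))
    b = np.concatenate([np.asarray(r, float) - s0 for r in ref_states])
    for t in range(n_p):                       # predicted step t+1 ...
        for i in range(min(t + 1, n_c)):       # ... accumulates controls so far
            A[t * dim:(t + 1) * dim, i * m:(i + 1) * m] = J
    u, *_ = np.linalg.lstsq(A, b, rcond=None)  # minimize the total deviation
    return u.reshape(n_c, m)                   # control amounts per step
```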
  • the state indicates at least one of the position, posture, shape, and size of the object.
  • the position, posture, shape, and size of the first object can be respectively changed to the position, posture, shape, and size of the first object on the reference moving image.
  • the control method controls the first to Nth robots using the imaging device for imaging the first to Nth objects.
  • N is an integer of 2 or more.
  • the i-th robot changes the state of the i-th object.
  • i is an integer of 1 to N-1.
  • the Nth robot changes one state of the Nth object and the imaging device.
  • the other of the N-th object and the imaging device is installed at a fixed position.
  • the control method includes a first step of acquiring change information for each of the first to Nth objects.
  • the change information corresponding to the j-th object indicates a relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on the image of the imaging device.
  • j is an integer of 1 to N.
  • The control method includes a second step of acquiring a real image captured by the imaging device, a third step of selecting a target frame from reference moving images showing samples of the first to Nth objects, and a fourth step of controlling each of the first to Nth robots based on the real image and the target frame.
  • The fourth step includes calculating, based on the change information corresponding to the j-th object, the control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot according to the calculated control amount.
  • The program according to an example of the present disclosure causes a computer to execute the above control method.
  • According to these disclosures as well, the states of a plurality of objects change in a coordinated manner.
  • FIG. 1 is a schematic diagram illustrating an outline of a control system according to a first embodiment.
  • FIG. 4 is a diagram illustrating an example of a real image captured by an imaging device and a first reference moving image.
  • FIG. 3 is a diagram illustrating an example of a real image captured by an imaging device and a second reference moving image.
  • FIG. 2 is a schematic diagram illustrating a hardware configuration of a control device included in the control system according to the first embodiment.
  • FIG. 2 is a block diagram illustrating a functional configuration of the control device according to the first embodiment.
  • FIG. 6 is a diagram illustrating an example of a method for creating a template.
  • FIG. 3 is a block diagram illustrating a functional configuration of a first control unit and a second control unit according to the first embodiment.
  • FIG. 5 is a diagram illustrating a method of generating a first change information set in a first control unit.
  • FIG. 4 is a diagram illustrating a method of calculating a control amount by a calculation unit of a first control unit.
  • FIG. 9 is a flowchart illustrating an example of the flow of a change information generation process performed by a change information generation unit.
  • FIG. 11 is a flowchart showing the flow of processing of a subroutine of step S2 shown in FIG.
  • FIG. 6 is a flowchart illustrating an example of the flow of the process of controlling the target robot to change the state of the target object along the reference moving image in the first embodiment.
  • FIG. 13 is a flowchart showing the flow of processing of a subroutine of step S44 shown in FIG.
  • FIG. 3 is a diagram illustrating a relationship between a closest frame and a target frame.
  • FIG. 13 is a flowchart showing the flow of a subroutine of step S46 shown in FIG.
  • A diagram showing another example of the reference moving image serving as a sample of the connection of the male connector and the female connector.
  • FIG. 15 is a flowchart illustrating an example of the flow of a target frame selection process in Modification 1 of Embodiment 1.
  • A flowchart showing an example of the flow of an abnormality determination process.
  • FIG. 9 is a schematic diagram illustrating an object of a control system according to a second modification of the first embodiment.
  • FIG. 9 is a schematic diagram illustrating an outline of a control system according to a second embodiment.
  • FIG. 7 is a block diagram illustrating a functional configuration of a control device according to a second embodiment.
  • A diagram showing an example of a screen for designating a required passage frame and a related frame.
  • FIG. 20 is a flowchart illustrating an example of the flow of a target frame selection process in Embodiment 2.
  • FIG. 14 is a schematic diagram illustrating an object of a control system according to a first modification of the second embodiment.
  • FIG. 15 is a diagram illustrating an example of a reference moving image according to a first modification of the second embodiment.
  • A diagram showing an example of the arrangement of four imaging devices.
  • FIG. 13 is a schematic diagram illustrating an object of a control system according to a second modification of the second embodiment.
  • FIG. 9 is a schematic diagram illustrating an outline of a control system according to a third embodiment.
  • FIG. 13 is a block diagram illustrating a functional configuration of a control device according to Embodiment 3.
  • FIG. 13 is a flowchart illustrating the processing flow of a first control unit and a second control unit according to the third embodiment.
  • FIG. 13 is a schematic diagram illustrating an outline of a part of a control system according to a modification of the third embodiment.
  • FIG. 1 is a schematic diagram illustrating an outline of the control system according to the first embodiment.
  • the control system 1 connects the male connector 2a and the female connector 2b by inserting the male connector 2a into the female connector 2b in, for example, an industrial product production line.
  • control system 1 includes imaging devices 21 and 22, robots 30a and 30b, robot controllers 40a and 40b, and a control device 50.
  • the imaging devices 21 and 22 capture an image of a subject present in the field of view and generate image data (hereinafter, simply referred to as “image”).
  • the imaging devices 21 and 22 are set at fixed positions different from those of the robots 30a and 30b.
  • the imaging devices 21 and 22 are installed at different places, and image the male connector 2a and the female connector 2b as subjects from different directions.
  • the imaging devices 21 and 22 perform imaging according to a predetermined imaging cycle, and output a real image obtained by the imaging to the control device 50.
  • the robot 30a is a mechanism for changing the state (here, position and posture) of the male connector 2a, and is, for example, a vertical articulated robot.
  • the robot 30a has a hand 31a at its tip for supporting (holding) the male connector 2a, and changes the position and posture of the hand 31a with six degrees of freedom.
  • the robot 30a changes the position and the posture of the male connector 2a held by the hand 31a with six degrees of freedom.
  • the six degrees of freedom include translational degrees of freedom in the X, Y, and Z directions, and rotational degrees of freedom in the pitch, yaw, and roll directions.
  • the number of degrees of freedom of the hand 31a is not limited to six, and may be three to five or seven or more.
  • the robot 30a has a plurality of servo motors, and the position and posture of the male connector 2a are changed by driving the servo motors.
  • An encoder is provided for each of the plurality of servomotors, and the position of the servomotor is measured.
  • The robot 30b is a mechanism for changing the state (here, position and posture) of the female connector 2b, and is, for example, an XYθ stage.
  • the robot 30b has a stage 31b for supporting (mounting) the female connector 2b, and changes the position and posture of the stage 31b with three degrees of freedom. That is, the robot 30b changes the position and posture of the female connector 2b mounted on the stage 31b with three degrees of freedom.
  • The three degrees of freedom include translational degrees of freedom in the X direction and the Y direction, and a rotational degree of freedom in a rotational direction (θ direction) about an axis orthogonal to the XY plane.
  • the number of degrees of freedom of the robot 30b is not limited to three, and may be four or more.
  • The robot 30b has a plurality of servomotors, and the position and posture of the female connector 2b are changed by driving the servomotors.
  • An encoder is provided for each of the plurality of servomotors, and the position of the servomotor is measured.
  • the robot controller 40a controls the operation of the robot 30a according to the control command received from the control device 50.
  • the robot controller 40a receives from the control device 50 control commands for translational degrees of freedom in the X, Y, and Z directions and rotational degrees of freedom in the pitch, yaw, and roll directions. These X direction, Y direction, Z direction, pitch direction, yaw direction, and roll direction are indicated by the coordinate system of the robot 30a.
  • The robot controller 40a performs feedback control on the robot 30a so that the translation amounts of the hand 31a in the X, Y, and Z directions approach the control commands for the translational degrees of freedom in the X, Y, and Z directions, respectively.
  • The robot controller 40a also performs feedback control on the robot 30a so that the rotational movement amounts of the hand 31a in the pitch, yaw, and roll directions approach the control commands for the rotational degrees of freedom in the pitch, yaw, and roll directions, respectively.
  • the robot controller 40b controls the operation of the robot 30b according to the control command received from the control device 50.
  • the robot controller 40b receives from the control device 50 control commands for the degrees of freedom of translation and rotation in the X and Y directions. These X direction, Y direction, and rotation direction are indicated by the coordinate system of the robot 30b.
  • the robot controller 40b performs feedback control on the robot 30b such that the translation amounts of the stage 31b in the X and Y directions approach control commands for the degrees of freedom of translation in the X and Y directions, respectively.
  • the robot controller 40b performs feedback control on the robot 30b such that the rotational movement amount of the stage 31b approaches the control command of the rotational degree of freedom.
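The text does not specify the control law inside the robot controllers; as a minimal sketch, one axis can be driven with proportional feedback. read_encoder and drive_motor are hypothetical stand-ins for the servomotor encoder and motor drive.

```python
def feedback_axis(command, read_encoder, drive_motor, kp=0.5, tol=1e-3):
    """Drive one axis until its measured movement approaches the commanded amount."""
    while True:
        error = command - read_encoder()   # deviation from the control command
        if abs(error) < tol:
            break
        drive_motor(kp * error)            # proportional correction
```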
  • the control device 50 controls the robots 30a and 30b via the robot controllers 40a and 40b, respectively.
  • the control device 50 stores a first reference moving image and a second reference moving image, each of which represents a sample of the male connector 2a and the female connector 2b.
  • the first reference moving image is a moving image when viewed from the position of the imaging device 21.
  • the second reference moving image is a moving image when viewed from the position of the imaging device 22.
  • Each of the first reference moving image and the second reference moving image includes a plurality of frames (hereinafter, referred to as M (M is an integer of 2 or more)) arranged in time series.
  • The k-th frame (k is an integer from 1 to M) of the first reference moving image and the k-th frame of the second reference moving image are images obtained when the male connector 2a and the female connector 2b in a certain state are viewed simultaneously from different directions.
  • the control device 50 acquires change information indicating the relationship between the control amount of the robot 30a and the change amount of the state of the male connector 2a on the real images of the imaging devices 21 and 22. Further, the control device 50 acquires change information indicating the relationship between the control amount of the robot 30b and the change amount of the state of the female connector 2b on the real images of the imaging devices 21 and 22.
  • the controller 50 performs the following first to third processing.
  • the control device 50 repeatedly executes a series of processes including the first to third processes.
  • the first process is a process of acquiring actual images captured by the imaging devices 21 and 22.
  • the second process is a process of selecting a target frame from each of the first reference moving image and the second reference moving image.
  • For example, when the k-th frame of the first reference moving image is selected as the target frame, the control device 50 selects the k-th frame of the second reference moving image as the target frame.
  • the third process is a process for controlling each of the robots 30a and 30b based on the actual image and the target frame.
  • In the third process, the control device 50 calculates the control amount of the robot 30a for bringing the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame, based on the change information corresponding to the robot 30a, and controls the robot 30a according to the calculated control amount.
  • the control device 50 generates a control command indicating a control amount of the robot 30a, and outputs the generated control command to the robot controller 40a.
  • control device 50 calculates a control amount of the robot 30b for bringing the state of the female connector 2b on the actual image closer to the state of the female connector 2b on the target frame based on the change information corresponding to the robot 30b, The robot 30b is controlled according to the calculated control amount.
  • the control device 50 generates a control command indicating a control amount of the robot 30b, and outputs the generated control command to the robot controller 40b.
  • FIG. 2 is a diagram illustrating an example of an actual image captured by the imaging device 21 and a first reference moving image.
  • FIG. 3 is a diagram illustrating an example of a real image captured by the imaging device 22 and a second reference moving image.
  • FIG. 2 shows real images 90a to 93a captured by the imaging device 21 and frames 70a to 73a of the first reference moving image.
  • FIG. 3 shows real images 90b to 93b imaged by the imaging device 22, and frames 70b to 73b of the second reference moving image.
  • the real images 90a and 90b are images captured at the same time.
  • the real images 91a and 91b are images captured at the same time after the real images 90a and 90b.
  • the real images 92a and 92b are images captured at the same time after the real images 91a and 91b.
  • the real images 93a and 93b are images captured at the same time after the real images 92a and 92b.
  • Each of the frames 70a and 70b is the first frame in the corresponding reference moving image.
  • Each of the frames 71a and 71b is the s-th (s is an integer of 2 or more) frame in the corresponding reference moving image.
  • Each of the frames 72a and 72b is a t-th (t is an integer greater than s) frame in the corresponding reference moving image.
  • Each of the frames 73a and 73b is a u-th (u is an integer greater than t) frame in the corresponding reference moving image.
  • As shown in FIGS. 2 and 3, the male connector 2a held by the hand 31a moves downward from above the female connector 2b and is connected to the female connector 2b.
  • the control device 50 acquires real images 90a and 90b including the female connector 2b placed on the stage 31b and the male connector 2a held by the hand 31a from the imaging devices 21 and 22, respectively.
  • The control device 50 selects, as target frames, the frames 71a and 71b in which the female connector 2b has moved to a desired position and posture, from the first reference moving image and the second reference moving image, respectively.
  • The control device 50 calculates the control amount of the stage 31b for bringing the state of the female connector 2b on the real images 90a and 90b closer to the state of the female connector 2b on the frames 71a and 71b, based on the change information corresponding to the female connector 2b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40b. The robot controller 40b controls the robot 30b according to the control command. As a result, as shown in the real images 91a and 91b, the position and posture of the female connector 2b change to the desired position and posture (the position and posture indicated by the frames 71a and 71b).
  • Similarly, the control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 90a and 90b closer to the state of the male connector 2a on the frames 71a and 71b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a.
  • The robot controller 40a controls the robot 30a according to the control command. Thereby, as shown in the real images 91a and 91b, the state of the male connector 2a changes to the position and posture above the female connector 2b (the position and posture indicated by the frames 71a and 71b).
  • control device 50 selects the frames 72a and 72b when the male connector 2a has moved to a position immediately above the female connector 2b as target frames.
  • The control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 91a and 91b closer to the state of the male connector 2a on the frames 72a and 72b, based on the change information corresponding to the male connector 2a. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a.
  • The robot controller 40a controls the robot 30a according to the control command. Thereby, as shown in the real images 92a and 92b, the position and posture of the male connector 2a change to the position and posture directly above the female connector 2b (the position and posture indicated by the frames 72a and 72b).
  • control device 50 selects the frames 73a and 73b when the connection between the male connector 2a and the female connector 2b is completed as target frames.
  • The control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 92a and 92b closer to the state of the male connector 2a on the frames 73a and 73b, based on the change information corresponding to the male connector 2a. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a. The robot controller 40a controls the robot 30a according to the control command. Thereby, as shown in the real images 93a and 93b, the male connector 2a moves to the position and posture at which the connection to the female connector 2b is completed (the position and posture indicated by the frames 73a and 73b).
  • control device 50 can change the male connector 2a and the female connector 2b from the state on the actual image to the state on the target frame.
  • the states of the male connector 2a and the female connector 2b change in a coordinated manner according to the first reference moving image and the second reference moving image.
  • As described above, the control device 50 can control the robots 30a and 30b without using calibration data for associating the coordinate systems of the imaging devices 21 and 22 with those of the robots 30a and 30b. Therefore, the operator does not need to perform calibration in advance. Furthermore, the operator does not need to design in advance an operation program of the robot 30a for changing the state of the male connector 2a to a desired state, or an operation program of the robot 30b for changing the state of the female connector 2b to a desired state. It is therefore possible to reduce the labor required for changing the states of the male connector 2a and the female connector 2b to desired states with the robots 30a and 30b.
  • As a comparative example, there is a method of specifying the positions and postures of the male connector 2a and the female connector 2b in the real space from the images captured by the imaging devices 21 and 22, and controlling the robots 30a and 30b based on the specified positions and postures. However, in such a method, the accuracy of the calibration data decreases as the robots 30a and 30b age, and the male connector 2a and the female connector 2b may no longer be connected well.
  • In addition, the male connector 2a and the female connector 2b may fail to be connected properly due to a positional shift or individual differences between the male connector 2a and the female connector 2b. Even in such cases, by using this application example, the male connector 2a and the female connector 2b can be connected in accordance with the reference moving images.
  • FIG. 4 is a schematic diagram illustrating a hardware configuration of a control device included in the control system according to the first embodiment.
  • The control device 50 has a structure according to a general-purpose computer architecture, and provides the various functions described below by having the processor execute programs installed in advance.
  • The control device 50 includes a processor 510 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 512, a display controller 514, a system controller 516, an I/O (Input/Output) controller 518, a hard disk 520, a camera interface 522, an input interface 524, a robot controller interface 526, a communication interface 528, and a memory card interface 530. These units are connected to each other so as to enable data communication, with the system controller 516 at the center.
  • the processor 510 exchanges programs (codes) and the like with the system controller 516 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
  • The system controller 516 is connected to the processor 510, the RAM 512, the display controller 514, and the I/O controller 518 via buses, exchanges data with each unit, and governs the processing of the entire control device 50.
  • The RAM 512 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and stores programs read from the hard disk 520, images (image data) acquired by the imaging devices 21 and 22, processing results for those images, work data, and the like.
  • the display controller 514 is connected to the display unit 532, and outputs a signal for displaying various information to the display unit 532 according to an internal command from the system controller 516.
  • the I / O controller 518 controls data exchange between a recording medium connected to the control device 50 and an external device. More specifically, the I / O controller 518 is connected to the hard disk 520, the camera interface 522, the input interface 524, the robot controller interface 526, the communication interface 528, and the memory card interface 530.
  • the hard disk 520 is typically a nonvolatile magnetic storage device, and stores various information in addition to the control program 550 executed by the processor 510.
  • the control program 550 installed on the hard disk 520 is distributed while being stored in a memory card 536 or the like.
  • In place of the hard disk 520, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be employed.
  • the camera interface 522 corresponds to an input unit that receives image data from the imaging devices 21 and 22, and mediates data transmission between the processor 510 and the imaging devices 21 and 22.
  • the camera interface 522 includes image buffers 522a and 522b for temporarily storing image data from the imaging devices 21 and 22, respectively.
  • Although a single image buffer that can be shared by the plurality of imaging devices may be provided, it is preferable to arrange a plurality of image buffers independently, one for each imaging device, in order to speed up processing.
  • the input interface 524 mediates data transmission between the processor 510 and an input device 534 such as a keyboard, a mouse, a touch panel, and a dedicated console.
  • the robot controller interface 526 mediates data transmission between the processor 510 and the robot controllers 40a and 40b.
  • the communication interface 528 mediates data transmission between the processor 510 and another personal computer or server (not shown).
  • The communication interface 528 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the memory card interface 530 mediates data transmission between the processor 510 and the memory card 536 as a recording medium.
  • The memory card 536 is distributed with the control program 550 to be executed by the control device 50 and the like stored therein, and the memory card interface 530 reads the control program 550 from the memory card 536.
  • The memory card 536 is, for example, a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory). Alternatively, a program downloaded from a distribution server or the like may be installed in the control device 50 via the communication interface 528.
  • In addition to the application for providing the functions according to the present embodiment, an OS for providing the basic functions of the computer may be installed.
  • In that case, the control program according to the present embodiment may execute processing by calling necessary modules, among the program modules provided as part of the OS, in a predetermined order and/or at predetermined timings.
  • The control program according to the present embodiment may also be provided by being incorporated into a part of another program. In that case as well, the control program itself does not include the modules of the other program with which it is combined, and the processing is executed in cooperation with that other program. That is, the control program according to the present embodiment may take the form of being incorporated into such another program.
  • part or all of the functions provided by executing the control program may be implemented as a dedicated hardware circuit.
  • FIG. 5 is a block diagram illustrating a functional configuration of the control device according to the first embodiment.
  • The control device 50 includes a reference moving image storage unit 51, a teaching range selection unit 52, an image processing unit 53, a target frame selection unit 54, a first control unit 55a, and a second control unit 55b.
  • the reference moving image storage unit 51 includes the hard disk 520 and the RAM 512 shown in FIG.
  • the teaching range selection unit 52 and the image processing unit 53 are realized by the processor 510 shown in FIG.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image.
  • the first reference moving image and the second reference moving image show how the male connector 2a and the female connector 2b are moved and connected to each other by manually operating the robots 30a and 30b.
  • the first reference moving image and the second reference moving image may show a state in which the male connector 2a and the female connector 2b are moved by an operator's hand and connected to each other.
  • The difference between the working distance (WD) of the imaging device 21 when the male connector 2a is at the position closest to the imaging device 21 and when it is at the position farthest from the imaging device 21 is sufficiently small compared to the working distance.
  • Similarly, the difference in the working distance of the imaging device 22 between when the male connector 2a is closest to and farthest from the imaging device 22 is sufficiently small compared to the working distance.
  • In addition, the change in the posture of the male connector 2a during movement is very small. Therefore, the shape and size of the male connector 2a hardly change in the first reference moving image and the second reference moving image.
  • Likewise, the differences in the working distances of the imaging devices 21 and 22 between when the female connector 2b is closest to and farthest from each imaging device are sufficiently small compared to the working distances, and the change in the posture of the female connector 2b during movement is very small. Therefore, the shape and size of the female connector 2b hardly change in the first reference moving image and the second reference moving image.
  • the teaching range selection unit 52 selects, for each object (here, the male connector 2a and the female connector 2b), a teaching range serving as a sample of the object from the first reference moving image and the second reference moving image.
  • the teaching range selection unit 52 displays a screen prompting the user to select a teaching range on the display unit 532.
  • The operator checks each frame of the first reference moving image and the second reference moving image, and operates the input device 534 to designate the first frame and the last frame of a series of frames in which the object performs a desired operation.
  • The teaching range selection unit 52 selects the designated range from the first frame to the last frame as the teaching range.
  • For the first reference moving image, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70a up to the frame after the s-th frame at which a part of the female connector 2b starts to be cut off.
  • the teaching range selection unit 52 selects a range from the first frame 70a (a frame where the entire male connector 2a starts to appear) to the u-th frame 73a as the teaching range of the male connector 2a.
  • Similarly, for the second reference moving image, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70b up to the frame after the s-th frame at which a part of the female connector 2b starts to be cut off.
  • the teaching range selection unit 52 selects the range from the first frame 70b (the frame where the entire male connector 2a starts to appear) to the u-th frame 73b as the teaching range of the male connector 2a.
  • the image processing unit 53 performs image processing on the target image, and detects an object from the target image using template matching.
  • Template matching is a process in which a template, which is data representing the image features of a target object, is prepared in advance, and the position, posture, shape, and size of the target object in the target image are detected by evaluating the degree of matching of image features between the target image and the template.
  • the target images on which the image processing unit 53 performs the image processing are the frames of the first reference moving image, the frames of the second reference moving image, and the real images captured by the imaging devices 21 and 22.
  • the image processing unit 53 creates a template for each object (the male connector 2a and the female connector 2b) as advance preparation.
  • FIG. 6 is a diagram showing an example of a method for creating a template.
  • FIG. 6A shows a frame selected from the first reference moving image.
  • FIG. 6B shows a frame selected from the second reference moving image.
  • the image processing unit 53 causes the display unit 532 (see FIG. 4) to display a frame selected by the operator from each of the first reference moving image and the second reference moving image. The operator may visually select a frame in which the entire object (the male connector 2a or the female connector 2b) is shown.
  • the image processing unit 53 accepts the designation of the area of the target object on the frame displayed on the display unit 532. For example, the operator operates the input device 534 (see FIG. 4) to input a line 3a surrounding the male connector 2a and a line 3b surrounding the female connector 2b.
  • the image processing unit 53 specifies a region surrounded by the line 3a as an image region of the male connector 2a, and specifies a region surrounded by the line 3b as an image region of the female connector 2b.
  • the image processing unit 53 extracts a plurality of feature points of the male connector 2a and their feature amounts from the image region surrounded by the line 3a for each frame of the first reference image and the second reference image.
  • the image processing unit 53 creates the coordinates and the feature amount of each of the plurality of feature points on the image as a template of the male connector 2a.
  • Similarly, the image processing unit 53 extracts a plurality of feature points of the female connector 2b and their feature amounts from the image region surrounded by the line 3b.
  • the image processing unit 53 creates the coordinates and the feature amount of each of the plurality of feature points on the image as a template of the female connector 2b.
  • a feature point is a point characterized by a corner or an outline included in an image, and is, for example, an edge point.
  • the feature quantity is, for example, luminance, luminance gradient direction, quantization gradient direction, HoG (Histogram of Oriented Gradients), HAAR-like, SIFT (Scale-Invariant Feature Transform), and the like.
  • The luminance gradient direction represents the direction (angle) of the luminance gradient in a local region centered on a feature point as a continuous value, whereas the quantization gradient direction represents the direction of the luminance gradient in a local region centered on a feature point as a discrete value (for example, eight directions held as 1-byte information of 0 to 7).
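As a concrete illustration of extracting feature points and feature amounts from a user-specified region, here is a sketch using OpenCV's ORB as a freely available stand-in for the feature types named above (luminance gradients, HoG, SIFT); the roi argument plays the role of the region surrounded by the line 3a or 3b.

```python
import cv2
import numpy as np

def make_template(frame, roi):
    """Extract feature-point coordinates and feature amounts inside roi = (x, y, w, h)."""
    x, y, w, h = roi
    mask = np.zeros(frame.shape[:2], np.uint8)
    cv2.rectangle(mask, (x, y), (x + w, y + h), 255, -1)   # limit to the object region
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = cv2.ORB_create().detectAndCompute(gray, mask)
    # the template: on-image coordinates plus the feature amounts (descriptors)
    return np.array([kp.pt for kp in keypoints]), descriptors
```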
  • the image processing unit 53 extracts a plurality of feature points and their feature amounts from the frame of the first reference moving image or the real image captured by the imaging device 21.
  • the image processing unit 53 detects the target in the image by comparing the extracted feature points and feature amounts with the template of the target created from the frame of the first reference image.
  • the image processing unit 53 extracts a plurality of feature points and their feature amounts from the frame of the second reference moving image and the image captured by the imaging device 22.
  • the image processing unit 53 detects the target in the image by comparing the extracted feature points and feature amounts with the template of the target created from the frame of the second reference image.
  • the image processing unit 53 outputs, for each target object (male connector 2a and female connector 2b), the coordinates on the image of each feature point of the target object extracted from the target image.
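Continuing the sketch above, the detection step can compare the features extracted from a target image against the template and output the matched coordinates; the brute-force Hamming matcher is an assumption, chosen only because it pairs with ORB descriptors.

```python
import cv2
import numpy as np

def detect_object(target_image, template_descriptors):
    """Return the on-image coordinates of the feature points matched to the template."""
    gray = cv2.cvtColor(target_image, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = cv2.ORB_create().detectAndCompute(gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(template_descriptors, descriptors),
                     key=lambda m: m.distance)            # best matches first
    return np.array([keypoints[m.trainIdx].pt for m in matches])
```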
  • The target frame selection unit 54 selects a target frame from each of the first reference moving image and the second reference moving image. When the k-th frame of the first reference moving image is selected as the target frame, the target frame selection unit 54 selects the k-th frame of the second reference moving image as the target frame. A specific example of the target frame selection method will be described later.
  • the first control unit 55a controls the robot 30a via the robot controller 40a and changes the state of the male connector 2a.
  • the second controller 55b controls the robot 30b via the robot controller 40b to change the state of the female connector 2b.
  • FIG. 7 is a block diagram showing a functional configuration of the first control unit and the second control unit according to the first embodiment.
  • Each of the first control unit 55a and the second control unit 55b includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60.
  • the change information storage unit 57 includes the hard disk 520 and the RAM 512 shown in FIG.
  • the change information generation unit 56, the calculation unit 58, the command unit 59, and the end determination unit 60 are realized by the processor 510 illustrated in FIG.
  • The change information generation unit 56 generates, for each of the plurality of degrees of freedom, first change information indicating the relationship between the control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 21.
  • the change information generating unit 56 stores a first change information set 571 including a plurality of first change information generated for a plurality of degrees of freedom in the change information storage unit 57.
  • Similarly, the change information generation unit 56 generates, for each of the plurality of degrees of freedom, second change information indicating the relationship between the control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 22.
  • the change information generation unit 56 stores a second change information set 572 including a plurality of pieces of second change information generated for a plurality of degrees of freedom in the change information storage unit 57.
  • the target object is the male connector 2a in the first control unit 55a and the female connector 2b in the second control unit 55b.
  • the target robot is the robot 30a in the first control unit 55a, and the robot 30b in the second control unit 55b.
  • the plurality of degrees of freedom are six degrees of freedom in the first control unit 55a and three degrees of freedom in the second control unit 55b.
  • The first change information and the second change information indicate the amount of change in the state of the target object on the image when the target robot is controlled by a unit control amount.
  • More specifically, the first change information and the second change information indicate a mapping that converts the target object on the image before the target robot is controlled by the unit control amount into the target object on the image after the target robot is controlled by the unit control amount.
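The form of this mapping is left open by the text. One simple realization, assumed here for illustration, fits a 2D affine transform from the feature-point coordinates before the unit control to the coordinates after it, via an ordinary least-squares solve.

```python
import numpy as np

def fit_mapping(before_pts, after_pts):
    """Fit (x, y) -> (x', y') from matched on-image coordinates of shape (n, 2)."""
    before = np.asarray(before_pts, float)
    after = np.asarray(after_pts, float)
    A = np.hstack([before, np.ones((len(before), 1))])   # rows are (x, y, 1)
    M, *_ = np.linalg.lstsq(A, after, rcond=None)        # (3, 2) affine parameters
    return M

def apply_mapping(M, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M
```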
  • the change information generating unit 56 generates the first change information set 571 for each frame in the teaching range of the first reference moving image. Further, the change information generating unit 56 generates a second change information set 572 for each frame in the teaching range of the second reference moving image.
  • The process in which the change information generation unit 56 generates and stores the first change information set 571 and the second change information set 572 is executed in advance as preparation.
  • A method of generating the first change information set 571 in the first control unit 55a will be described with reference to FIG. 8. Note that the method of generating the second change information set 572 in the first control unit 55a and the methods of generating the first change information set 571 and the second change information set 572 in the second control unit 55b are the same, and their description is therefore omitted.
  • FIG. 8 is a diagram illustrating a method of generating the first change information set in the first control unit.
  • FIG. 8A shows the k-th frame 84 of the first reference moving image.
  • the state (here, position and orientation) of the male connector 2a corresponding to the k-th frame 84 in the real space is set as a reference state.
  • FIG. 8B illustrates an image 94a captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the Y direction.
  • FIG. 8C illustrates an image 94b captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the X direction.
  • FIG. 8D shows an image 94c captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the Z direction.
  • FIG. 8E shows an image 94d captured by the imaging device 21 after the male connector 2a is rotationally moved from the reference state by the unit control amount about the rotational degree of freedom in the pitch direction.
  • FIG. 8F shows an image 94e captured by the imaging device 21 after the male connector 2a is rotationally moved from the reference state by the unit control amount about the rotational degree of freedom in the yaw direction.
  • FIG. 8G shows an image 94f captured by the imaging device 21 after the male connector 2a is rotationally moved from the reference state by the unit control amount about the rotational degree of freedom in the roll direction.
  • the change information generating unit 56 acquires, from the image processing unit 53, the coordinates on the image of each feature point of the male connector 2a extracted from each of the frame 84 and the images 94a to 94f.
  • The change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the Y direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' of the male connector 2a extracted from the frame 84 into the coordinates of the feature points 4a to 4g extracted from the image 94a.
  • Similarly, the change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the X direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94b.
  • The change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the Z direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94c.
  • The change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the pitch direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94d.
  • The change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the yaw direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94e.
  • The change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the roll direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94f. In this way, the change information generation unit 56 generates the first change information set 571 corresponding to the k-th frame 84 of the first reference moving image.
  • the change information generation unit 56 generates the first change information set 571 corresponding to the remaining frames in the teaching range of the first reference moving image by the same method.
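Putting this procedure together: for one reference frame, the set can be built by jogging the target robot by a unit control amount along each degree of freedom and fitting the resulting on-image mapping. jog_axis and capture_points are hypothetical robot/vision helpers, and fit_mapping is the sketch given earlier.

```python
DOFS = ("X", "Y", "Z", "pitch", "yaw", "roll")   # six degrees of freedom of the hand

def build_change_info_set(reference_points, jog_axis, capture_points, unit=1.0):
    """First change information set for one reference frame."""
    change_info_set = {}
    for dof in DOFS:
        jog_axis(dof, unit)                # move by the unit control amount
        moved_points = capture_points()    # feature coordinates after the move
        change_info_set[dof] = fit_mapping(reference_points, moved_points)
        jog_axis(dof, -unit)               # return to the reference state
    return change_info_set
```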
  • The calculation unit 58 calculates the control amounts of the plurality of degrees of freedom for bringing the state of the target object on the real images captured by the imaging devices 21 and 22 closer to the state of the target object on the target frames of the first reference moving image and the second reference moving image, respectively.
  • Specifically, the calculation unit 58 acquires the first change information set 571 and the second change information set 572 corresponding to the target frame from the change information storage unit 57, and calculates the control amounts based on the acquired first change information set 571 and second change information set 572.
  • As described above, the first change information and the second change information indicate a mapping that converts the target object on the image before the target robot is controlled by the unit control amount into the target object on the image after the control.
  • the calculation unit 58 acquires from the image processing unit 53 the coordinates on the image of the feature points of the object extracted from the real image and the coordinates on the image of the feature points of the object extracted from the target frame.
  • the calculating unit 58 calculates a control amount of each of a plurality of degrees of freedom for mapping the target on the real image to the target on the target frame based on the first change information and the second change information.
  • FIG. 9 is a diagram illustrating a method of calculating the control amount by the calculation unit of the first control unit.
  • the calculating unit 58 obtains, from the image processing unit 53, the coordinates on the image of the feature points 4a 'to 4g' of the male connector 2a extracted from the target frame of the first reference moving image. Further, the calculation unit 58 acquires from the image processing unit 53 the coordinates on the image of the characteristic points 4a to 4g of the male connector 2a extracted from the real image obtained by the imaging of the imaging device 21.
  • the number of feature points is not limited to seven.
  • the calculation unit 58 calculates the difference vectors 61a to 61g of each feature point.
  • the difference vectors 61a to 61g are vectors having the feature points 4a to 4g as starting points and the feature points 4a 'to 4g' as end points, respectively.
• The calculation unit 58 calculates the average x component Δx1 and y component Δy1 of the difference vectors 61a to 61g.
• The x component and the y component are expressed in the coordinate system of the image.
• Similarly, the calculation unit 58 calculates the average x component Δx2 and y component Δy2 of the difference vectors between the feature points extracted from the real image captured by the imaging device 22 and the feature points extracted from the target frame of the second reference moving image.
• The calculation unit 58 calculates the control amounts of the three translational degrees of freedom such that the average of the difference vectors of the plurality of feature points is eliminated. Specifically, using Δx1, Δy1, Δx2, and Δy2 together with the first change information set 571 and the second change information set 572, the calculation unit 58 calculates the control amount of each of the translational degrees of freedom of the hand 31a in the X, Y, and Z directions.
• When the hand 31a translates in any of the translational degrees of freedom in the X, Y, and Z directions, the male connector 2a translates in a certain direction on the images captured by the imaging devices 21 and 22. Therefore, the first change information corresponding to each translational degree of freedom in the first change information set 571 indicates a mapping that converts an arbitrary point on the image into a point translated in a certain direction. Similarly, the second change information corresponding to each translational degree of freedom in the second change information set 572 indicates a mapping that converts an arbitrary point on the image into a point translated in a certain direction.
• For example, suppose that the first change information corresponding to the translational degree of freedom in the X direction in the first change information set 571 corresponding to the target frame indicates a mapping that converts an arbitrary point (x, y) on the image into the point (x + dX1_1, y + dY1_1).
  • the first change information corresponding to the degree of freedom of translation in the Y direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX1_2, y + dY1_2).
  • the first change information corresponding to the degree of freedom of translation in the Z direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX1_3, y + dY1_3).
• Likewise, suppose that the second change information corresponding to the translational degree of freedom in the X direction in the second change information set 572 corresponding to the target frame indicates a mapping that converts an arbitrary point (x, y) on the image into the point (x + dX2_1, y + dY2_1).
  • the second change information corresponding to the degree of freedom of translation in the Y direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX2_2, y + dY2_2).
  • the second change information corresponding to the translation degree of freedom in the Z direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX2_3, y + dY2_3).
  • the calculation unit 58 calculates coefficients a1, a2, and a3 by solving the following four linear equations (1) to (4).
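• The four equations themselves do not survive in this text. Reconstructed from the mapping definitions above (a plausible form, not a verbatim reproduction of the original), they equate the combined per-unit image displacements with the average difference vectors on each camera image:

```latex
\begin{aligned}
a_1\,dX1\_1 + a_2\,dX1\_2 + a_3\,dX1\_3 &= \Delta x_1 \quad &(1)\\
a_1\,dY1\_1 + a_2\,dY1\_2 + a_3\,dY1\_3 &= \Delta y_1 \quad &(2)\\
a_1\,dX2\_1 + a_2\,dX2\_2 + a_3\,dX2\_3 &= \Delta x_2 \quad &(3)\\
a_1\,dY2\_1 + a_2\,dY2\_2 + a_3\,dY2\_3 &= \Delta y_2 \quad &(4)
\end{aligned}
```

• With four equations and three unknowns the system is overdetermined, so a least-squares solution is the natural reading of "solving" here (see the sketch after the next paragraphs).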
• The first change information and the second change information indicate the amount of change in the state of the male connector 2a on the image when the robot 30a is controlled by the unit control amount. Therefore, the calculation unit 58 calculates the control amount of the translational degree of freedom in the X direction as a1 times the unit control amount, the control amount of the translational degree of freedom in the Y direction as a2 times the unit control amount, and the control amount of the translational degree of freedom in the Z direction as a3 times the unit control amount.
• These translational control amounts bring the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame by canceling the average of the difference vectors of the plurality of feature points.
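• A minimal sketch of this translational calculation follows, assuming the per-unit image displacements (dX1_j, dY1_j) and (dX2_j, dY2_j) have already been extracted from the change information; the overdetermined system (1) to (4) is solved in the least-squares sense:

```python
import numpy as np

def translational_control(d1, d2, disp1, disp2, unit=1.0):
    """Least-squares solution of equations (1)-(4) above (a sketch).

    d1, d2 : average difference vectors (dx1, dy1) and (dx2, dy2)
    disp1  : 2x3 array; column j holds (dX1_j, dY1_j), the image displacement on
             camera 1 per unit control amount of translational DOF j (X, Y, Z)
    disp2  : 2x3 array; likewise for camera 2
    """
    A = np.vstack([np.asarray(disp1), np.asarray(disp2)])  # 4 equations, 3 unknowns
    b = np.concatenate([np.asarray(d1), np.asarray(d2)])
    (a1, a2, a3), *_ = np.linalg.lstsq(A, b, rcond=None)
    # The control amounts are the coefficients times the unit control amount.
    return a1 * unit, a2 * unit, a3 * unit
```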
  • the calculation unit 58 calculates the control amounts of the three rotational degrees of freedom.
• First, the calculation unit 58 subtracts the above average x component (Δx1 or Δx2) and y component (Δy1 or Δy2) from the difference vector of each feature point to obtain the residual of each difference vector.
• The calculation unit 58 then calculates the control amounts of the three rotational degrees of freedom that bring the residual of the difference vector of each feature point closest to 0.
• Specifically, the calculation unit 58 starts a search algorithm using, as the current solution, the solution in which the control amounts of the rotational degrees of freedom in the pitch, yaw, and roll directions are all 0.
  • the calculation unit 58 simulates a change in the residual of the difference vector of each feature point when the robot 30a is controlled according to each of a plurality of solutions near the current solution.
  • the calculation unit 58 replaces the neighboring solution with the current solution when there is a neighboring solution in which the residual of the difference vector of each feature point is closer to 0 than the current solution based on the simulation result.
  • the calculation unit 58 searches for a solution in which the residual of the difference vector becomes an extreme value by repeating this process.
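• A compact sketch of this neighborhood search is shown below, assuming a hypothetical simulate() hook that uses the change information to predict the residual difference vectors for a candidate solution:

```python
import itertools
import numpy as np

def search_rotation_control(simulate, step=1.0, max_iter=100):
    """Hill-climbing sketch of the rotational control-amount search above.

    simulate: assumed hook that, given candidate control amounts for the
    pitch, yaw, and roll degrees of freedom, returns the simulated residual
    difference vectors of the feature points (based on the change information).
    """
    current = np.zeros(3)                        # pitch, yaw, roll start at 0
    cost = np.square(simulate(current)).sum()
    for _ in range(max_iter):
        # Enumerate neighboring solutions around the current solution.
        neighbors = [current + step * np.array(d)
                     for d in itertools.product((-1, 0, 1), repeat=3) if any(d)]
        costs = [np.square(simulate(n)).sum() for n in neighbors]
        best = int(np.argmin(costs))
        if costs[best] >= cost:                  # no neighbor improves: extremum
            break
        current, cost = neighbors[best], costs[best]
    return current
```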
  • the command unit 59 generates a control command for moving the target robot by the control amount calculated by the calculation unit 58, and outputs the generated control command to the target robot controller.
• The target robot controller is the robot controller 40a in the first control unit 55a, and the robot controller 40b in the second control unit 55b.
• End determination unit: the end determination unit 60 calculates the deviation between the state of the object on the real image and the state of the object on the last frame of the teaching range, and determines that the control of the target robot is to be ended when the calculated deviation is less than a predetermined threshold. When determining that the control of the target robot is to be ended, the end determination unit 60 outputs an end notification.
  • the deviation is, for example, an average of distances between corresponding feature points of the target object extracted from the real image and the final frame.
  • the threshold value is set according to the accuracy required for the state of the object.
• The threshold is the threshold Tha in the first control unit 55a, and the threshold Thb in the second control unit 55b.
  • the threshold value Tha and the threshold value Thb may be the same or different.
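• The deviation used here (and in the selection processing below) is simply the average distance between corresponding feature points; a one-function sketch:

```python
import numpy as np

def deviation(feat_a, feat_b):
    """Average distance between corresponding feature points: the 'deviation'
    used by the end determination and target frame selection described here."""
    diff = np.asarray(feat_a, dtype=float) - np.asarray(feat_b, dtype=float)
    return float(np.linalg.norm(diff, axis=1).mean())
```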
  • FIG. 10 is a flowchart illustrating an example of a flow of a change information generation process performed by the change information generation unit.
  • FIG. 10 shows a flow of processing of the change information generation unit 56 of the first control unit 55a.
• The change information generation unit 56 of the second control unit 55b may generate the change information by the same method as in FIG. 10.
  • the process of generating change information is performed as advance preparation.
• In step S1, the control device 50 causes the robot 30a to perform a fixed operation of holding the male connector 2a conveyed to a predetermined position with the hand 31a and moving the male connector 2a from above to below. Thereby, the male connector 2a is moved into the field of view of the imaging devices 21 and 22.
• In step S2, the change information generation unit 56 controls the robot 30a such that the same images as the first frames of the teaching ranges of the first reference moving image and the second reference moving image are captured by the imaging devices 21 and 22, respectively.
  • the subroutine of step S2 will be described later.
• In step S3, the change information generation unit 56 sets k to the frame number of the first frame of the teaching range.
• In step S4, the change information generation unit 56 selects one of the six degrees of freedom.
• In step S5, the change information generation unit 56 generates a control command for moving the hand 31a by the unit control amount in the positive direction of the selected degree of freedom, and outputs the control command to the robot controller 40a.
• In step S6, the change information generation unit 56 acquires the latest real images from the imaging devices 21 and 22 after the hand 31a has moved by the unit control amount.
• In step S7, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S6.
• In step S8, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the k-th frames of the first reference moving image and the second reference moving image.
• In step S9, the change information generation unit 56 generates the first change information and the second change information corresponding to the k-th frame and the degree of freedom selected in step S4, based on the coordinates acquired in step S7 and the coordinates acquired in step S8. That is, the change information generation unit 56 generates, as the first change information, information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 21 into the coordinates of the feature points extracted from the k-th frame of the first reference moving image. Further, the change information generation unit 56 generates, as the second change information, information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 22 into the coordinates of the feature points extracted from the k-th frame of the second reference moving image.
• In step S10, the change information generation unit 56 generates a control command for returning the hand 31a to its original state (the state before the latest step S5), and outputs the control command to the robot controller 40a. Thereby, the male connector 2a returns to the state before step S5.
• In step S11, the change information generation unit 56 determines whether there is any unselected degree of freedom. If there is an unselected degree of freedom (YES in step S11), the change information generation process returns to step S4. As a result, steps S4 to S10 are repeated, and the first change information and the second change information are generated for each of the six degrees of freedom.
• In step S12, the change information generation unit 56 determines whether k is the frame number of the last frame of the teaching range.
• If k is not the frame number of the last frame (NO in step S12), in step S13 the change information generation unit 56 controls the robot 30a such that the same images as the (k + 1)-th frames of the first reference moving image and the second reference moving image are captured by the imaging devices 21 and 22, respectively.
• Specifically, by the same method as the processing of the calculation unit 58 described above, the change information generation unit 56 calculates control amounts of the six degrees of freedom for bringing the state of the male connector 2a on the k-th frame closer to the state of the male connector 2a on the (k + 1)-th frame. That is, for each of the plurality of feature points extracted from the k-th frame, the change information generation unit 56 obtains the difference vector having that feature point as its starting point and the corresponding feature point extracted from the (k + 1)-th frame as its end point.
• The change information generation unit 56 then calculates the control amounts of the six degrees of freedom based on the difference vector of each feature point and the first change information and the second change information corresponding to the k-th frame.
  • the change information generation unit 56 generates a control command indicating the calculated control amount, and outputs the control command to the robot controller 40a.
• In step S14, the change information generation unit 56 adds 1 to k. After step S14, the process returns to step S4. As a result, the first change information and the second change information for each of the six degrees of freedom are also generated for the (k + 1)-th frame.
• If k is the frame number of the last frame (YES in step S12), the first change information set 571 and the second change information set 572 have been generated for all frames, and the change information generation process ends.
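• The overall loop of FIG. 10 can be summarized as follows. This is a sketch only; the five callables are hypothetical hooks standing in for the robot commands, image acquisition, image processing, and mapping estimation described above:

```python
def generate_change_information(move_hand, capture_features, frame_features,
                                servo_to_frame, make_mapping,
                                first_k, last_k, dofs=range(6), unit=1.0):
    """Outline of steps S3 to S14 above. The callables are assumed hooks:
    move_hand(dof, amount)  -- command the robot by a signed control amount
    capture_features()      -- feature coordinates on the two real images
    frame_features(k)       -- feature coordinates on frame k of both videos
    servo_to_frame(k)       -- step S13: drive the object to the state of frame k
    make_mapping(src, dst)  -- fit the mapping stored as change information
    """
    change_info = {}
    k = first_k                                   # step S3
    while True:
        for dof in dofs:                          # steps S4 and S11
            move_hand(dof, +unit)                 # step S5
            real1, real2 = capture_features()     # steps S6 and S7
            ref1, ref2 = frame_features(k)        # step S8
            change_info[(k, dof)] = (make_mapping(real1, ref1),   # step S9
                                     make_mapping(real2, ref2))
            move_hand(dof, -unit)                 # step S10: undo the trial move
        if k == last_k:                           # step S12: done at the last frame
            return change_info
        servo_to_frame(k + 1)                     # step S13
        k += 1                                    # step S14
```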
• FIG. 11 is a flowchart showing the flow of the processing of the subroutine of step S2 shown in FIG. 10.
• In step S21, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the first frames of the teaching ranges of the first reference moving image and the second reference moving image.
• In step S22, the change information generation unit 56 acquires the latest real images from the imaging devices 21 and 22.
• In step S23, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S22.
• In step S24, the change information generation unit 56 determines whether the deviation between the state of the male connector 2a on the real images acquired in step S22 and the state of the male connector 2a on the first frame is less than the threshold Tha.
  • the deviation is, for example, an average of distances between corresponding feature points of the male connector 2a extracted from the real image and the first frame.
• If the deviation is less than the threshold Tha (YES in step S24), the change information generation unit 56 determines that the same images as the first frames are captured by the imaging devices 21 and 22, respectively, and ends the process. Otherwise (NO in step S24), the process proceeds to step S25.
• In step S25, the change information generation unit 56 selects one of the six degrees of freedom.
• In step S26, the change information generation unit 56 selects one of the positive direction and the negative direction as the control direction.
• In step S27, the change information generation unit 56 generates a control command for moving the hand 31a by the unit control amount in the selected control direction of the selected degree of freedom, and outputs the control command to the robot controller 40a.
• In step S28, the change information generation unit 56 acquires the real images from the imaging devices 21 and 22 after the hand 31a has moved by the unit control amount.
• In step S29, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S28.
• In step S30, the change information generation unit 56 calculates the deviation between the state of the male connector 2a on the real images acquired in step S28 and the state of the male connector 2a on the first frame.
• In step S31, the change information generation unit 56 generates a control command for returning the hand 31a to its original state (the state before the latest step S27), and outputs the control command to the robot controller 40a. Thereby, the male connector 2a returns to the state before step S27.
• In step S32, the change information generation unit 56 determines whether there is an unselected control direction. If there is an unselected control direction (YES in step S32), the process returns to step S26.
• In step S33, the change information generation unit 56 determines whether there is an unselected degree of freedom. If there is an unselected degree of freedom (YES in step S33), the process returns to step S25.
• In step S34, the change information generation unit 56 controls the robot 30a via the robot controller 40a so as to move the hand 31a by the unit control amount in the degree of freedom and control direction corresponding to the minimum of the deviations calculated in step S30. After step S34, the process returns to step S22.
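• The subroutine amounts to a greedy probe of all twelve candidate moves per cycle. A sketch under the same assumed hooks as before:

```python
def approach_first_frame(move_hand, capture_features, first_frame_feats,
                         deviation, threshold, dofs=range(6), unit=1.0):
    """Sketch of the step S2 subroutine (FIG. 11): try a unit move in each
    degree of freedom and direction, keep the move that best reduces the
    deviation from the first frame, and repeat until below the threshold.
    move_hand, capture_features, and deviation are assumed hooks."""
    while True:
        real = capture_features()                         # steps S22 and S23
        if deviation(real, first_frame_feats) < threshold:
            return                                        # step S24: close enough
        trials = {}
        for dof in dofs:                                  # steps S25 and S33
            for direction in (+1, -1):                    # steps S26 and S32
                move_hand(dof, direction * unit)          # step S27
                trial = capture_features()                # steps S28 and S29
                trials[(dof, direction)] = deviation(trial, first_frame_feats)  # step S30
                move_hand(dof, -direction * unit)         # step S31: undo
        dof, direction = min(trials, key=trials.get)      # step S34: best trial move
        move_hand(dof, direction * unit)
```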
  • FIG. 12 is a flowchart illustrating an example of a process flow of controlling the target robot such that the state of the target object changes along the first reference moving image and the second reference moving image.
• In step S41, the control device 50 determines whether the end notification has been output from the end determination units 60 of all the control units (the first control unit 55a and the second control unit 55b). When the end notification has been output from the end determination units 60 of all the control units (YES in step S41), the process ends.
• In step S42, the control device 50 acquires the real images captured by the imaging devices 21 and 22. Step S42 is performed for each imaging cycle.
• In step S43, the image processing unit 53 detects all the objects (the male connector 2a and the female connector 2b) from the real images and the target frame by template matching, and extracts the coordinates of the feature points of each object.
• In step S44, the target frame selection unit 54 selects a target frame from the first reference moving image and the second reference moving image.
• In step S45, the target frame selection unit 54 specifies the object whose teaching range includes the target frame, and outputs a control instruction to the control unit that controls the specified object (at least one of the first control unit 55a and the second control unit 55b).
• In step S46, the control unit that has received the control instruction (at least one of the first control unit 55a and the second control unit 55b) controls the target robot.
• After step S46, the process returns to step S41. If NO in step S41, the series of processes in steps S42 to S46 is repeated for each imaging cycle. At this time, step S42 of acquiring the next real image may be started while the target robot is still being controlled in step S46. Thus, the target robot is continuously controlled according to the latest real image without stopping its operation. As a result, the state of the object can be changed quickly.
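• The per-imaging-cycle loop of FIG. 12 can be sketched as follows; the callables are hypothetical hooks for the processing blocks named above:

```python
def control_loop(all_done, capture_images, extract_features,
                 select_target_frame, dispatch_control):
    """Sketch of FIG. 12 (assumed hooks).

    all_done()            -- True when every control unit has issued its end notification
    capture_images()      -- step S42: latest real images from imaging devices 21 and 22
    extract_features(im)  -- step S43: detect the objects and extract feature coordinates
    select_target_frame() -- step S44: pick the target frame from the reference videos
    dispatch_control(tf)  -- steps S45 and S46: instruct the relevant control units
    """
    while not all_done():                      # step S41
        images = capture_images()              # step S42 (may overlap ongoing motion)
        extract_features(images)               # step S43
        target_frame = select_target_frame()   # step S44
        dispatch_control(target_frame)         # steps S45 and S46
```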
• FIG. 13 is a flowchart showing the flow of the processing of the subroutine of step S44 shown in FIG. 12.
  • FIG. 14 is a diagram illustrating the relationship between the closest frame and the target frame.
• In step S51, the target frame selection unit 54 acquires from the image processing unit 53 the coordinates of the feature points of all the objects extracted from each frame of the first reference moving image and the second reference moving image.
• In step S52, the target frame selection unit 54 acquires from the image processing unit 53 the coordinates of the feature points of all the objects extracted from the real images captured by the imaging devices 21 and 22.
• In step S53, the target frame selection unit 54 determines whether the first selection of a target frame has been completed.
• If the first selection has been completed (YES in step S53), in step S54 the target frame selection unit 54 determines whether the previous target frame is the last frame of the teaching range corresponding to any one of the objects.
• If the previous target frame is such a last frame (YES in step S54), in step S55 the target frame selection unit 54 calculates, for the object corresponding to the teaching range to which that last frame belongs, the deviation between the state on the real image and the state on the last frame, and determines whether the deviation is less than the threshold.
  • the deviation is, for example, an average of distances between corresponding feature points of the object in the real image and the final frame.
• If the deviation is equal to or greater than the threshold (NO in step S55), in step S56 the target frame selection unit 54 selects the same frame as the previous target frame as the target frame. After step S56, the process ends.
• If the deviation is less than the threshold (YES in step S55), it is determined that the state of the object has reached the state on the last frame, and the process proceeds to step S57. Likewise, if the first selection has not been completed (NO in step S53), or if the previous target frame is not a last frame (NO in step S54), the process proceeds to step S57.
• In step S57, the target frame selection unit 54 calculates the deviation between the states of all the objects on the real images and the states of all the objects on each frame, and specifies the frame with the minimum deviation as the closest frame.
  • the deviation is, for example, an average of distances between corresponding feature points of the object in the real image and each frame.
• Specifically, the target frame selection unit 54 calculates a first deviation between the states of all the objects on the real image captured by the imaging device 21 and the states of all the objects on the k-th frame of the first reference moving image. Further, the target frame selection unit 54 calculates a second deviation between the states of all the objects on the real image captured by the imaging device 22 and the states of all the objects on the k-th frame of the second reference moving image. The target frame selection unit 54 calculates the average of the first deviation and the second deviation as the deviation corresponding to the k-th frame.
• In step S58, the target frame selection unit 54 determines whether the last frame of any of the teaching ranges exists among the frames up to a predetermined number of frames after the closest frame.
• If such a last frame exists (YES in step S58), in step S59 the target frame selection unit 54 selects that last frame as the target frame.
• If there are a plurality of such last frames, the target frame selection unit 54 selects the last frame having the smallest frame number among them.
• Otherwise (NO in step S58), in step S60 the target frame selection unit 54 selects, as the target frame, the frame that is the predetermined number of frames after the closest frame. After step S60, the target frame selection process ends.
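• The selection logic of steps S53 to S60 can be condensed into the following sketch. It simplifies step S55 by computing the deviation over all objects rather than only the object of the pending teaching range, and it omits clamping to the final frame number; the helper arguments are assumptions:

```python
def select_target_frame(prev_target, real_feats, frame_feats, teaching_ranges,
                        deviation, threshold, lookahead):
    """Sketch of FIG. 13 (steps S53 to S60).

    frame_feats[k]   -- feature coordinates of all objects on frame k
    teaching_ranges  -- per-object (first_frame, last_frame) pairs
    lookahead        -- the 'predetermined number' of frames past the closest frame
    """
    last_frames = {last for _, last in teaching_ranges.values()}
    # Steps S53-S56: keep the previous target frame while a last frame is pending.
    if prev_target is not None and prev_target in last_frames:
        if deviation(real_feats, frame_feats[prev_target]) >= threshold:
            return prev_target
    # Step S57: the closest frame minimizes the deviation over all frames.
    closest = min(frame_feats, key=lambda k: deviation(real_feats, frame_feats[k]))
    # Steps S58-S59: a last frame inside the lookahead window takes priority.
    window = [k for k in range(closest, closest + lookahead + 1) if k in last_frames]
    if window:
        return min(window)
    # Step S60: otherwise jump ahead by the predetermined number of frames.
    return closest + lookahead
```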
• FIG. 15 is a flowchart showing the flow of the processing of the subroutine of step S46 shown in FIG. 12.
• In step S61, the end determination unit 60 determines whether the target frame is the last frame of the teaching range.
• If the target frame is the last frame (YES in step S61), in step S62 the end determination unit 60 determines whether the deviation between the state of the object on the real image and the state of the object on the last frame is less than the threshold.
• If the deviation is less than the threshold (YES in step S62), in step S63 the end determination unit 60 outputs an end notification. After step S63, the process ends.
• If the target frame is not the last frame (NO in step S61), or if the deviation is equal to or greater than the threshold (NO in step S62), the process proceeds to step S64.
• In step S64, the calculation unit 58 calculates, based on the first change information set 571 and the second change information set 572 corresponding to the target frame, the control amount of each of the plurality of degrees of freedom for bringing the state of the object on the real image closer to the state of the object on the target frame.
• In step S65, the command unit 59 generates a control command indicating the calculated control amounts and outputs the control command to the target robot controller. After step S65, the process ends.
• The first control unit 55a and the second control unit 55b each control their target robot according to the flow shown in FIG. 15.
• Thereby, the robots 30a and 30b are controlled such that the states of the male connector 2a and the female connector 2b on the real images approach the states of the male connector 2a and the female connector 2b on the target frame, respectively.
• As a result, the states of the male connector 2a and the female connector 2b change in a coordinated manner according to the target frame.
  • the control system 1 includes the robots 30a and 30b, the imaging devices 21 and 22 for imaging the male connector 2a and the female connector 2b, and the control device 50 for controlling the robots 30a and 30b.
  • the robots 30a and 30b change the states of the male connector 2a and the female connector 2b, respectively.
• The imaging devices 21 and 22 are installed at fixed positions different from the robots 30a and 30b, and image the male connector 2a and the female connector 2b supported by the robots 30a and 30b, respectively.
  • the control device 50 stores a first reference moving image and a second reference moving image showing samples of the male connector 2a and the female connector 2b.
  • the first reference moving image and the second reference moving image include a plurality of frames arranged in chronological order.
  • the control device 50 acquires change information for each of the male connector 2a and the female connector 2b.
  • the change information corresponding to the male connector 2a indicates the relationship between the control amount of the robot 30a and the change amount of the state of the male connector 2a on the images of the imaging devices 21 and 22.
  • the change information corresponding to the female connector 2b indicates the relationship between the control amount of the robot 30b and the change amount of the state of the female connector 2b on the images of the imaging devices 21 and 22.
• The control device 50 executes a first process of acquiring the real images captured by the imaging devices 21 and 22, a second process of selecting a target frame from the plurality of frames, and a third process of controlling each of the robots 30a and 30b based on the real images and the target frame.
• In the third process, the control device 50 calculates, based on the change information corresponding to the male connector 2a, the control amount of the robot 30a for bringing the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame, and controls the robot 30a according to the calculated control amount.
• Similarly, the control device 50 calculates, based on the change information corresponding to the female connector 2b, the control amount of the robot 30b for bringing the state of the female connector 2b on the real image closer to the state of the female connector 2b on the target frame, and controls the robot 30b according to the calculated control amount.
• Thereby, the states of the male connector 2a and the female connector 2b on the real image can be brought closer to the states of the male connector 2a and the female connector 2b on the target frame, respectively. That is, the states of the male connector 2a and the female connector 2b change in conjunction with each other according to the target frame.
  • the control device 50 repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process.
  • the robots 30a and 30b are continuously controlled according to the latest actual image without stopping the operations of the robots 30a and 30b.
  • the states of the male connector 2a and the female connector 2b can be changed quickly.
  • FIG. 16 is a diagram showing another example of the reference moving image that is a sample of the connection between the male connector and the female connector.
• FIG. 16 shows a reference moving image in which the male connector 2a moves toward the female connector 2b along a direction parallel to the substrate 5 and is connected to the female connector 2b.
• That is, the male connector 2a moves along the L-shaped path A.
  • the frame 74 shows the male connector 2a located above the substrate 5.
  • the frame 75 shows the male connector 2a reaching the upper surface of the substrate 5.
  • the frame 76 shows the male connector 2a connected to the female connector 2b.
• If the male connector 2a proceeds to the state shown in the frame 76 without passing through the state shown in the frame 75, the male connector 2a approaches the female connector 2b from an inclined direction. Therefore, there is a possibility that the pins of the male connector 2a cannot be inserted into the insertion holes of the female connector 2b.
  • the target frame selection unit 54 of the present modification receives the specification of a frame to be passed (hereinafter, referred to as a “pass-necessary frame”), and always selects the specified pass-necessary frame as a target frame.
• When receiving the designation of a pass-necessary frame, the target frame selection unit 54 also receives the designation of the object that is to pass through the state indicated by the pass-necessary frame.
• For example, the target frame selection unit 54 displays, on the display unit 532 (see FIG. 4), a screen prompting the operator to designate a pass-necessary frame.
• The operator checks each frame of the first reference moving image and the second reference moving image and operates the input device 534 (see FIG. 4) to designate the pass-necessary frame and the object that is to pass through the state indicated by the pass-necessary frame (the object corresponding to the pass-necessary frame).
  • FIG. 17 is a flowchart illustrating an example of the flow of a target frame selection process according to the first modification of the first embodiment.
  • the flowchart shown in FIG. 17 is different from the flowchart shown in FIG. 13 only in that steps S74, S78, and S79 are included instead of steps S54, S58, and S59. Therefore, only steps S74, S78, and S79 will be described.
• In step S74, the target frame selection unit 54 determines whether the previous target frame is the last frame or a pass-necessary frame.
• In step S78, the target frame selection unit 54 determines whether a frame that is the last frame or a pass-necessary frame exists among the frames up to a predetermined number of frames after the closest frame.
• If such a frame exists (YES in step S78), in step S79 the target frame selection unit 54 selects that frame as the target frame. If there are a plurality of frames that are the last frame or a pass-necessary frame within the predetermined number of frames after the closest frame, the target frame selection unit 54 selects the frame having the smallest frame number among them.
• In this way, a pass-necessary frame is always selected as the target frame. Further, when a pass-necessary frame is selected as the target frame, the target frame is updated only after the deviation between the state of the object on the real image and the state on the pass-necessary frame becomes less than the threshold. This makes it possible to reliably cause the state of the object corresponding to the pass-necessary frame to pass through the state on the pass-necessary frame.
• The target frame selection unit 54 may determine that the control system 1 is abnormal when the state of the object corresponding to the pass-necessary frame does not match the state on the pass-necessary frame even after a specified time has elapsed since the pass-necessary frame was selected as the target frame.
  • FIG. 18 is a flowchart showing an example of the flow of the abnormality determination process.
• The target frame selection unit 54 resets a timer when selecting a pass-necessary frame as the target frame.
• In step S82, the target frame selection unit 54 determines, for the object corresponding to the pass-necessary frame, whether the deviation between the state on the real images captured by the imaging devices 21 and 22 and the state on the pass-necessary frame is less than the threshold. If the deviation is less than the threshold (YES in step S82), the abnormality determination process ends.
• If the deviation is equal to or greater than the threshold (NO in step S82), in step S83 the target frame selection unit 54 determines whether the timer value exceeds the specified time.
• If the timer value does not exceed the specified time (NO in step S83), the abnormality determination process returns to step S82.
• If the timer value exceeds the specified time (YES in step S83), in step S84 the target frame selection unit 54 determines that some abnormality has occurred in the control system 1. Thereby, countermeasures against the abnormality of the control system 1 can be started quickly. After step S84, the abnormality determination process ends.
• In this case, the control device 50 may notify the operator of the occurrence of the abnormality via the display unit 532, or may stop the control of the robots 30a and 30b.
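• A small sketch of this watchdog, with deviation_now as an assumed callable returning the current deviation of the object from its state on the pass-necessary frame:

```python
import time

def monitor_pass_frame(deviation_now, threshold, specified_time, poll=0.01):
    """Sketch of the abnormality determination of FIG. 18. Returns True if the
    state reaches the pass-necessary frame in time, False on abnormality."""
    start = time.monotonic()                  # timer reset on target-frame selection
    while deviation_now() >= threshold:       # step S82
        if time.monotonic() - start > specified_time:   # step S83
            return False                      # step S84: abnormality detected
        time.sleep(poll)                      # wait for the next imaging cycle
    return True                               # state reached in time
```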
• In the above description, the control system 1 connects (assembles) the male connector 2a and the female connector 2b. However, the control system 1 may assemble two other objects.
  • FIG. 19 is a schematic diagram illustrating an object of the control system according to the second modification of the first embodiment.
  • the control system 1 according to the second modification causes the upper case 6a and the lower case 6b to engage with each other in a production line of an industrial product or the like.
  • the upper case 6a and the lower case 6b have a substantially rectangular box shape in plan view.
  • the upper case 6a is arranged so as to open downward.
  • the lower case 6b is arranged so as to open upward.
• Two engagement claws 7a and 7b are formed at the upper end of the lower case 6b on the near side in the drawing.
  • Two engaging claws are also formed at the upper end of the lower case 6b on the far side in the drawing.
  • the upper case 6a moves downward from above the lower case 6b and engages with four engagement claws of the lower case 6b.
  • the upper case 6a is gripped by the hand 31a (see FIG. 1) of the robot 30a.
  • the lower case 6b is mounted on a stage 31b (see FIG. 1) of the robot 30b.
  • the robot 30a further includes control rods 32a to 32d (see FIG. 19) for changing the state (the shape here) of the upper case 6a with two degrees of freedom.
• The robot 30a can apply a force corresponding to the control amount to the mutually facing control rods 32a and 32b in the direction in which they approach each other.
• Similarly, the robot 30a can apply a force corresponding to the control amount to the mutually facing control rods 32c and 32d in the direction in which they approach each other.
• The control rods 32a and 32b are in contact with two opposite side walls of the upper case 6a, and the control rods 32c and 32d are in contact with the other two opposite side walls of the upper case 6a. Therefore, when a force in the approaching direction is applied to the control rods, the upper case 6a is deformed. The amount of deformation differs depending on the control amount.
• The imaging devices 21 and 22 are arranged at positions where the engagement claws 7a and 7b on the near side of the lower case 6b can be imaged. However, the imaging devices 21 and 22 cannot image the two engagement claws on the far side of the lower case 6b in the drawing. Therefore, the control system 1 according to the second modification of the first embodiment further includes imaging devices 23 and 24. The imaging devices 23 and 24 are arranged at positions where the two engagement claws on the far side of the lower case 6b in the drawing can be imaged.
  • the control device 50 of the second modification stores four reference moving images respectively corresponding to the imaging devices 21 to 24.
  • the change information generating unit 56 may generate four change information sets respectively corresponding to the four reference moving images.
  • the robot 30a changes the position and orientation of the upper case 6a with six degrees of freedom, and changes the shape of the upper case 6a with two degrees of freedom. Therefore, the change information generation unit 56 generates the first change information and the second change information corresponding to each of the eight degrees of freedom for the upper case 6a.
• As the reference moving image, a moving image showing how the upper case 6a is engaged with the lower case 6b after the shape of the upper case 6a is deformed so as to easily engage with the four engagement claws of the lower case 6b is prepared in advance.
• The calculation unit 58 calculates the control amount for the control rods 32a and 32b and the control amount for the control rods 32c and 32d for bringing the shape of the upper case 6a on the real image closer to the shape of the upper case 6a on the target frame.
• Thereby, the upper case 6a can be deformed in the same manner as in the reference moving image, and the upper case 6a can be easily engaged with the lower case 6b.
• The calculation unit 58 may calculate, by another method, the control amount of each of the plurality of degrees of freedom for bringing the state of the object on the real images captured by the imaging devices 21 and 22 closer to the state of the object on the target frame.
• For example, the change information generation unit 56 calculates the control amount of each of the plurality of degrees of freedom for bringing the state of the object on the k-th frame closer to the state of the object on the (k + 1)-th frame.
• The calculation unit 58 stores the control amounts calculated for each pair of consecutive frames as inter-frame control amounts, and may use the inter-frame control amounts to calculate the control amount of each of the plurality of degrees of freedom for bringing the state of the object on the real image closer to the state of the object on the target frame.
• Specifically, the calculation unit 58 calculates, based on the first change information and the second change information corresponding to the closest frame, a control amount α of each of the plurality of degrees of freedom for bringing the state of the object on the real image closer to the state of the object on the closest frame. Further, the calculation unit 58 calculates the sum β of the inter-frame control amounts from the closest frame to the target frame. The calculation unit 58 may then calculate the sum of the control amount α and the sum β as the control amount for bringing the state of the object on the real image closer to the state of the object on the target frame.
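• The α + Σβ composition is simple enough to state in a few lines; the argument layout is an assumption for illustration:

```python
import numpy as np

def control_via_interframe_amounts(alpha, interframe, closest_k, target_k):
    """Sketch of the alternative described above: combine the control amount
    alpha (real image -> closest frame) with the stored inter-frame control
    amounts (closest frame -> target frame). interframe[k] holds the control
    amounts from frame k to frame k + 1; all quantities are vectors over the
    degrees of freedom."""
    beta = sum(np.asarray(interframe[k], dtype=float)
               for k in range(closest_k, target_k))
    return np.asarray(alpha, dtype=float) + beta
```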
• Alternatively, the calculation unit 58 may calculate the control amounts of the plurality of degrees of freedom by performing known model predictive control (Adachi, "Basics of Model Predictive Control", Journal of the Robotics Society of Japan, July 2014, Vol. 32, No. 6, p. 9-12) (Non-Patent Document 2).
• In this case, the target frame selection unit 54 selects, as target frames, a plurality of frames included in the prediction horizon period within the teaching range.
• The calculation unit 58 calculates the control amounts during the control horizon period so as to minimize the deviation between the states of the object on the target frames and the states of the object on the images captured by the imaging devices 21 and 22 during the prediction horizon period.
  • the image processing unit 53 may detect the target from the target image using the 3D-CAD data of the target.
• The reference moving images (the first reference moving image and the second reference moving image) may be created by CG (Computer Graphics).
• In the above description, the change information generation unit 56 generates, as the change information, information indicating the mapping that converts the coordinates of the feature points of the object extracted from the target frame into the coordinates of the corresponding feature points extracted from the captured image. However, the change information generation unit 56 may instead generate, as the change information, information indicating the mapping that converts the coordinates of the feature points of the object extracted from the captured image into the coordinates of the corresponding feature points extracted from the target frame.
  • the control device 50 may display the first reference moving image and the second reference moving image on the display unit 532, receive a moving image editing instruction from an operator, and edit the moving image. For example, the worker may delete unnecessary frames from the first reference moving image and the second reference moving image.
  • the robot 30a may change the state (here, the size) of an object that expands and contracts, such as a balloon, for example.
• In this case, the control device 50 acquires change information indicating the change in the size of the object when the robot 30a is controlled by the unit control amount. Then, the control device 50 controls the robot 30a based on the change information such that the size of the object on the real image approaches the size of the object on the target frame.
• In the above description, the male connector 2a and the female connector 2b are connected by moving the male connector 2a toward the female connector 2b. However, the male connector 2a may be placed on the stage 31b, and the female connector 2b may be gripped by the hand 31a. In this case, the male connector 2a and the female connector 2b are connected by moving the female connector 2b toward the male connector 2a.
• In the above description, the reference moving image storage unit 51 of the control device 50 stores the reference moving images. However, a device external to the control device 50 may store the reference moving images.
  • the reference moving image storage unit 51 may store, instead of or in addition to the reference moving image, the coordinates and characteristic amounts of the feature points of each object extracted from each frame of the reference moving image. Thus, the processing of the image processing unit 53 for each frame of the reference moving image can be omitted.
  • FIG. 20 is a schematic diagram illustrating an outline of a control system according to the second embodiment.
  • the control system 1A solders the electric wires 8e to the pads 8f of the substrate 9 using the soldering iron 8c and the solder feeder 8d in a production line of an industrial product or the like.
• The control system 1A is different from the control system 1 shown in FIG. 1 in that it includes robots 30c to 30f, robot controllers 40c to 40f, and a control device 50A instead of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
  • the imaging devices 21 and 22 image the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f as objects from different directions.
  • the robot 30c is a mechanism for changing the state (the position and the posture here) of the soldering iron 8c, and is, for example, a vertical articulated robot.
  • the robot 30c has a hand 31c at its tip for holding the soldering iron 8c, and changes the position and posture of the hand 31c with six degrees of freedom.
  • the robot 30d is a mechanism for changing the state (here, position and posture) of the solder feeder 8d, and is, for example, a vertical articulated robot.
  • the robot 30d has a hand 31d holding the solder feeder 8d at the tip, and changes the position and posture of the hand 31d with six degrees of freedom.
  • the robot 30e is a mechanism for changing the state (here, position and posture) of the electric wire 8e, and is, for example, a vertical articulated robot.
  • the robot 30e has a hand 31e holding the electric wire 8e at the tip, and changes the position and the posture of the hand 31e with six degrees of freedom.
• The robot 30f is a mechanism for changing the state (in this case, the position and posture) of the pad 8f on the substrate 9, and is, for example, an XYθ stage.
  • the robot 30f has a stage 31f on which the substrate 9 is placed, and changes the position and posture of the stage 31f with three degrees of freedom.
• The robots 30c to 30e have the same configuration as the robot 30a shown in FIG. 1.
  • the robot 30f has the same configuration as the robot 30b shown in FIG. 1 except that the target placed on the stage 31f is different.
  • the robot controllers 40c to 40e control the operations of the robots 30c to 30e, respectively, according to the control commands received from the control device 50A.
• The robot controllers 40c to 40e have the same configuration as the robot controller 40a shown in FIG. 1.
  • the robot controller 40f controls the operation of the robot 30f according to the control command received from the control device 50A.
• The robot controller 40f has the same configuration as the robot controller 40b shown in FIG. 1.
  • the control device 50A controls the robots 30c to 30f via the robot controllers 40c to 40f, respectively.
  • the control device 50A stores a first reference moving image and a second reference moving image showing samples of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f.
  • the first reference moving image is a moving image when viewed from the position of the imaging device 21.
  • the second reference moving image is a moving image when viewed from the position of the imaging device 22.
  • FIG. 21 is a diagram illustrating an example of a reference moving image (first reference moving image or second reference moving image) according to the second embodiment.
  • FIG. 21 shows frames 77 to 83 of the reference moving image.
  • the frame 77 is a frame when the state of the pad 8f has reached a desired state.
  • the frame 78 is a frame when the electric wire 8e reaches the pad 8f.
  • the frame 79 is a frame when the tip of the soldering iron 8c contacts the pad 8f.
  • the frame 80 is a frame when the tip of the solder feeder 8d contacts the tip of the soldering iron 8c.
  • the frame 81 is a frame when the solder supplied from the solder feeder 8d and melted (hereinafter referred to as “molten solder 8g”) has a desired size.
  • the frame 82 is a frame when the solder feeder 8d is moved away from the pad 8f.
  • the frame 83 is a frame when the soldering iron 8c is moved away from the pad 8f.
• The control device 50A acquires change information indicating the change in the state of the soldering iron 8c on the images obtained from the imaging devices 21 and 22 when the robot 30c is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the solder feeder 8d on the images obtained from the imaging devices 21 and 22 when the robot 30d is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the electric wire 8e on the images obtained from the imaging devices 21 and 22 when the robot 30e is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the pad 8f on the images obtained from the imaging devices 21 and 22 when the robot 30f is controlled by the unit control amount for each of the three degrees of freedom.
• As in the first embodiment, the control device 50A controls each target robot via the corresponding target robot controller such that the state of each object on the real images captured by the imaging devices 21 and 22 approaches the state of that object on the target frames of the first reference moving image and the second reference moving image.
  • the objects are the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f.
  • the target robot controllers are the robot controllers 40c to 40f.
  • the target robots are the robots 30c to 30f.
• Further, the control device 50A calculates a first deviation between the state of the first object on the real image and the state of the first object on the first target frame, and a second deviation between the state of the second object on the real image and the state of the second object on the second target frame.
• The control device 50A controls the robots 30c to 30f such that the time at which the first deviation becomes less than the first threshold and the time at which the second deviation becomes less than the second threshold satisfy a prescribed condition. This makes it possible to control the times at which the states of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f reach the states on the target frames.
  • control device 50A has a hardware configuration as shown in FIG. 4, as in the first embodiment. Therefore, detailed description of the hardware configuration of the control device 50A is omitted.
  • FIG. 22 is a block diagram showing a functional configuration of the control device according to the second embodiment.
• Compared with the control device 50 shown in FIG. 5, the control device 50A includes a target frame selection unit 54A instead of the target frame selection unit 54, and includes a first control unit 55c, a second control unit 55d, a third control unit 55e, and a fourth control unit 55f instead of the first control unit 55a and the second control unit 55b.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image showing samples of four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
  • the teaching range selection unit 52 selects a teaching range for each of the four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
  • the image processing unit 53 detects four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f) from the target image.
  • the image processing unit 53 also detects the molten solder 8g (see FIG. 21) on the pad 8f from the target image as a target.
  • the image processing unit 53 creates a template of the molten solder 8g in advance, and extracts a feature point and a feature amount of the molten solder 8g from the target image.
• The state (here, the size) of the molten solder 8g changes according to the amount of the molten solder. Therefore, it is preferable that the image processing unit 53 performs color extraction and labeling on the target image, and extracts SIFT features, which are not easily affected by enlargement and reduction, as the feature amount. Thereby, the image processing unit 53 can easily detect the molten solder 8g from the target image.
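• A minimal sketch of such a detector using OpenCV follows; the HSV color bounds for the solder are assumed parameters that would have to be tuned for the actual line:

```python
import cv2
import numpy as np

def detect_molten_solder(image_bgr, lower_hsv, upper_hsv):
    """Color extraction and labeling, then scale-robust SIFT features, as
    suggested above. lower_hsv/upper_hsv are assumed tuning parameters."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))  # color extraction
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)       # labeling
    if n < 2:
        return None                                 # no solder-colored blob found
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))          # biggest blob
    blob_mask = np.uint8(labels == largest) * 255
    sift = cv2.SIFT_create()            # SIFT is robust to enlargement/reduction
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = sift.detectAndCompute(gray, blob_mask)
    return keypoints, descriptors
```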
• The target frame selection unit 54A selects a target frame from the first reference moving image and the second reference moving image, similarly to the target frame selection unit 54 of the first embodiment. However, similarly to the first modification of the first embodiment, the target frame selection unit 54A accepts the designation of a pass-necessary frame and the designation of the object that is to pass through the state indicated by the pass-necessary frame (the object corresponding to the pass-necessary frame).
• Furthermore, the target frame selection unit 54A can accept the designation of a time difference between two consecutive pass-necessary frames.
• Of two consecutive pass-necessary frames, the earlier frame is referred to as a "first related frame" and the later frame is referred to as a "second related frame".
• The target frame selection unit 54A accepts the designation of a first related frame, a second related frame, and a time difference.
• The target frame selection unit 54A selects the first related frame as the target frame, and then selects the second related frame as the next target frame. That is, when the deviation between the state on the real image and the state on the first related frame of the object corresponding to the first related frame becomes less than the threshold, the target frame selection unit 54A selects the second related frame as the target frame.
• FIG. 23 is a diagram showing an example of a screen for designating pass-necessary frames, related frames, and a time difference.
• The control device 50A displays a screen as shown in FIG. 23 on the display unit 532, and accepts the designation of pass-necessary frames, the objects corresponding to the pass-necessary frames, first related frames, second related frames, and time differences.
• The operator operates the input device 534 to designate the pass-necessary frames, the objects corresponding to the pass-necessary frames, the first related frames, the second related frames, and the time differences.
• The teaching range selection unit 52 selects the frames 83 and 82 as the last frames of the teaching ranges corresponding to the pad 8f and the electric wire 8e, respectively.
• The target frame selection unit 54A accepts the frames 79 and 83 as pass-necessary frames corresponding to the object "soldering iron 8c".
• The target frame selection unit 54A accepts the frames 80 and 82 as pass-necessary frames corresponding to the object "solder feeder 8d".
• The target frame selection unit 54A accepts the frame 81 as a pass-necessary frame corresponding to the object "molten solder 8g".
• The target frame selection unit 54A accepts an instruction to set the frames 79 and 80, which are two consecutive pass-necessary frames, as a first related frame and a second related frame, respectively, and to set the time difference to "3 seconds". Further, the target frame selection unit 54A accepts an instruction to set the frames 81 and 82, which are two consecutive pass-necessary frames, as a first related frame and a second related frame, respectively, and to set the time difference to "0.5 seconds".
  • the first control unit 55c controls the robot 30c via the robot controller 40c to change the state of the soldering iron 8c.
  • the second controller 55d controls the robot 30d via the robot controller 40d to change the state of the solder feeder 8d.
  • the third control unit 55e controls the robot 30e via the robot controller 40e to change the state of the electric wire 8e.
  • the fourth control unit 55f controls the robot 30f via the robot controller 40f to change the state of the pad 8f.
• Each of the first control unit 55c, the second control unit 55d, the third control unit 55e, and the fourth control unit 55f includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60 (see FIG. 7).
  • the target object, the target robot, and the target robot controller in each unit are different from those in the first embodiment.
  • the object is the soldering iron 8c in the first control unit 55c, the solder feeder 8d in the second control unit 55d, the electric wire 8e in the third control unit 55e, and the pad 8f in the fourth control unit 55f.
  • the target robot is the robot 30c in the first control unit 55c, the robot 30d in the second control unit 55d, the robot 30e in the third control unit 55e, and the robot 30f in the fourth control unit 55f.
• The target robot controller is the robot controller 40c in the first control unit 55c, the robot controller 40d in the second control unit 55d, the robot controller 40e in the third control unit 55e, and the robot controller 40f in the fourth control unit 55f.
• The calculation unit 58 has the following function in addition to the functions of the first embodiment. That is, when the target frame selection unit 54A receives the designation of a first related frame, a second related frame, and a time difference, the calculation unit 58 adjusts the control amount so as to satisfy the time difference.
  • the calculating unit 58 adjusts the control amount as follows.
• The calculation unit 58 calculates, as the scheduled arrival time, the time obtained by adding the designated time difference to the time at which the deviation between the state on the real image and the state on the first related frame became less than the threshold for the object corresponding to the first related frame.
  • the calculation unit 58 calculates a control amount of each degree of freedom for bringing the state of the object corresponding to the second related frame on the real image closer to the state of the object on the second related frame.
• The calculation unit 58 adjusts the control amount by multiplying the calculated control amount by imaging cycle / (scheduled arrival time − current time).
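• This scaling can be written in a few lines; the guard against a past-due arrival time is an assumption added for robustness:

```python
def adjust_for_time_difference(control_amount, imaging_cycle, scheduled_arrival, now):
    """Sketch of the control-amount adjustment described above: the amount is
    scaled so that, moving a fraction of the remaining distance each imaging
    cycle, the object reaches the second related frame near the scheduled
    arrival time (first-frame arrival time plus the designated time difference)."""
    remaining = scheduled_arrival - now
    if remaining <= imaging_cycle:
        return control_amount              # due now (or overdue): apply in full
    return control_amount * (imaging_cycle / remaining)
```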
  • control device 50A controls the target robot so as to change the state of the target along the reference moving image according to the flowcharts shown in FIGS.
  • target frame selection unit 54A performs a target frame selection process according to the flowchart shown in FIG.
  • FIG. 24 is a flowchart showing an example of the flow of a target frame selection process in the second embodiment.
  • the flowchart shown in FIG. 24 differs from the flowchart shown in FIG. 17 only in that steps S81 and S82 are included. Therefore, steps S81 and S82 will be described below.
  • If YES in step S55, the process proceeds to step S81.
  • In step S81, the target frame selection unit 54A determines whether or not the target frame is a first related frame designated by a specified condition.
  • If the target frame is not a first related frame (NO in step S81), the target frame selection process moves to step S57.
  • If the target frame is a first related frame (YES in step S81), the process proceeds to step S82, in which the target frame selection unit 54A selects the second related frame corresponding to the first related frame as the target frame. After step S82, the target frame selection process ends.
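  • A minimal sketch of this branch, assuming the related-frame pairs are stored as a dictionary keyed by the first related frame (a data-layout assumption, not something the patent specifies):

```python
def select_target_frame(current_frame, essential_frames, related_pairs):
    """Sketch of the FIG. 24 flow: jump from a first related frame to its
    second related frame (steps S81-S82), otherwise pick the next
    passing essential frame (step S57)."""
    if current_frame in related_pairs:           # step S81: first related frame?
        return related_pairs[current_frame]      # step S82: second related frame
    later = [f for f in essential_frames if f > current_frame]
    return later[0] if later else current_frame  # step S57

# e.g. frames 79->80 and 81->82 are (first, second) related pairs
related = {79: 80, 81: 82}
print(select_target_frame(79, [79, 80, 81, 82, 83], related))  # -> 80
```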
  • In the second embodiment, the robots are controlled such that the time at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a threshold and the time at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a threshold satisfy the prescribed condition.
  • Specifically, the target robots are controlled as follows.
  • Let t1 be the time at which the deviation between the state of the soldering iron 8c on the real image and the state of the soldering iron 8c on the frame 79 becomes less than the threshold Tha, and let t2 be the time at which the deviation between the state of the solder feeder 8d on the real image and the state of the solder feeder 8d on the frame 80 becomes less than the threshold Thd.
  • The control device 50A controls the robot 30d so as to satisfy the specified condition that (t2 − t1) equals 3 seconds.
  • As a result, the solder feeder 8d can be brought into contact with the soldering iron 8c after the pad 8f has been sufficiently heated.
  • Similarly, let t3 be the time at which the deviation between the state of the molten solder 8g on the real image and the state of the molten solder 8g on the frame 81 becomes less than the threshold Thg, and let t4 be the time at which the deviation between the state of the solder feeder 8d on the real image and the state of the solder feeder 8d on the frame 82 becomes less than the threshold Thd.
  • The control device 50A controls the robot 30d so as to satisfy the specified condition that (t4 − t3) equals 0.5 seconds.
  • Here, the state of the molten solder 8g refers to the size of the molten solder 8g.
  • the target frame is updated to the frame 82 after the deviation between the state of the molten solder 8g on the actual image and the state of the molten solder 8g on the frame 81 becomes less than the threshold Thg. Then, after the molten solder 8g reaches a desired size, the solder feeder 8d can be immediately separated from the soldering iron 8c to maintain the size of the molten solder 8g.
  • Thereafter, the target robots are controlled as follows.
  • the frame 83 is selected as the target frame. Therefore, after the solder feeder 8d is sufficiently separated from the soldering iron 8c, the soldering iron 8c is moved away from the pad 8f. Thereby, unintended contact between the soldering iron 8c and the solder feeder 8d can be avoided.
  • control system 1A solders the electric wire 8e to the pad 8f using the soldering iron 8c and the solder feeder 8d.
  • However, the control system 1A may instead assemble another set of four objects.
  • FIG. 25 is a schematic diagram showing an object of the control system according to the first modification of the second embodiment.
  • the control system 1A according to the first modification joins the cylindrical members 10e and 10f with screws 10c and 10d in an industrial product production line or the like.
  • Screw holes 11a and 11b are formed in the cylindrical member 10e, and screw holes 12a and 12b are formed in the cylindrical member 10f. With the screw hole 11a and the screw hole 12a overlapping and the screw hole 11b and the screw hole 12b overlapping, the screw 10c is inserted into the screw holes 11a and 12a, and the screw 10d is inserted into the screw holes 11b and 12b.
  • the screw 10c is gripped by the hand 31c (see FIG. 20) of the robot 30c.
  • the screw 10d is gripped by the hand 31d of the robot 30d.
  • the cylindrical member 10e is gripped by the hand 31e of the robot 30e.
  • the cylindrical member 10f is mounted on a stage 31f of the robot 30f.
  • the imaging devices 21 and 22 are arranged at positions where the screw holes 11a and 12a and the screw 10c can be imaged. However, the imaging devices 21 and 22 cannot image the screw holes 11b and 12b and the screw 10d. Therefore, the control system 1A according to the first modification further includes imaging devices 23 and 24. The imaging devices 23 and 24 are arranged at positions where the screw holes 11b and 12b and the screw 10d can be imaged.
  • the control device 50A stores four reference moving images corresponding to the imaging devices 21 to 24, respectively.
  • FIG. 26 is a diagram illustrating an example of the reference moving image according to the first modification of the second embodiment.
  • FIG. 26 shows frames 84 to 86 of the reference moving image.
  • the frame 84 is a frame when the cylindrical member 10f reaches a desired position and posture.
  • the frame 85 is a frame when the screw hole 11a of the cylindrical member 10e overlaps the screw hole 12a of the cylindrical member 10f.
  • the frame 86 is a frame immediately before the screw 10c is inserted into the screw holes 11a and 12a.
  • the change information generation unit 56 of the first modification may generate four change information sets respectively corresponding to the four reference moving images. Then, the calculation unit 58 may calculate the control amount of each degree of freedom of the target robot based on the four change information sets.
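  • One plausible way to use the four change information sets at once is to stack them as image Jacobians and solve a single least-squares problem per robot; the NumPy sketch below is an assumed implementation, not a formula stated in the patent.

```python
import numpy as np

def control_from_multiple_views(jacobians, deviations, gain=0.5):
    """Stack per-camera change information (state change per unit control
    amount) and per-camera deviations (target-frame state minus real-image
    state), then solve for one control amount in the least-squares sense."""
    J = np.vstack(jacobians)          # (sum of state dims) x (robot DoF)
    e = np.concatenate(deviations)    # stacked deviations
    u, *_ = np.linalg.lstsq(J, gain * e, rcond=None)
    return u                          # control amount per degree of freedom

# Hypothetical example: 4 cameras, each observing a 2-D on-image state,
# controlling a robot with 3 degrees of freedom.
Js = [np.random.randn(2, 3) for _ in range(4)]
es = [np.random.randn(2) for _ in range(4)]
print(control_from_multiple_views(Js, es))
```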
  • each of the cylindrical members 10e and 10f may be further provided with two screw holes. In this case, the cylindrical members 10e and 10f are joined by four screws.
  • FIG. 27 is a diagram illustrating an example of the arrangement of four imaging devices.
  • the imaging device 21 is arranged at a position where two screws 10c and 10g can be imaged. That is, the screws 10c and 10g exist in the visual field range 21a of the imaging device 21.
  • the imaging device 22 is arranged at a position where the two screws 10c and 10h can be imaged. That is, the screws 10c and 10h exist in the visual field range 22a of the imaging device 22.
  • the imaging device 23 is arranged at a position where two screws 10d and 10g can be imaged. That is, the screws 10d and 10g exist in the visual field range 23a of the imaging device 23.
  • the imaging device 24 is arranged at a position where the two screws 10d and 10h can be imaged. That is, the screws 10d and 10h exist in the visual field range 24a of the imaging device 24.
  • FIG. 28 is a schematic diagram illustrating an object of the control system according to the second modification of the second embodiment.
  • the control system 1A according to the second modification uses a welding torch 13c and a welding rod 13d to weld two cylindrical members 13e and 13f to each other in an industrial product production line or the like.
  • the welding torch 13c is gripped by the hand 31c (see FIG. 20) of the robot 30c.
  • the welding rod 13d is gripped by the hand 31d of the robot 30d.
  • the cylindrical member 13e is gripped by the hand 31e of the robot 30e.
  • the cylindrical member 13f is mounted on a stage 31f of the robot 30f.
  • The target frame selection unit 54A may also receive the designation “no time difference” for a first related frame and a second related frame. In this case, the frames between the first related frame and the second related frame are not selected as target frames and are skipped, and the calculation unit 58 does not adjust the control amount.
  • the target frame selection unit 54A may receive designation of a plurality of continuous frames in the reference moving image as an operation group. For example, when the same operation is repeated a plurality of times, the operator designates a frame group corresponding to the operation as an operation group. Further, the operator also specifies the number of repetitions of the operation group.
  • The target frame selection unit 54A then selects the frames included in the operation group as target frames repeatedly, as many times as the designated number of repetitions (see the sketch below).
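  • A minimal sketch of this repetition, assuming operation groups are given as (start, end, repeat) tuples of positions in the frame sequence:

```python
def expand_operation_groups(frames, groups):
    """Expand a frame sequence so that each designated operation group is
    visited the requested number of times before moving on."""
    out, i = [], 0
    while i < len(frames):
        group = next((g for g in groups if g[0] == i), None)
        if group:
            start, end, repeat = group
            out.extend(frames[start:end + 1] * repeat)
            i = end + 1
        else:
            out.append(frames[i])
            i += 1
    return out

# Repeat the frames at positions 1..2 three times:
print(expand_operation_groups([70, 71, 72, 73], [(1, 2, 3)]))
# -> [70, 71, 72, 71, 72, 71, 72, 73]
```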
  • FIG. 29 is a schematic diagram illustrating an outline of a control system according to the third embodiment.
  • In the third embodiment, unlike the first and second embodiments, one of the plurality of objects is installed at a fixed position, and the states of the imaging devices 21 and 22 are changed by a robot.
  • the control system 1B sequentially processes the processing target portions 15, 16 of the large member 14j using the processing tools 14h, 14i in a production line of an industrial product or the like.
  • the large member 14j is, for example, a housing of a large device, an automobile body, or the like.
  • the processing tools 14h and 14i are, for example, a drill, an electric file, or the like.
  • The control system 1B differs from the control system 1 shown in FIG. 1 in that it includes the robots 30h, 30i, and 30j, the robot controllers 40h, 40i, and 40j, and the control device 50B instead of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
  • the robot 30h is a mechanism for changing the state (here, position and posture) of the processing tool 14h, and is, for example, a vertical articulated robot.
  • the robot 30h has a hand 31h that supports the processing tool 14h at the tip, and changes the position and posture of the hand 31h with a plurality of degrees of freedom. Further, the robot 30h includes a pedestal 33h movable along the rail 34 in the direction of the arrow AR.
  • the robot 30i is a mechanism for changing the state (here, position and orientation) of the processing tool 14i, and is, for example, a vertical articulated robot.
  • The robot 30i has a hand 31i at its tip for supporting the processing tool 14i, and changes the position and orientation of the hand 31i with a plurality of degrees of freedom. Further, the robot 30i includes a pedestal 33i movable along the rail 34 in the direction of the arrow AR.
  • the robot 30j is a mechanism for changing the states (here, positions and postures) of the imaging devices 21 and 22, and is, for example, a vertical articulated robot.
  • The robot 30j has a hand 31j that supports the imaging devices 21 and 22 at the tip, and changes the position and posture of the hand 31j with a plurality of degrees of freedom. Further, the robot 30j includes a pedestal 33j movable along the rail 34 in the direction of the arrow AR.
  • In FIG. 29, the pedestals 33h, 33i, and 33j move along the common rail 34.
  • Alternatively, a rail may be provided for each of the pedestals 33h, 33i, and 33j, and each pedestal may move along the corresponding rail.
  • the robot controllers 40h, 40i, and 40j perform operation control of the robots 30h, 30i, and 30j, respectively, according to control commands received from the control device 50B.
  • the robot controllers 40h, 40i, and 40j change the states of the hands 31h, 31i, and 31j, respectively, and move the pedestals 33h, 33i, and 33j, respectively, according to a control command from the control device 50B.
  • the control device 50B has a hardware configuration as shown in FIG. 4, as in the first embodiment. Therefore, a detailed description of the hardware configuration of the control device 50B will be omitted.
  • FIG. 30 is a block diagram showing a functional configuration of the control device according to the third embodiment.
  • The control device 50B differs from the control device 50 shown in FIG. 5 in that a first control unit 55h, a second control unit 55i, and a third control unit 55j are provided instead of the first control unit 55a and the second control unit 55b.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image showing samples of the processing tools 14h and 14i and the large member 14j.
  • the first reference moving image and the second reference moving image indicate, for example, the following first to third scenes in order.
  • the first scene is a scene in which the processing tools 14h and 14i process the processing target portion 15 in a state where the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • the second scene is a scene in which the large member 14j moves on the image, and the processing target portion 16 of the large member 14j moves to a fixed position on the image.
  • the third scene is a scene in which the processing tools 14h and 14i process the processing target portion 16 in a state where the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • the teaching range selection unit 52 selects a teaching range for each of the processing tool 14h, the processing tool 14i, and the large member 14j.
  • The image processing unit 53 detects the processing tool 14h, the processing tool 14i, and the large member 14j from the target image. Since the large member 14j is large, the fields of view of the imaging devices 21 and 22 include only a part of it. The image processing unit 53 therefore detects a pattern formed on the surface of the large member 14j.
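  • For illustration only, such a surface pattern could be located by normalized cross-correlation template matching; the OpenCV sketch below is an assumption about the implementation, which the patent does not specify.

```python
import cv2

def detect_pattern(real_image, pattern, threshold=0.8):
    """Locate a surface pattern of the large member in the camera image via
    normalized cross-correlation; returns the top-left corner or None."""
    result = cv2.matchTemplate(real_image, pattern, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= threshold else None

# Usage sketch (file names are placeholders):
# img = cv2.imread("real_image.png", cv2.IMREAD_GRAYSCALE)
# pat = cv2.imread("pattern.png", cv2.IMREAD_GRAYSCALE)
# print(detect_pattern(img, pat))
```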
  • the first control unit 55h controls the robot 30h via the robot controller 40h to change the state of the processing tool 14h.
  • The second control unit 55i controls the robot 30i via the robot controller 40i to change the state of the processing tool 14i.
  • the third control unit 55j controls the robot 30j via the robot controller 40j to change the states of the imaging devices 21 and 22.
  • Each of the first control unit 55h, the second control unit 55i, and the third control unit 55j includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, and a command unit, as in the first embodiment. 59 and an end determination unit 60 (see FIG. 7).
  • the target object is the processing tool 14h in the first control unit 55h, the processing tool 14i in the second control unit 55i, and the large member 14j in the third control unit 55j.
  • the target robot is the robot 30h in the first control unit 55h, the robot 30i in the second control unit 55i, and the robot 30j in the third control unit 55j.
  • the target robot controller is the robot controller 40h in the first controller 55h, the robot controller 40i in the second controller 55i, and the robot controller 40j in the third controller 55j.
  • When the robot 30j is not operating, the change information generation unit 56 of each of the first control unit 55h and the second control unit 55i generates the first change information, which indicates the relationship between the unit control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 21.
  • Likewise, when the robot 30j is not operating, it generates the second change information, which indicates the relationship between the unit control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 22.
  • The change information generation unit 56 of the first control unit 55h generates the first change information and the second change information with the imaging devices 21 and 22 in the state in which the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • The change information generation unit 56 of the second control unit 55i generates the first change information and the second change information with the imaging devices 21 and 22 in the state in which the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • The state of the large member 14j itself is not changed by the robot 30j. However, because the robot 30j changes the states of the imaging devices 21 and 22, the state of the large member 14j on their real images does change. Therefore, the change information generation unit 56 of the third control unit 55j generates the first change information, which indicates the relationship between the unit control amount of the robot 30j and the change amount of the state of the large member 14j on the real image captured by the imaging device 21, and the second change information, which indicates the relationship between the unit control amount of the robot 30j and the change amount of the state of the large member 14j on the real image captured by the imaging device 22.
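  • One plausible way to generate this change information is to apply a small unit control amount to each degree of freedom of the robot 30j and measure the resulting change of the detected state on each camera image; the finite-difference sketch below assumes helper callables apply_control and measure_state that are not part of the patent.

```python
import numpy as np

def generate_change_information(n_dof, apply_control, measure_state, delta=1.0):
    """Estimate, column by column, how a unit control amount of each degree
    of freedom changes the object's state on the image."""
    base = measure_state()                  # state on the real image now
    J = np.zeros((base.size, n_dof))
    for k in range(n_dof):
        apply_control(k, delta)             # perturb one degree of freedom
        moved = measure_state()
        apply_control(k, -delta)            # move back to the base pose
        J[:, k] = (moved - base) / delta
    return J                                # change per unit control amount
```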
  • the processing contents of the calculation unit 58, the command unit 59, and the end determination unit 60 in the third control unit 55j are the same as those of the first control unit 55a and the second control unit 55b in the first embodiment.
  • The calculation unit 58 in each of the first control unit 55h and the second control unit 55i calculates the control amount of the target robot for bringing the state of the target object on the real image closer to the state of the target object on the target frame only when the following start condition is satisfied.
  • Start condition: the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj.
  • the processing contents of the command unit 59 and the termination determination unit 60 in the first control unit 55h and the second control unit 55i are the same as those of the first control unit 55a and the second control unit 55b in the first embodiment.
  • the control device 50B controls the target robot such that the state of the target object changes in accordance with the first reference moving image and the second reference moving image according to the flowchart in FIG. 12 as in the first embodiment. .
  • the third control unit 55j performs the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG. 15, as in the first embodiment.
  • first control unit 55h and second control unit 55i perform the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG.
  • FIG. 31 is a flowchart showing the flow of processing of the first control unit and the second control unit of the third embodiment. As shown in FIG. 31, the processing flow of the first control unit 55h and the second control unit 55i according to the third embodiment is different from the flowchart shown in FIG. 15 in that step S90 is provided. Therefore, only step S90 will be described.
  • In step S90, it is determined whether or not the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj. If the deviation is equal to or larger than the threshold Thj (NO in step S90), the process ends. If the deviation is less than the threshold Thj (YES in step S90), steps S61 to S65 are performed.
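  • A compact sketch of this gate, with the threshold and helper names as illustrative assumptions:

```python
def gated_control_step(deviation_large_member, threshold_thj, run_steps_s61_to_s65):
    """Step S90 sketch: run the tool robots' control (steps S61 to S65) only
    when the large member already appears at the taught position on the
    image; otherwise skip this imaging cycle."""
    if deviation_large_member >= threshold_thj:   # NO in step S90
        return False
    run_steps_s61_to_s65()                        # YES in step S90
    return True
```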
  • As a result, there are periods in which the control device 50B in effect moves only the robot 30j, and periods in which the control device 50B moves each of the robots 30h to 30j.
  • Specifically, the robots 30h to 30j are controlled as follows.
  • First, the robots 30h and 30i remain stopped until the imaging devices 21 and 22 have moved so that the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • Thereafter, the first control unit 55h and the second control unit 55i control their target robots so that the state of the target object on the real image approaches the state of the target object on the target frame.
  • the processing tools 14h and 14i process the processing target portion 15.
  • the third control unit 55j controls the robot 30j such that the state of the large member 14j on the actual image approaches the state of the large member 14j on the target frame.
  • While the target frame is selected from the first scene, the state of the large member 14j on the target frame is constant, so the robot 30j hardly operates, and the states of the imaging devices 21 and 22 are substantially constant.
  • When the target frame is selected from the second scene, the robot 30j is controlled so that the state of the large member 14j on the real image changes, and the imaging devices 21 and 22 move accordingly.
  • the robots 30h and 30i are in a stopped state until the imaging devices 21 and 22 move so that the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • Thereafter, the first control unit 55h and the second control unit 55i control their target robots so that the state of the target object on the real image approaches the state of the target object on the target frame.
  • the processing tools 14h and 14i process the processing target portion 16.
  • the third control unit 55j controls the robot 30j such that the state of the large member 14j on the actual image approaches the state of the large member 14j on the target frame.
  • At this time as well, since the state of the large member 14j on the target frame is constant, the robot 30j hardly operates, and the states of the imaging devices 21 and 22 are substantially constant.
  • FIG. 32 is a schematic diagram showing an outline of a part of a control system according to a modification of the third embodiment.
  • the robots 30h and 30i may be installed on a pedestal 33j included in the robot 30j. In this case, the robots 30h and 30i move integrally with the robot 30j.
  • the first to third embodiments and the modified examples include the following disclosure.
  • (Configuration 1)
  • A control system (1, 1A, 1B) including first to Nth robots (30a to 30f, 30h to 30j), an imaging device (21 to 24) that images first to Nth objects, and a control device (50, 50A, 50B) that controls the first to Nth robots, N being an integer of 2 or more, wherein:
  • the i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object, i being an integer of 1 to N−1;
  • the N-th robot (30a to 30f, 30h to 30j) changes the state of one of the N-th object and the imaging device;
  • the other of the N-th object and the imaging device (21 to 24) is installed at a fixed position;
  • the control device (50, 50A, 50B) acquires change information for each of the first to Nth objects, the change information corresponding to the j-th object indicating the relationship between a control amount of the j-th robot and a change amount of the state of the j-th object on an image of the imaging device, j being an integer of 1 to N;
  • the control device (50, 50A, 50B) performs a first process of acquiring a real image captured by the imaging device (21 to 24), a second process of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects, and a third process of controlling each of the first to Nth robots based on the real image and the target frame; and
  • in the third process, the control device (50, 50A, 50B) calculates, based on the change information corresponding to the j-th object, a control amount of the j-th robot (30a to 30f, 30h to 30j) for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot (30a to 30f, 30h to 30j) according to the calculated control amount.
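  • For illustration, the first to third processes of Configuration 1 can be sketched as the following Python loop; every helper name (capture, detect_states, reference_states, send_control) is an assumption, and a real system would run one iteration per imaging cycle.

```python
import numpy as np

def control_loop(capture, detect_states, reference_states, jacobians,
                 send_control, n_frames, threshold=1.0, gain=0.5):
    """Acquire a real image (first process), keep or advance the target
    frame (second process), and drive every robot toward the target-frame
    states (third process)."""
    target = 0
    while target < n_frames:
        image = capture()                                  # first process
        states = detect_states(image)                      # per-object states
        goals = reference_states(target)                   # target-frame states
        devs = [g - s for s, g in zip(states, goals)]
        if all(np.linalg.norm(d) < threshold for d in devs):
            target += 1                                    # second process
            continue
        for j, (J, d) in enumerate(zip(jacobians, devs)):  # third process
            u, *_ = np.linalg.lstsq(J, gain * d, rcond=None)
            send_control(j, u)                             # j-th robot command
```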
  • (Configuration 2)
  • The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) updates the target frame after the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame becomes less than a threshold.
  • (Configuration 3)
  • The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames, and controls the robots such that a first time, at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
  • the imaging devices (21 to 24) image the (N + 1) th object (8g) together with the first to Nth objects,
  • the reference moving image includes the (N + 1) th object,
  • the control device (50A) sets the target frame after the deviation between the state of the (N + 1) th object on the real image and the state of the (N + 1) th object on the target frame becomes less than a threshold value.
  • the control system (1A) according to Configuration 1, which is updated.
  • the imaging devices (21 to 24) image the (N + 1) th object (8g) together with the first to Nth objects,
  • the reference moving image includes the (N + 1) th object,
  • the control device (50A) selects a first target frame from the reference moving image, and then selects a second target frame from the plurality of frames,
  • the control device (50A) includes a first time when a deviation between the state of the (N + 1) th object on the real image and the state of the (N + 1) th object on the first target frame is less than a first threshold. And a second time at which the deviation between the state of the first object on the real image and the state of the first object on the second target frame is less than a second threshold satisfies a prescribed condition.
  • the control system (1A) according to Configuration 1, which controls the first robot (30d) as described above.
  • (Configuration 6)
  • The control system (1, 1A) according to Configuration 1, wherein the control device (50, 50A) determines that an abnormality has occurred in the control system when the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame does not become less than a threshold during the period from when the target frame is selected until a specified time has elapsed.
  • (Configuration 7)
  • The control system according to Configuration 1, wherein the N-th robot (30j) changes the state of the imaging devices (21, 22), and the control device (50B) controls each of the first to Nth robots (30h to 30j) in the third process.
  • (Configuration 8)
  • The control system according to Configuration 1, wherein the control device (50, 50A, 50B) repeatedly executes a series of processes including the first process to the third process, and performs the first process of the next series while performing the third process of the current series.
  • (Configuration 9)
  • The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) selects, as the target frame, a frame included in a prediction horizon period of the reference moving image, and calculates the control amount of the j-th robot during a control horizon period so as to minimize the deviation between the state of the j-th object on the frames included in the prediction horizon period of the reference moving image and the state of the j-th object on the images captured by the imaging device during the prediction horizon period.
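  • Configuration 9 amounts to a model-predictive-control formulation; the sketch below, which assumes a linear image-Jacobian model and uses scipy.optimize.minimize for brevity, is one possible reading rather than the patent's stated algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def mpc_control(J, state, ref_traj, n_ctrl):
    """Find control amounts over a control horizon of n_ctrl steps that
    minimize the deviation between the predicted on-image states and the
    reference-frame states over the prediction horizon, using the linear
    model state_{k+1} = state_k + J @ u_k."""
    n_pred, n_dof = len(ref_traj), J.shape[1]

    def cost(u_flat):
        u = u_flat.reshape(n_ctrl, n_dof)
        s, total = state.copy(), 0.0
        for k in range(n_pred):
            s = s + J @ u[min(k, n_ctrl - 1)]  # hold last input past horizon
            total += np.sum((s - ref_traj[k]) ** 2)
        return total

    res = minimize(cost, np.zeros(n_ctrl * n_dof))
    return res.x.reshape(n_ctrl, n_dof)[0]     # apply only the first input
```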
  • (Configuration 10)
  • A control method of a control system including first to Nth robots (30a to 30f, 30h to 30j) and imaging devices (21 to 24) that image first to Nth objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f), N being an integer of 2 or more, wherein the i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object, i being an integer of 1 to N−1, the N-th robot (30a to 30f, 30h to 30j) changes the state of one of the N-th object and the imaging device (21 to 24), and the other of the N-th object and the imaging device (21 to 24) is installed at a fixed position, the control method including:
  • a first step of acquiring change information for each of the first to Nth objects, the change information corresponding to the j-th object indicating the relationship between a control amount of the j-th robot and a change amount of the state of the j-th object on an image of the imaging device, j being an integer of 1 to N;
  • a second step of acquiring a real image captured by the imaging devices (21 to 24);
  • a third step of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects; and
  • a fourth step of controlling each of the first to Nth robots based on the real image and the target frame.
  • (Configuration 11)
  • A program for causing a computer to execute the control method according to Configuration 10.
1, 1A, 1B control system, 2a male connector, 2b female connector, 3 wire, 4a, 4g feature point, 5, 9 board, 6a upper case, 6b lower case, 7a, 7b engaging claw, 8c soldering iron, 8d solder feeder, 8e electric wire, 8f pad, 8g molten solder, 10c, 10d, 10g, 10h screw, 10e, 10f, 13e, 13f cylindrical member, 11a, 11b, 12a, 12b screw hole, 13c welding torch, 13d welding rod, 14h, 14i processing tool, 14j large member, 15, 16 processing target part, 21-24 imaging device, 21a-24a field of view, 30a-30f, 30h-30j robot, 31a, 31c-31e, 31h-31j hand, 31b, 31f stage, 32a to 32d control rod, 33h, 33i, 33j pedestal, 34 rail, 40a to 40f, 40h to 40j robot controller, 50, 50A, 50B control device, 51 reference moving image storage unit, 52 teaching range selection unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Manufacturing Of Electrical Connectors (AREA)

Abstract

The invention relates to a control device that performs: a first process of acquiring a real image captured by an imaging device; a second process of selecting a target frame from a reference moving image; and a third process of controlling each of the first to Nth robots based on the real image and the target frame. In the third process, the control device: calculates a control amount for the j-th robot for bringing the state of the j-th target object in the real image toward a model state of the j-th target object in the target frame, based on change information corresponding to the j-th target object; and controls the j-th robot according to the calculated control amount. A control system capable of changing the states of a plurality of target objects in a coordinated manner can thereby be obtained.
PCT/JP2019/026959 2018-07-23 2019-07-08 Control system, control method, and program Ceased WO2020022041A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-137704 2018-07-23
JP2018137704A JP7115096B2 (ja) 2018-07-23 2018-07-23 Control system, control method, and program

Publications (1)

Publication Number Publication Date
WO2020022041A1 true WO2020022041A1 (fr) 2020-01-30

Family

ID=69180459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026959 2018-07-23 2019-07-08 Control system, control method, and program Ceased WO2020022041A1 (fr)

Country Status (2)

Country Link
JP (1) JP7115096B2 (fr)
WO (1) WO2020022041A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6763493B1 * 2019-07-19 2020-09-30 Zeon Corp Acrylic rubber sheet with excellent storage stability
US11548150B2 (en) * 2020-05-29 2023-01-10 Mitsubishi Electric Research Laboratories, Inc. Apparatus and method for planning contact-interaction trajectories
CN112894830A * 2021-03-05 2021-06-04 Xi'an Thermal Power Research Institute Co Ltd Intelligent wiring system and wiring method for robot wiring of machine-room jumpers

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6211905A * 1985-07-10 1987-01-20 Hitachi Ltd Path-following control device for a robot
JPH04129951A * 1989-09-04 1992-04-30 Ricoh Co Ltd Automatic document feeder
JPH04167002A * 1990-10-30 1992-06-15 Kiyouhou Seisakusho KK NC control method for a press-fitting robot
JP2002224977A * 2001-01-30 2002-08-13 Nec Corp Robot control device, robot control method, and robot
JP2005085111A * 2003-09-10 2005-03-31 Fumio Miyazaki Control method and device for a mechanical system
JP2013146844A * 2012-01-23 2013-08-01 Seiko Epson Corp Teaching image generation device, teaching image generation method, teaching image generation program, robot control device, robot control method, and robot control program
JP2015150636A * 2014-02-13 2015-08-24 Fanuc Corp Robot system using visual feedback
JP2015223649A * 2014-05-27 2015-12-14 Yaskawa Electric Corp Gear incorporation system and gear incorporation method
JP2017004036A * 2015-06-04 2017-01-05 Teraoka Seiko Co Ltd Merchandise sales processing device
WO2017018113A1 * 2015-07-29 2017-02-02 AutoNetworks Technologies Ltd Object handling simulation device, object handling simulation system, object handling simulation method, object manufacturing method, and object handling simulation program
JP2018015856A * 2016-07-29 2018-02-01 Seiko Epson Corp Robot, robot control device, and robot system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
2004, Retrieved from the Internet <URL:http://ishikawa-net.ac.jp/lab/E/y_kawai/www/paper/2004/EEhokuriku04.pdf> [retrieved on 20190731] *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3974119A1 * 2020-09-23 2022-03-30 Liebherr-Verzahntechnik GmbH Automated plug-connection device
CN114421258A * 2022-01-26 2022-04-29 China Railway Construction Electrification Bureau Group Co Ltd Automatic welding method for signal transmission lines
CN114421258B * 2022-01-26 2024-01-12 China Railway Construction Electrification Bureau Group Co Ltd Automatic welding method for signal transmission lines
WO2023209974A1 * 2022-04-28 2023-11-02 Nikon Corp Control device, control system, robot system, control method, and computer program
CN115134529A * 2022-06-29 2022-09-30 Glodon Co Ltd Method, device, and readable storage medium for displaying a project model from multiple viewpoints

Also Published As

Publication number Publication date
JP7115096B2 (ja) 2022-08-09
JP2020015101A (ja) 2020-01-30

Similar Documents

Publication Publication Date Title
JP7115096B2 (ja) Control system, control method, and program
JP6965844B2 (ja) Control system, analysis device, and control method
JP5165160B2 (ja) Robot arm control device and control method, robot, robot arm control program, and integrated electronic circuit
US12179350B2 (en) Dual arm robot teaching from dual hand human demonstration
CN112109075A (zh) Control system and control method
JP7190552B1 (ja) Robot teaching system
CN111604942A (zh) Object detection device, control device, and computer program for object detection
CN108081268A (zh) Robot control system, robot, program, and robot control method
JP2002018754A (ja) Robot device and control method therefor
JP2012254518A (ja) Robot control system, robot system, and program
JP7674464B2 (ja) Simulation device using three-dimensional position information obtained from the output of a visual sensor
CN109814434B (zh) Calibration method and device for a control program
KR20130075712A (ko) Laser vision sensor and calibration method therefor
US12358134B2 (en) Programming device
CN116323115A (zh) Control device, robot arm system, and control method for a robot arm device
CN119610128B (zh) Robot arm motion recognition and control method based on machine vision and digital twin technology
Nguyen et al. Revolutionizing robotized assembly for wire harness: A 3D vision-based method for multiple wire-branch detection
WO2020022040A1 (fr) Control system, control method, and program
US20230150142A1 (en) Device and method for training a machine learning model for generating descriptor images for images of objects
JP2023059863A (ja) Teaching a dual-arm robot from a demonstration with both human hands
Kalitsios et al. Vision-enhanced system for human-robot disassembly factory cells: introducing a new screw dataset
JP7669231B2 (ja) Two-hand detection in teaching by demonstration
JP7177239B1 (ja) Marker detection device and robot teaching system
JPWO2018096669A1 (ja) Laser processing device, laser processing method, and laser processing program
Jing et al. FPC-BTB detection and positioning system based on optimized YOLOv5

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19840079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19840079

Country of ref document: EP

Kind code of ref document: A1