
WO2016189740A1 - Robot system, teaching jig, and teaching method - Google Patents


Info

Publication number
WO2016189740A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
axis
detection
detection target
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/065439
Other languages
English (en)
Japanese (ja)
Inventor
昌稔 古市 (Masatoshi Furuichi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yaskawa Electric Corp
Original Assignee
Yaskawa Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yaskawa Electric Corp filed Critical Yaskawa Electric Corp
Priority to PCT/JP2015/065439
Publication of WO2016189740A1
Current legal status: Ceased

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
    • H01L21/677Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for conveying, e.g. between different workstations

Definitions

  • the present disclosure relates to a robot system, a teaching jig, and a teaching method.
  • Patent Document 1 discloses a system for teaching a transfer robot that has a plurality of arms and transfers a substrate such as a semiconductor wafer.
  • This disclosure is intended to provide a robot system, a teaching jig, and a teaching method that can perform teaching work efficiently.
  • a robot system according to the present disclosure includes a robot having a first hand and a second hand rotatable around a first axis, and a controller for controlling the robot. When controlling the robot to move the second hand, the controller uses a first relative value indicating the relative difference between the distance from the first hand to the first axis and the distance from the second hand to the first axis.
  • the teaching jig according to the present disclosure includes a first detection mechanism that is detachably attached to one of a first hand and a second hand of a robot, the hands being rotatable around a first axis, and a second detection mechanism that is detachably attached to the other hand.
  • the first detection mechanism has a first detection target and a variable mechanism that can change the distance from the first detection target to the first axis, and the second detection mechanism has at least one sensor including a first sensor that detects the first detection target in accordance with a change in the distance from the first detection target to the first axis.
  • a teaching method according to the present disclosure uses a robot system including a robot having a first hand and a second hand rotatable around a first axis, and a controller for controlling the robot, and includes causing the controller to acquire a first relative value indicating a relative difference between the distance from the first hand to the first axis and the distance from the second hand to the first axis.
  • FIG. 6 is a plan view showing the first detection mechanism and the second detection mechanism during detection of the first relative value; the remaining figures include a flowchart showing the procedure for recording a teaching value and a perspective view showing an arrangement example.
  • the robot system 1 automatically performs work on the workpiece W.
  • in this embodiment, the work content is conveyance of the workpiece W, and the workpiece W is a circular substrate (for example, a semiconductor wafer).
  • here, the circular shape may include notches (for example, notches or orientation flats), protrusions, and the like formed on a part of the periphery; a shape is regarded as circular if most of its periphery lies on a common circle.
  • the robot system 1 includes a robot 10 and a controller 100 for controlling the robot 10.
  • the robot 10 has a plurality of hands that can rotate around the first axis.
  • the robot 10 includes a base 11, an elevating unit 12, a first arm 13, a second arm 14, and hands 20 and 30.
  • the base 11 is fixed to the floor surface of the placement area of the robot 10.
  • the elevating unit 12 protrudes vertically upward from the base 11 and can be elevated along the vertical axis Ax1.
  • the first arm 13 is connected to the upper end of the elevating unit 12.
  • the first arm 13 extends in the horizontal direction from the upper end of the elevating unit 12 and can swing around the axis Ax1.
  • the second arm 14 is connected to the tip of the first arm 13.
  • the second arm 14 extends in the horizontal direction from the tip of the first arm 13 and can swing around an axis Ax2 parallel to the axis Ax1.
  • the hands 20 and 30 are connected to the tip of the second arm 14. Each of the hands 20 and 30 projects horizontally from the tip of the second arm 14 and can rotate about an axis Ax3 (corresponding to the “first axis”) parallel to the axis Ax2.
  • the hand 30 may be provided below the hand 20 or above the hand 20.
  • the robot 10 includes an actuator that raises and lowers the elevating unit 12 along the axis Ax1, an actuator that swings the first arm 13 around the axis Ax1, an actuator that swings the second arm 14 around the axis Ax2, an actuator that rotates the hand 20 around the axis Ax3, and an actuator that rotates the hand 30 around the axis Ax3; illustration of these actuators is omitted.
  • Examples of the actuator include an electric actuator using an electric motor as a power source.
  • the hands 20 and 30 each hold a workpiece W.
  • the elevating unit 12, the first arm 13 and the second arm 14 convey the work W held by the hands 20 and 30.
  • the configuration of the robot 10 is not limited to that illustrated here; any configuration may be used as long as the robot has a plurality of hands that can rotate around a common axis (corresponding to the “first axis”).
  • the robot 10 may have three or more hands that can rotate around a common axis.
  • the hand 20 includes a base portion 21, finger portions 22 and 23, claw portions 22a and 23a, and a movable claw portion 24.
  • the base portion 21 is a horizontal plate-like portion, is connected to the distal end portion of the second arm 14 and protrudes in the horizontal direction.
  • the finger portions 22 and 23 branch in two from the base portion 21 and project further.
  • an axis passing through the middle part of the finger parts 22 and 23 and orthogonal to the axis Ax3 is defined as a center axis CL1 of the hand 20.
  • the claw portions 22a and 23a are provided at the tip portions of the finger portions 22 and 23, respectively, and protrude upward from the finger portions 22 and 23.
  • the movable claw portion 24 is provided on the base portion 21 so as to be reciprocable along the central axis CL1, and faces the claw portions 22a and 23a.
  • the hand 20 further includes an actuator for reciprocating the movable claw portion 24 along the central axis CL1, but illustration of this actuator is omitted.
  • examples of the actuator include an electric actuator using an electric motor as a power source and a pressure-driven actuator using fluid pressure as a power source.
  • the base 21 and the finger parts 22 and 23 support the workpiece W.
  • the claw portions 22a and 23a and the movable claw portion 24 surround the periphery of the workpiece W.
  • the movable claw portion 24 presses the workpiece W against the claw portions 22a and 23a, whereby the workpiece W is held by the hand 20.
  • the hand 30 also has a base portion 31, finger portions 32 and 33, claw portions 32a and 33a, and a movable claw portion 34.
  • an axis passing through the middle part of the finger parts 32 and 33 and orthogonal to the axis Ax3 is defined as a center axis CL2 of the hand 30.
  • the movable claw portion 34 reciprocates along the central axis CL2.
  • the configurations of the hands 20 and 30 are not limited to those illustrated here.
  • the hands 20 and 30 may be configured in any manner as long as the work W can be held.
  • the hands 20 and 30 may be configured to suck and hold the workpiece W instead of gripping the workpiece W by the claw portion.
  • the controller 100 controls the robot 10 so as to hold the workpiece W with the hands 20 and 30 and to convey the workpiece W by moving the hands 20 and 30 holding the workpiece W.
  • the controller 100 controls any one of the plurality of hands as the first hand and controls the other hand as the second hand.
  • the hand 20 is controlled as the first hand and the hand 30 is controlled as the second hand.
  • the controller 100 is configured to use, when controlling the robot 10 to move the hand 30, a first relative value indicating the relative difference between the distance from the hand 20 to the axis Ax3 and the distance from the hand 30 to the axis Ax3.
  • the distance from the hand 20 to the axis Ax3 is a distance in the direction along the central axis CL1, for example, and is a distance from a predetermined part of the hand 20 to the axis Ax3.
  • the distance from the hand 30 to the axis Ax3 is, for example, a distance in the direction along the central axis CL2, and is a distance from a predetermined part of the hand 30 to the axis Ax3.
  • the “predetermined part” may be any part of the hands 20 and 30 as long as the same part is used for both the hand 20 and the hand 30.
  • Specific examples of the “predetermined portion” include an intermediate position CP1 of the claw portions 22a and 23a and an intermediate position CP2 of the claw portions 32a and 33a.
  • the first relative value arises from machining errors and assembly errors of the constituent members of the hands 20 and 30.
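As a concrete illustration (not part of the disclosure), the first relative value can be computed as the signed difference between the two hand-to-axis distances, measured at the same predetermined part of each hand; the function name and the numbers below are hypothetical:

```python
# Hypothetical sketch (not from the disclosure): the first relative value Dr
# is the signed difference between the distance from each hand to the common
# rotation axis Ax3, measured at the same predetermined part of both hands
# (e.g. the claw midpoints CP1 and CP2).

def first_relative_value(d_hand20_to_ax3: float, d_hand30_to_ax3: float) -> float:
    """Relative difference, referenced to the hand 20 side."""
    return d_hand30_to_ax3 - d_hand20_to_ax3

# Nominally identical hands differ slightly due to machining and assembly errors:
Dr = first_relative_value(300.00, 300.45)  # millimetres, illustrative values
print(round(Dr, 2))  # 0.45: hand 30 reaches 0.45 mm farther from Ax3
```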
  • when controlling the robot 10 to move the hand 30, the controller 100 may be configured to further use a second relative value indicating a relative difference between the angle of the hand 20 around the axis Ax3 and the angle of the hand 30 around the axis Ax3.
  • the angle of the hands 20 and 30 around the axis Ax3 is, for example, the rotation angle of the central axes CL1 and CL2 with respect to the reference line.
  • the reference line can be set arbitrarily; for example, a line that forms 180° with respect to the second arm 14 may be used as the reference line.
  • the relative difference between the angle of the hand 20 around the axis Ax3 and the angle of the hand 30 around the axis Ax3 is, for example, the angle formed by the hands 20 and 30 (for example, the angle formed by the central axes CL1 and CL2) when the angles of the hands 20 and 30 around the axis Ax3 are made to coincide with the same angle target value.
  • the second relative value likewise arises from machining errors and assembly errors of the constituent members of the hand 20 and the hand 30.
  • the controller 100 may be configured to further use a teaching value for correcting a positioning error of the hand 20 when the robot 10 is controlled to move the hand 30.
  • the controller 100 may be configured to acquire the first relative value, the second relative value, and the teaching value.
  • the robot system 1 may further include a console 200 as a user interface of the controller 100.
  • the console 200 is connected to the controller 100, acquires input information from the user to the controller 100, and displays output information from the controller 100 to the user.
  • Specific examples of the console 200 include a keyboard, a mouse, and a monitor.
  • the console 200 may be a so-called teaching pendant in which a display unit and an input unit are integrated, or may be a touch panel display.
  • the controller 100 includes a relative value recording unit 111, a teaching value recording unit 112, and a control unit 113 as functional modules.
  • the relative value recording unit 111 records the first relative value and the second relative value.
  • the relative value recording unit 111 may be configured to acquire a first relative value and a second relative value for recording.
  • the relative value recording unit 111 may be configured to acquire the first relative value and the second relative value from the console 200, or, as described later in the description of the teaching procedure, may be configured to acquire the first relative value and the second relative value from the control unit 113.
  • the teaching value recording unit 112 records the teaching value.
  • the teaching value recording unit 112 may be configured to acquire a teaching value for recording.
  • the teaching value recording unit 112 may be configured to acquire the teaching value from the console 200, or may be configured to acquire the teaching value from the control unit 113 as described later in the description of the teaching procedure.
  • the control unit 113 controls the robot 10 to move the hand 20 using the teaching value recorded in the teaching value recording unit 112. For example, the control unit 113 calculates a movement path of the hand 20 for conveying the workpiece W based on the teaching value. Next, the control unit 113 calculates an operation target value of each movable unit for moving the hand 20 along the calculated movement path. Next, the control unit 113 controls the robot 10 to operate each movable unit according to the operation target values.
  • the control unit 113 controls the robot 10 to move the hand 30 using the teaching value recorded in the teaching value recording unit 112 together with the first relative value and the second relative value recorded in the relative value recording unit 111. For example, the control unit 113 calculates the movement path of the hand 30 for conveying the workpiece W based on the teaching value. Next, the control unit 113 sets an operation target value of each movable unit for moving the hand 30 along the calculated movement path. At this time, the control unit 113 uses the first relative value and the second relative value.
  • the control unit 113 calculates the distance D2 by adding the first relative value Dr to the distance D1 from the hand 20 to the axis Ax3. Thereafter, the control unit 113 replaces the distance D1 with the distance D2 and calculates the operation target value of each movable unit by performing the same calculation as when moving the hand 20. Thereafter, as shown in FIG. 3, the control unit 113 subtracts the second relative value θr from the angle target value θ1 of the hand 30 around the axis Ax3 to calculate the angle target value θ2. Next, the control unit 113 replaces the angle target value θ1 with the angle target value θ2, and then controls the robot 10 to operate each movable unit according to the operation target values.
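The correction described in this paragraph can be sketched as follows. This is a hedged illustration assuming Dr and θr are referenced to the hand 20 side; the function and variable names are mine, not the disclosure's:

```python
# Sketch of the correction applied by the control unit 113 when targets
# computed for the first hand (hand 20) are reused for the second hand
# (hand 30). Assumes the relative values are referenced to the hand 20 side.

def correct_targets_for_hand30(d1: float, theta1: float,
                               dr: float, theta_r: float) -> tuple:
    """Return (D2, theta2) from (D1, theta1) and the relative values."""
    d2 = d1 + dr               # distance: D2 = D1 + Dr
    theta2 = theta1 - theta_r  # angle target: theta2 = theta1 - theta_r
    return d2, theta2

# Example: a target planned for hand 20, corrected before moving hand 30.
d2, theta2 = correct_targets_for_hand30(d1=300.0, theta1=90.0, dr=0.45, theta_r=0.2)
print(round(d2, 2), round(theta2, 2))  # 300.45 89.8
```

With the corrected pair (D2, θ2) substituted for (D1, θ1), the same target-value computation used for the hand 20 yields the operation target values for the hand 30.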
  • the first relative value Dr is added to the distance D1 when Dr is referenced to the hand 20 side; when Dr is referenced to the hand 30 side, the distance D2 is calculated by subtracting the first relative value Dr from the distance D1.
  • likewise, the second relative value θr is subtracted from the angle target value θ1 when θr is referenced to the hand 20 side; when θr is referenced to the hand 30 side, the angle target value θ2 is calculated by adding the second relative value θr to the angle target value θ1.
  • “addition” and “subtraction” here refer to addition and subtraction of signed values, with one direction taken as positive; when values are expressed as absolute values, “addition” and “subtraction” may be interchanged. For example, when the angle target value θ2 is expressed as an absolute value, the angle target value θ2 is a value obtained by adding the second relative value θr to the angle target value θ1 (see FIG. 3).
  • the hardware of the controller 100 includes, for example, one or a plurality of control computers.
  • the controller 100 includes, for example, a circuit 120 illustrated in FIG. 4 as a hardware configuration.
  • the circuit 120 includes a processor 121, a memory 122, a storage 123, an input/output port 124, and a driver 125.
  • the driver 125 is a circuit for driving various actuators of the robot 10.
  • the input/output port 124 inputs and outputs signals to and from the driver 125 in addition to inputting and outputting external signals.
  • the processor 121 executes a program in cooperation with at least one of the memory 122 and the storage 123, and executes input/output of signals via the input/output port 124, thereby configuring the functional modules described above.
  • the hardware configuration of the controller 100 is not necessarily limited to that which constitutes a functional module by executing a program.
  • for example, the controller 100 may implement these functional modules with a dedicated logic circuit or with an ASIC (Application Specific Integrated Circuit) into which such a circuit is integrated.
  • the robot system 1 may further include a first detection mechanism and a second detection mechanism for detecting the first relative value as an example of the teaching jig.
  • the first detection mechanism 40 and the second detection mechanism 50 will be described in detail with reference to FIGS.
  • the first detection mechanism 40 is provided in one of the first hand and the second hand, and the second detection mechanism 50 is provided in the other hand.
  • the first detection mechanism 40 may be provided detachably with respect to the one hand
  • the second detection mechanism 50 may be provided detachably with respect to the other hand.
  • the first detection mechanism 40 is detachably provided on the hand 30 controlled as the second hand
  • the second detection mechanism 50 is detachably provided on the hand 20 controlled as the first hand.
  • the first detection mechanism 40 includes a first detection target and a variable mechanism that can change the distance from the first detection target to the axis Ax3.
  • the distance from the first detection target to the axis Ax3 is, for example, a distance in a direction along the center axis CL2.
  • the first detection mechanism 40 may further include a second detection target.
  • the first detection mechanism 40 includes, for example, a support plate 41, a first convex portion 42, a variable mechanism VS1, a second convex portion 43, a scale plate 44, and a positioning convex portion 46.
  • the support plate 41 is, for example, a disc having the same diameter as the workpiece W, and can be held by the hand 30 in the same manner as the workpiece W is held.
  • the support plate 41 is supported by the base portion 31 and the finger portions 32 and 33 and is held by the claw portions 32a and 33a and the movable claw portion 34.
  • the first detection mechanism 40 is attached to and detached from the hand 30 by holding and releasing the support plate 41 by the hand 30.
  • “up and down” in the description of the first detection mechanism 40 means up and down in a state of being attached to the hand 30.
  • the first convex portion 42 is provided on the upper surface of the support plate 41 and has a cylindrical outer peripheral surface 42a perpendicular to the support plate 41.
  • the first convex portion 42 is located closer to the claw portions 22a and 23a with respect to the center of the support plate 41.
  • the outer peripheral surface 42a functions as the first detection target.
  • the variable mechanism VS1 can change the distance from the outer peripheral surface 42a to the axis Ax3.
  • the variable mechanism VS1 may constrain the outer peripheral surface 42a so that the change in the distance from the outer peripheral surface 42a to the axis Ax3 is accompanied by the movement of the outer peripheral surface 42a around the axis Ax3.
  • the variable mechanism VS1 holds the first convex portion 42 so as to be rotatable around an axis Ax4 (second axis) parallel to the axis Ax3, and the axis Ax4 is decentered with respect to the central axis CL3 of the outer peripheral surface 42a. (See FIG. 7).
  • when the first convex portion 42 is rotated around the axis Ax4, the distance from the outer peripheral surface 42a to the axis Ax3 changes with the movement of the outer peripheral surface 42a around the axis Ax3.
  • the constraining structure of the outer peripheral surface 42a is not limited to that exemplified here.
  • the shape of the outer peripheral surface 42a in plan view may be an ellipse, and the axis Ax4 may coincide with the central axis CL3 of the outer peripheral surface 42a.
  • in this case too, the change in the distance from the outer peripheral surface 42a to the axis Ax3 is accompanied by the movement of the outer peripheral surface 42a around the axis Ax3.
  • alternatively, instead of holding the first convex portion 42 rotatably around the axis Ax4, the variable mechanism VS1 may be configured to guide the first convex portion 42 along a line inclined with respect to the central axis CL1 in plan view. In this case as well, the change in the distance from the outer peripheral surface 42a to the axis Ax3 is accompanied by the movement of the outer peripheral surface 42a around the axis Ax3.
  • the scale plate 44 is a disc centered on the axis Ax4 and is integrated with the lower portion of the first convex portion 42. That is, the scale plate 44 rotates together with the first convex portion 42. As shown in FIG. 7, a scale 45 is provided on the upper surface of the scale plate 44 and the upper surface of the support plate 41 so that the rotation angle of the first convex portion 42 can be visually recognized. A numerical value indicating the rotation angle of the first convex portion 42 may be displayed along with the scale 45. As will be described later, the rotation angle of the first convex portion 42 correlates with the first relative value; therefore, instead of a numerical value indicating the rotation angle, a numerical value indicating the first relative value corresponding to that rotation angle may be displayed.
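The correlation between the rotation angle and the first relative value follows from the eccentric geometry: rotating the first convex portion 42 around Ax4 moves its center CL3, and with it the outer peripheral surface 42a, along the hand. A hedged geometric sketch, in which the eccentricity e and the zero reference of φ are my own assumptions:

```python
import math

# Hedged geometric model of the variable mechanism VS1: the cylindrical outer
# peripheral surface 42a is centred on CL3, which is offset by an assumed
# eccentricity e from the rotation axis Ax4. Rotating the first convex
# portion 42 by phi moves CL3 on a circle of radius e around Ax4, shifting
# the surface and hence its distance to the axis Ax3.

def surface_shift(e: float, phi_deg: float) -> float:
    """Displacement of CL3 along the centre axis, taking phi = 0 as the
    position where the shift is zero (an arbitrary reference choice)."""
    return e * (1.0 - math.cos(math.radians(phi_deg)))

# A small eccentricity sweeps a total distance change of up to 2 * e:
for phi in (0, 90, 180):
    print(phi, round(surface_shift(e=1.0, phi_deg=phi), 3))
# prints: 0 0.0 / 90 1.0 / 180 2.0
```

This is why reading the scale 45 (the rotation angle) is, up to this geometry, equivalent to reading the distance change, and hence the first relative value.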
  • the second convex portion 43 is provided on the upper surface of the support plate 41 and has a cylindrical outer peripheral surface 43a perpendicular to the support plate 41.
  • the second convex portion 43 is located at the center of the support plate 41.
  • the outer peripheral surface 43a functions as the second detection target.
  • the positioning convex portion 46 is provided on the lower surface of the support plate 41.
  • the positioning convex portion 46 is disposed so as to contact one of the finger portions 32 and 33 when the first convex portion 42 is appropriately disposed. This makes it possible to arrange the first detection mechanism 40 at an appropriate position with respect to the hand 30.
  • the second detection mechanism 50 includes at least one sensor including a first sensor that detects the first detection target in accordance with a change in the distance from the first detection target to the axis Ax3.
  • the at least one sensor may further include a second sensor that detects a second detection target in accordance with a change in relative angle around the axis Ax3 of the hands 20 and 30.
  • the second detection mechanism 50 includes a first sensor 51, a second sensor 52, and a support plate 53.
  • the support plate 53 includes a base portion 54 and finger portions 55 and 56 that branch in two from the base portion 54 and project from it.
  • on the side of the base portion 54 opposite to the finger portions 55 and 56, a circumferential portion 54a having the same diameter as the workpiece W is formed.
  • Circumferential portions 55a and 56a that are concentric with the circumferential portion 54a and have the same diameter are formed at the tips of the finger portions 55 and 56, respectively.
  • the width of the support plate 53 in the direction in which the finger portions 55 and 56 are arranged may be equal to the width of the hand 20 in the direction in which the finger portions 22 and 23 are arranged.
  • the support plate 53 is disposed on the base portion 21 and the finger portions 22 and 23 so that the circumferential portions 55a and 56a face the claw portions 22a and 23a, respectively, and the circumferential portion 54a faces the movable claw portion 24.
  • the support plate 53 is held by pressing the support plate 53 against the claw portions 22a and 23a by the movable claw portion 24.
  • the second detection mechanism 50 is attached to and detached from the hand 20 by holding and releasing the support plate 53 on the hand 20.
  • “up and down” in the description of the second detection mechanism 50 means up and down in a state where the second detection mechanism 50 is attached to the hand 20.
  • the first sensor 51 is, for example, a transmissive optical sensor, and detects an object between the light projecting unit and the light receiving unit according to a light receiving state of light emitted from the light projecting unit to the light receiving unit.
  • the first sensor 51 includes a light projecting unit 51a, a light receiving unit 51b, and an amplifier 51c.
  • the light projecting portion 51a and the light receiving portion 51b are provided at the tip portions of the finger portions 55 and 56, respectively, and face each other.
  • the light projecting unit 51a emits light toward the light receiving unit 51b.
  • the amplifier 51c is provided on the base 54, and is connected to the light projecting unit 51a and the light receiving unit 51b via an optical fiber (not shown).
  • the amplifier 51c sends the outgoing light to the light projecting unit 51a via the optical fiber, and receives the light incident on the light receiving unit 51b via the optical fiber.
  • the second sensor 52 is, for example, a transmissive optical sensor, and detects an object between the light projecting unit and the light receiving unit according to a light receiving state of light emitted from the light projecting unit to the light receiving unit.
  • the second sensor 52 includes a light projecting unit 52a, a light receiving unit 52b, and an amplifier 52c.
  • the light projecting unit 52a and the light receiving unit 52b are provided at the distal end portion of a projecting portion 57 and on the base portion 54, respectively, and face each other.
  • the light projecting unit 52a emits light toward the light receiving unit 52b.
  • the amplifier 52c is provided on the base 54, and is connected to the light projecting unit 52a and the light receiving unit 52b via an optical fiber (not shown).
  • the amplifier 52c sends the outgoing light to the light projecting unit 52a via the optical fiber, and receives the light incident on the light receiving unit 52b via the optical fiber.
  • the first sensor 51 and the second sensor 52 may be connected to the controller 100.
  • the control unit 113 of the controller 100 may be configured to acquire detection results output from the first sensor 51 and the second sensor 52.
  • the first sensor 51 and the second sensor 52 are not limited to transmissive optical sensors.
  • the first sensor 51 and the second sensor 52 may be any sensors as long as the first detection target and the second detection target can be detected.
  • the first sensor 51 and the second sensor 52 may be a reflection type optical sensor or a contact type sensor such as a dial gauge.
  • the first detection mechanism 40 and the second detection mechanism 50 are used in a state where the hand 20 and the hand 30 are overlapped.
  • the first sensor 51 detects the outer peripheral surface 42a in accordance with a change in the distance from the outer peripheral surface 42a to the axis Ax3. For example, when the outer peripheral surface 42a, moving away from the axis Ax3, enters between the light projecting unit 51a and the light receiving unit 51b and blocks a predetermined ratio of the light incident on the light receiving unit 51b, the first sensor 51 detects the outer peripheral surface 42a.
  • the predetermined ratio can be set as appropriate, for example, 50%.
  • the second sensor 52 detects the outer peripheral surface 43a according to a change in the relative angle of the hands 20 and 30 around the axis Ax3. Specifically, the second sensor 52 detects the outer peripheral surface 43a in response to the outer peripheral surface 43a entering between the light projecting unit 52a and the light receiving unit 52b as the relative angle changes. For example, the second sensor 52 detects the outer peripheral surface 43a when the outer peripheral surface 43a, entering between the light projecting unit 52a and the light receiving unit 52b, blocks a predetermined ratio of the light incident on the light receiving unit 52b.
  • the predetermined ratio can be set as appropriate, for example, 50%.
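The detection criterion described above amounts to thresholding the received-light fraction; an illustrative sketch, not the sensors' actual interface (the names below are hypothetical):

```python
# Illustrative threshold logic for a transmissive optical sensor such as the
# first sensor 51 or the second sensor 52: the detection target counts as
# detected once it blocks a predetermined ratio of the light travelling from
# the light projecting unit to the light receiving unit.

def target_detected(received_fraction: float, blocked_ratio: float = 0.5) -> bool:
    """received_fraction: received light / emitted light, in [0, 1]."""
    blocked = 1.0 - received_fraction
    return blocked >= blocked_ratio

print(target_detected(0.9))  # False: only 10% of the light is blocked
print(target_detected(0.4))  # True: 60% blocked exceeds the 50% threshold
```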
  • the robot system 1 may further include a third detection mechanism for causing the teaching value recording unit 112 (see FIG. 1) to acquire a teaching value as an example of the teaching jig.
  • the third detection mechanism 60 will be described in detail with reference to FIG.
  • the third detection mechanism 60 is disposed at a place where the hand 20 should be moved.
  • the third detection mechanism 60 is disposed at a place where the workpiece W to be held by the hand 20 is disposed.
  • the third detection mechanism 60 has a third detection target and a fourth detection target.
  • the third detection mechanism 60 includes a third convex portion 61, a fourth convex portion 62, and a support plate 63.
  • the support plate 63 is a plate having a size that can be disposed at a place where the workpiece W is disposed.
  • “up and down” in the description of the third detection mechanism 60 means up and down in a state in which the support plate 63 is disposed at a position where the workpiece W is disposed.
  • the third convex portion 61 is provided on the upper surface of the support plate 63 and has a columnar outer peripheral surface 61a perpendicular to the support plate 63.
  • the fourth convex portion 62 is provided on the upper surface of the support plate 63 at a position different from the third convex portion 61.
  • a circumferential surface 62 a concentric with the outer circumferential surface 61 a is formed on a portion of the circumferential surface of the fourth convex portion 62 facing the opposite side of the third convex portion 61.
  • the fourth convex portion 62 is arranged in accordance with the second detection mechanism 50.
  • the 4th convex part 62 arrange
  • the circumferential surface 62 a is arranged so as to be detected by the first sensor 51.
  • the support plate 63 may be circular or polygonal.
  • the support plate 63 and the outer peripheral surface 61a may be concentric.
  • the third detection mechanism 60 is used with the second detection mechanism 50 attached to the hand 20.
  • the outer peripheral surface 61 a functions as a third detection target and is a detection target by the second sensor 52.
  • the circumferential surface 62 a functions as a fourth detection target and is a detection target by the first sensor 51.
  • the teaching procedure for the robot system 1 includes causing the controller 100 to acquire the first relative value.
  • the teaching procedure may further include causing the controller 100 to acquire the second relative value.
  • a procedure for causing the controller 100 to acquire both the first relative value and the second relative value will be described in detail.
  • this procedure includes steps S01 to S06.
  • In step S01, the operator operates the robot system 1 so as to temporarily align the angles of the hands 20 and 30 around the axis Ax3.
  • the operator inputs a command to the console 200 to make the angles of the hands 20 and 30 around the axis Ax3 coincide with the same angle target value.
  • the controller 100 controls the robot 10.
  • the control unit 113 controls the robot 10 so that the angles of the hands 20 and 30 coincide with the same angle target value.
  • In step S02, after the execution of step S01, the operator attaches the second detection mechanism 50 and the first detection mechanism 40 to the hands 20 and 30, respectively.
  • the operator inputs an instruction to the console 200 to press the support plate 41 toward the claw portions 32 a and 33 a by the movable claw portion 34.
  • the controller 100 controls the robot 10.
  • the control unit 113 controls the robot 10 so that the movable claw unit 34 presses the support plate 41 toward the claw units 32a and 33a.
  • the controller 100 controls the robot 10.
  • the control unit 113 controls the robot 10 such that the movable claw unit 24 presses the support plate 53 against the claw units 22a and 23a.
  • Steps S03 and S04 are started when the operator inputs a command to acquire the second relative value to the console 200 and executed by the controller 100.
  • In step S03, the control unit 113 controls the robot 10 to change the relative angle of the hands 20 and 30 around the axis Ax3 by rotating at least one of the hands 20 and 30 around the axis Ax3 (see FIG. 12).
  • the “relative angle” here means a relative angle when it is assumed that the second relative value is zero.
  • In step S04, the control unit 113 acquires the second relative value based on the relative angle at the time when the outer peripheral surface 43a is detected by the second sensor 52, and records the second relative value in the teaching value recording unit 112.
  • When the hand 20 side is used as the reference, the control unit 113 acquires a value obtained by reversing the sign of the relative angle as the second relative value.
  • As described above, this procedure includes changing the relative angle of the hands 20 and 30 around the axis Ax3, and causing the controller 100 to acquire the second relative value based on the relative angle when the outer peripheral surface 43a is detected by the second sensor 52.
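The sweep in steps S03 and S04 can be sketched as below. The motion and sensor callbacks are hypothetical stand-ins for the controller interface, and the step size and sweep limit are likewise assumptions for illustration:

```python
def acquire_second_relative_value(rotate_hand_by, second_sensor_blocked,
                                  step_deg=0.01, max_sweep_deg=5.0):
    """Rotate one hand about axis Ax3 in small increments and return the
    relative angle at which the second sensor detects the outer peripheral
    surface 43a; that angle is the second relative value."""
    relative_angle = 0.0  # assumed zero when the sweep starts
    while abs(relative_angle) <= max_sweep_deg:
        if second_sensor_blocked():
            # With the other hand as reference, the sign would be reversed.
            return relative_angle
        rotate_hand_by(step_deg)
        relative_angle += step_deg
    raise RuntimeError("outer peripheral surface 43a was not detected")
```

The sweep stops at the first detection, so a smaller step size trades speed for angular resolution.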
  • In step S05, the operator rotates the first convex portion 42 (see FIG. 13). Thereby, the distance from the outer peripheral surface 42a to the axis Ax3 changes. The operator rotates the first convex portion 42 until the light beam LF1 from the light projecting portion 51a toward the light receiving portion 51b is blocked by the outer peripheral surface 42a and the first sensor 51 detects the outer peripheral surface 42a.
  • In step S06, the operator causes the controller 100 to acquire the first relative value based on the position of the outer peripheral surface 42a at the time it is detected by the first sensor 51.
  • Specifically, the operator determines the difference between the position at which the outer peripheral surface 42a should be detected by the first sensor 51 when the first relative value is zero and the position at which the outer peripheral surface 42a is actually detected by the first sensor 51.
  • the controller 100 acquires the first relative value.
  • the relative value recording unit 111 acquires and records the first relative value.
  • this procedure changes the distance from the outer peripheral surface 42a to the axis Ax3, and causes the controller 100 to acquire the first relative value based on the position of the outer peripheral surface 42a when detected by the first sensor 51.
  • the position of the outer peripheral surface 42a means the position in the direction along the center axis line CL2. This procedure may be executed before the factory shipment of the robot system 1 or may be executed when the robot system 1 is installed after the factory shipment.
  • the teaching procedure for the robot system 1 may further include causing the controller 100 to acquire a teaching value for correcting the positioning error of the hand 20.
  • a procedure for causing the controller 100 to acquire the teaching value will be described in detail.
  • this procedure includes steps S11 to S15.
  • In step S11, the operator attaches the second detection mechanism 50 to the hand 20 and arranges the third detection mechanism 60.
  • the procedure for mounting the second detection mechanism 50 is the same as that in step S02.
  • the operator places the third detection mechanism 60 at a place where the hand 20 should be moved.
  • An example of a place where the hand 20 should be moved is a place where the workpiece W to be held by the hand 20 is arranged.
  • the operator arranges the third detection mechanism 60 in the slot 71 of the cassette 70 for accommodating a plurality of workpieces W in multiple stages (see FIG. 15).
  • The operator arranges the fourth convex portion 62 on the back side of the third convex portion 61.
  • In step S12, the operator operates the robot system 1 so that the hand 20 is placed on the third detection mechanism 60.
  • the operator inputs a command for moving the hand 20 toward the third detection mechanism 60 to the console 200.
  • the controller 100 controls the robot 10.
  • the control unit 113 controls the robot 10 so as to move the hand 20 in accordance with a command input to the console 200.
  • the operator continues to operate the robot system 1 until the hand 20 is positioned on the third detection mechanism 60 and the third convex portion 61 and the fourth convex portion 62 are arranged between the finger portions 55 and 56.
  • Steps S13 to S15 are started when the operator inputs a command for acquiring a teaching value to the console 200, and is executed by the controller 100.
  • In step S13, the control unit 113 controls the robot 10 to rotate the hand 20 about the axis Ax3 (see FIG. 16).
  • the control unit 113 controls the robot 10 to stop the hand 20 when the light beam LF2 from the light projecting unit 52a toward the light receiving unit 52b is blocked by the outer peripheral surface 61a and the second sensor 52 detects the outer peripheral surface 61a.
  • In step S14, the control unit 113 controls the robot 10 to move the hand 20 along the central axis CL1 (see FIG. 17).
  • The control unit 113 controls the robot 10 so that the hand 20 stops when the circumferential surface 62a is detected by the first sensor 51.
  • In step S15, the control unit 113 acquires a teaching value for correcting the positioning error of the hand 20 based on the state of the robot 10 when the outer circumferential surface 61a is detected by the second sensor 52 and the circumferential surface 62a is detected by the first sensor 51, and records the teaching value in the teaching value recording unit 112.
  • the control unit 113 calculates the position of the hand 20 stopped through steps S13 and S14, and records this in the teaching value recording unit 112 as a teaching value.
  • the position of the hand 20 calculated by the control unit 113 is, for example, the position of the hand 20 in the coordinate system with the robot 10 as a reference.
  • As described above, this procedure includes moving the hand 20, and acquiring a teaching value for correcting the positioning error of the hand 20 based on the state of the robot 10 when an index whose position relative to the target position is known is detected by at least one sensor of the second detection mechanism 50.
  • This procedure is executed when the robot system 1 is installed after the factory shipment of the robot system 1. This procedure may be executed before or after steps S01 to S06.
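Steps S13 to S15 amount to a two-phase guarded search: rotate until the second sensor detects the outer peripheral surface 61a, advance along CL1 until the first sensor detects the circumferential surface 62a, then record the stopped pose. A minimal sketch follows; all callbacks are assumed stand-ins for the controller interface, not the actual control unit 113 API:

```python
def acquire_teaching_value(rotate_step, advance_step,
                           second_sensor_blocked, first_sensor_blocked,
                           current_pose, max_steps=100000):
    """Return the pose at which the hand stops after both detections;
    that pose is recorded as the teaching value."""
    for _ in range(max_steps):       # phase 1: rotate about Ax3 (step S13)
        if second_sensor_blocked():
            break
        rotate_step()
    else:
        raise RuntimeError("outer peripheral surface 61a was not detected")
    for _ in range(max_steps):       # phase 2: advance along CL1 (step S14)
        if first_sensor_blocked():
            break
        advance_step()
    else:
        raise RuntimeError("circumferential surface 62a was not detected")
    return current_pose()            # recorded as the teaching value (step S15)
```

The `for`/`else` clauses turn an exhausted search into an error instead of silently recording a pose that never triggered a sensor.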
  • the robot system 1 includes the robot 10 having the hands 20 and 30 rotatable around the axis Ax3 and the controller 100 for controlling the robot 10.
  • The controller 100 is configured to use, when controlling the robot 10 to move the hand 30, a first relative value indicating the relative difference between the distance from the hand 30 to the axis Ax3 and the distance from the hand 20 to the axis Ax3.
  • If the target position of the hand 20 is accurately determined, the target position of the hand 30 can also be accurately calculated using the first relative value. For this reason, it is not necessary to individually execute teaching for determining the target position for each of the hands 20 and 30. Therefore, the teaching work can be performed efficiently.
  • When controlling the robot 10 to move the hand 30, the controller 100 may be configured to further use a second relative value indicating the relative difference between the angle of the hand 20 around the axis Ax3 and the angle of the hand 30 around the axis Ax3. In this case, since the target position of the hand 30 can be calculated based on the relative differences in both the angular direction and the radial direction about the axis Ax3, the positioning accuracy of the hand 30 is improved.
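How the two relative values combine can be illustrated with a small polar-coordinate sketch about axis Ax3. The coordinate conventions and the function itself are assumptions for illustration, not the controller's actual computation:

```python
import math

def hand30_target(hand20_r, hand20_theta_deg, first_relative, second_relative_deg):
    """Derive the hand 30 target from the taught hand 20 target:
    the first relative value shifts the radial distance from Ax3,
    and the second relative value shifts the angle about Ax3."""
    r = hand20_r + first_relative
    theta_deg = hand20_theta_deg + second_relative_deg
    x = r * math.cos(math.radians(theta_deg))
    y = r * math.sin(math.radians(theta_deg))
    return (r, theta_deg), (x, y)
```

For example, a taught hand 20 target at radius 100 and angle 30 degrees, with relative values of +2 and -1 degree, yields a hand 30 target at radius 102 and angle 29 degrees.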
  • the robot system 1 may further include a first detection mechanism 40 provided in one of the hands 20 and 30 and a second detection mechanism 50 provided in the other hand.
  • the first detection mechanism 40 includes an outer peripheral surface 42a as a first detection target, and a variable mechanism VS1 that can change the distance from the outer peripheral surface 42a to the axis Ax3.
  • the second detection mechanism 50 includes at least one sensor including a first sensor 51 that detects the outer peripheral surface 42a in accordance with a change in the distance from the outer peripheral surface 42a to the axis Ax3.
  • the controller 100 may be configured to further execute obtaining the first relative value.
  • In this case, the first relative value can be easily determined based on the position of the outer peripheral surface 42a when it is detected by the first sensor 51, and acquired by the controller 100.
  • the teaching work can be performed more efficiently.
  • The first detection mechanism 40 may further include the outer peripheral surface 43a as a second detection target, and the at least one sensor of the second detection mechanism 50 may further include a second sensor 52 that detects the outer peripheral surface 43a in accordance with a change in the relative angle of the hands 20 and 30 around the axis Ax3.
  • the controller 100 may be configured to further execute obtaining the second relative value. In this case, the second relative value can also be easily detected and acquired by the controller. Therefore, since both the first relative value and the second relative value can be easily taught, the teaching work can be performed more efficiently.
  • The controller 100 may control the robot 10 to change the relative angle of the hands 20 and 30 around the axis Ax3, and may acquire the second relative value based on the relative angle when the outer peripheral surface 43a is detected by the second sensor 52. In this case, since the second relative value is automatically acquired, the teaching work can be performed more efficiently.
  • the first detection mechanism 40 may be provided detachably with respect to the one hand, and the second detection mechanism 50 may be provided detachably with respect to the other hand.
  • the configuration of the hands 20 and 30 can be simplified.
  • variable mechanism VS1 may constrain the outer peripheral surface 42a so that the change in the distance from the outer peripheral surface 42a to the axis Ax3 is accompanied by the movement of the outer peripheral surface 42a around the axis Ax3. In this case, since the operation amount for changing the distance from the outer peripheral surface 42a to the axis Ax3 is increased, fine adjustment of the distance is facilitated.
  • variable mechanism VS1 may hold the outer peripheral surface 42a so as to be rotatable around an axis Ax4 parallel to the axis Ax3, and the axis Ax4 may be eccentric with respect to the central axis CL3 of the outer peripheral surface 42a.
  • a configuration that increases the amount of operation for changing the distance from the outer peripheral surface 42a to the axis Ax3 can be realized with a simple rotation mechanism.
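The fine-adjustment property can be checked with a small geometric sketch: rotating the disc about the eccentric axis Ax4 moves the center CL3 of the surface 42a on a circle whose radius equals the eccentricity, so a full turn changes the distance to Ax3 by at most twice the eccentricity. The function and its simplified coplanar geometry are assumptions for illustration:

```python
import math

def surface_distance_to_ax3(dist_ax3_ax4, eccentricity, radius_42a, phi_deg):
    """Distance from axis Ax3 to the farthest point of the outer peripheral
    surface 42a after rotating the surface by phi about the eccentric axis Ax4.
    Assumes Ax3, Ax4, and CL3 lie in one plane at phi = 0."""
    phi = math.radians(phi_deg)
    cx = dist_ax3_ax4 + eccentricity * math.cos(phi)  # CL3 position relative to Ax3
    cy = eccentricity * math.sin(phi)
    return math.hypot(cx, cy) + radius_42a
```

With an eccentricity of 1, for instance, a half turn of the disc shifts the surface by only 2 in the radial direction, which is why a large rotation of the knob yields a small, easily controlled change in distance.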
  • the controller 100 may be configured to further use a teaching value for correcting a positioning error of the hand 20 when controlling the robot 10 to move the hand 30.
  • the positioning error of the hand 30 can also be corrected by using the teaching value for correcting the positioning error of the hand 20 and the first relative value. Therefore, the teaching work can be performed more efficiently.
  • The second detection mechanism 50 may be provided in the hand 20, and the controller 100 may further execute: controlling the robot 10 to move the hand 20; acquiring a teaching value for correcting the positioning error of the hand 20 based on the state of the robot 10 when an index whose position relative to the target position is known is detected by the at least one sensor of the second detection mechanism 50; and further using the teaching value for correcting the positioning error of the hand 20 when controlling the robot 10 to move the hand 30. In this case, since the teaching value is automatically acquired, the teaching work can be performed more efficiently.
  • This disclosure can be used for a robot system that automatically executes work on a workpiece.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A robot system (1) includes a robot (10) having hands (20, 30) rotatable around an axis (Ax3), and a controller (100) for controlling the robot (10). The controller (100) is configured so that, when the robot (10) is controlled to move the hand (30), a first relative value representing the relative difference between the distance from the hand (30) to the axis (Ax3) and the distance from the hand (20) to the axis (Ax3) is used.
PCT/JP2015/065439 2015-05-28 2015-05-28 Robot system, teaching jig, and teaching method Ceased WO2016189740A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/065439 WO2016189740A1 (fr) Robot system, teaching jig, and teaching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/065439 WO2016189740A1 (fr) Robot system, teaching jig, and teaching method

Publications (1)

Publication Number Publication Date
WO2016189740A1 true WO2016189740A1 (fr) 2016-12-01

Family

ID=57392660

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/065439 Ceased WO2016189740A1 (fr) 2015-05-28 2015-05-28 Système de robot, gabarit d'apprentissage et procédé d'apprentissage

Country Status (1)

Country Link
WO (1) WO2016189740A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002313872A * 2001-04-16 2002-10-25 Tokyo Electron Ltd Teaching method for substrate transfer means
JP2008502498A * 2004-06-09 2008-01-31 Brooks Automation Inc. Dual SCARA arm
JP2008302496A * 2006-07-04 2008-12-18 Panasonic Corp Robot arm control device and control method, robot, and robot arm control program
JP2009136981A * 2007-12-07 2009-06-25 Daihen Corp Robot control device
JP2013163231A * 2012-02-09 2013-08-22 Daihen Corp Robot control device
JP2014099576A * 2012-10-15 2014-05-29 Tokyo Electron Ltd Method for positioning transfer mechanism, method for calculating positional deviation amount of target object, and method for correcting teaching data of transfer mechanism


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020098851A (ja) * 2018-12-18 2020-06-25 Yaskawa Electric Corp Transfer system and transfer control method
CN111332833A (zh) * 2018-12-18 2020-06-26 Yaskawa Electric Corp Transfer system and transfer control method
JP7103200B2 (ja) 2018-12-18 2022-07-20 Yaskawa Electric Corp Transfer system and transfer control method

Similar Documents

Publication Publication Date Title
US10395956B2 (en) Substrate transfer apparatus and method of teaching substrate transfer robot
JP6637494B2 (ja) Robot teaching method and robot
JP2015168012A (ja) Teaching jig, teaching system, and teaching method
CN105382828A (zh) Robot system, robot teaching method, and robot teaching apparatus
JP2009090406A (ja) Robot target position detection device
US20170066137A1 (en) Control device, robot, and robot system
JP2015119066A (ja) Detection system and detection method
WO2011001675A1 (fr) Device for teaching a robot and method for teaching a robot
TW201702160A (zh) Transfer system, transfer robot, and teaching method thereof
JP2010162611A (ja) Relative teaching method
US10864643B2 (en) Substrate conveying apparatus
JP2016107378A (ja) Industrial robot and teaching method for industrial robot
KR102308091B1 (ko) Substrate transfer device and method for obtaining positional relationship between substrate transfer robot and substrate placement portion
JP6603289B2 (ja) Robot, robot system, and robot coordinate system setting method
KR102477371B1 (ko) Transfer system and transfer control method
KR102560896B1 (ко) Robot position correction method and robot
JP2013144325A (ja) Robot apparatus and failure detection method
JP2011018828A (ja) Position recognition device, position recognition method, and positioning device
WO2016189740A1 (fr) Robot system, teaching jig, and teaching method
KR102783064B1 (ко) Control device for substrate transfer robot and control method for joint motor
JP2012152898A (ja) Robot target position detection device, semiconductor device, and target position detection method
WO2022163084A1 (fr) Alignment device and alignment method
TWI661914B (zh) Robot point adjustment method and system
KR102508280B1 (ко) Rotation center correction device for tilting head
WO2019064890A1 (fr) Substrate transport device and method for finding the rotation axis of a substrate transport unit

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15893374

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15893374

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP