
WO2021048579A1 - System and machine tool - Google Patents

System and machine tool

Info

Publication number
WO2021048579A1
WO2021048579A1, PCT/IB2019/001004
Authority
WO
WIPO (PCT)
Prior art keywords
posture
robot
axis
camera
machine tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2019/001004
Other languages
English (en)
Japanese (ja)
Inventor
Yuta Oba (大場勇太)
Tsutomu Sakurai (櫻井努)
Hideki Nagasue (長末秀樹)
Masaaki Nakagawa (中川昌昭)
Kota Weaver (コータ ウィーバー)
Anand Parwal (アーナンド パルワル)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DMG Mori Co Ltd
Skylla Technologies
Original Assignee
DMG Mori Co Ltd
Skylla Technologies
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DMG Mori Co Ltd, Skylla Technologies
Priority to PCT/IB2019/001004 priority Critical patent/WO2021048579A1/fr
Priority to US17/642,573 priority patent/US12358148B2/en
Priority to CN202080064275.5A priority patent/CN114728414A/zh
Priority to JP2021526426A priority patent/JP7482364B2/ja
Priority to EP20864229.8A priority patent/EP4013578A4/fr
Priority to PCT/US2020/050073 priority patent/WO2021050646A1/fr
Publication of WO2021048579A1 publication Critical patent/WO2021048579A1/fr
Anticipated expiration legal-status Critical
Priority to JP2024059506A priority patent/JP2024096756A/ja
Ceased legal-status Critical Current

Links

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls

Definitions

  • This disclosure relates to a system comprising a machine tool that machines a workpiece, a robot that performs work on the machine tool, and a transport device on which the robot is mounted and which can move to a work position set for the machine tool.
  • In such a system, one robot moved by an automatic guided vehicle can perform work such as attaching and detaching workpieces for a plurality of machine tools.
  • The degree of freedom of the machine tool layout is increased, so the layout can be chosen to further improve production efficiency.
  • The equipment cost can be reduced.
  • However, since the automatic guided vehicle is self-propelled on wheels, its positioning accuracy when stopping at the work position is not necessarily high. Therefore, in order for the robot to perform accurate work on the machine tool, the posture of the robot when the automatic guided vehicle is positioned at the work position must be compared with the reference posture of the robot set at the time of the so-called teaching operation, which serves as the control standard; the amount of error is detected, and the working posture of the robot is corrected according to that amount of error.
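The compare-and-correct principle described above can be sketched in a few lines (a simplified illustration with hypothetical names; it ignores the rotation coupling that the detailed description treats with its own formulas):

```python
# Simplified illustration of the error-detection idea (names hypothetical):
# the pose actually reached at the work position is compared against the
# taught reference pose, and a taught target is shifted by the difference.

def pose_error(reference, actual):
    """Element-wise error (dx, dy, dtheta) between two (x, y, theta) poses."""
    return tuple(a - r for r, a in zip(reference, actual))

def correct_pose(target, error):
    """Shift a taught target pose to compensate a measured error."""
    return tuple(t - e for t, e in zip(target, error))
```

A target taught at (1.0, 2.0, 0.5) with a measured stop error of (0.02, -0.01, 0.001) would be commanded at roughly (0.98, 2.01, 0.499).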
  • For this purpose, a position correction method such as that disclosed in Japanese Patent Application Laid-Open No. 2016-221622 (Patent Document 2 below) has conventionally been used.
  • In this method, a visual target consisting of two calibration markers is placed on the outer surface of the machine tool, and the visual target is imaged by a camera provided on a movable part of the robot. Based on the captured image and on the position and orientation of the camera, the relative positional relationship between the robot and the machine tool is measured, and the working posture of the robot is corrected based on the measured relationship.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2017-132002
  • Patent Document 2: Japanese Patent Application Laid-Open No. 2016-221622
[Summary of Invention]
  • The automatic guided vehicle moves on wheels with a relatively high degree of freedom, so the mounting surface on which the robot is mounted tilts easily with respect to the floor surface, and that tilt changes easily according to changes in the posture of the robot, in other words, according to changes in the position of the robot's center of gravity.
  • In the conventional method, the visual target, which is a calibration marker, is arranged on the outer surface of the machine tool, and the position correction amount (posture correction amount) of the robot is acquired while the robot is outside the machine tool. Consequently, for movements executed while the robot hand is inside the machine tool, such as attaching and detaching the workpiece, the obtained correction amount cannot correct the posture of the robot accurately.
  • the robot hand cannot be accurately positioned with respect to the chuck.
  • In particular, when the movement allowance (stroke) of the gripping claws of the chuck described above is very small, that is, when the clearance between the workpiece and the chuck is very small, the workpiece may not be reliably gripped by the chuck.
  • The present disclosure provides a system comprising: a machine tool that performs predetermined machining on a workpiece; a robot that has a camera for capturing images and an acting part that acts on the workpiece, and that performs work on the machine tool; a transport device on which the robot is mounted and which is configured to be movable to a work position set for the machine tool; and a control device configured to control the robot according to an operation program containing preset operation commands.
  • Under this control, the robot takes a work start posture, then an imaging posture in which the camera faces an identification figure, and then sequentially takes one or more working postures in which the acting part is operated; the work start posture, the imaging posture, and the working postures are set in advance by a teaching operation of the robot.
  • The identification figure is formed on a predetermined plane and arranged in the machining area of the machine tool, and the control device stores in advance, as a reference image, the image of the identification figure captured by the camera during the teaching operation.
  • When the robot is operated according to the operation program, the transport device is moved to the work position, the robot is shifted from the work start posture to the imaging posture, and the identification figure is captured by the camera. Based on this image and the reference image, the control device estimates the positional error amounts of the camera, between the current posture of the robot and its posture during the teaching operation, in the directions of two mutually orthogonal axes set in a plane parallel to the identification figure, as well as the rotational error amount of the camera around the vertical axis orthogonal to that plane.
  • Based on each estimated error amount, the system calculates a correction amount for the acting part in the working posture and corrects the position of the acting part in the working posture based on the calculated correction amount.
  • According to this system, the operation of the robot is controlled according to the operation program: after the transport device is moved to the work position, the robot starts operating from the work start posture, then takes the imaging posture, in which the camera faces the identification figure for posture correction provided in the machine tool and captures it, and then operates so as to sequentially take one or more working postures.
  • the work start posture, the imaging posture, and the work posture are set in advance by teaching the robot.
  • the identification figure is formed on a predetermined plane and is arranged in the machining area of the machine tool. Also, when the robot takes the imaging posture, the camera faces the identification figure, that is, the camera is in a posture in which the lens and the identification figure are substantially parallel to each other.
  • the control device stores in advance the image of the identification figure captured by the camera as a reference image in a state where the robot is shifted to the imaging posture during the teaching operation.
  • Then, based on the image of the identification figure captured by the camera with the robot shifted from the work start posture to the imaging posture, and on the reference image, the control device estimates the positional error amounts of the camera, between the current posture of the robot and its posture during the teaching operation, in the two mutually orthogonal axial directions set in a plane parallel to the identification figure, as well as the rotational error amount of the camera around the vertical axis orthogonal to that plane; based on each estimated error amount, a correction amount for the acting part in the working posture is calculated, and the position of the acting part in the working posture is corrected based on the calculated correction amount.
  • In this way, the working posture of the robot is corrected using the identification figure placed in the machining area of the machine tool on which the robot actually works, so the working posture can be corrected accurately, and the robot can carry out accurately even work that requires high operational accuracy.
  • Moreover, the robot operating according to the operation program captures the identification figure with the camera in a single motion, so accurate correction can be performed in a shorter time than before.
  • Another aspect of the disclosure is a machine tool on which a robot works, the robot having a camera for capturing an image including an identification figure provided in the machine tool and an acting part that acts on a workpiece; the position of the acting part in the x-axis and y-axis directions of the three-dimensional space defined by the x-axis, y-axis, and z-axis is corrected by causing the acting part to perform linear movement and rotational movement, and after this position is corrected, the acting part acts on the workpiece inside the machine tool. The identification figure is arranged in the machining area of the machine tool.
  • the working posture of the robot is corrected by using the identification figure arranged in the machining area of the machine tool in which the robot actually works. Therefore, the working posture can be corrected accurately, so that the robot can perform the work with high accuracy even in the work requiring high operation accuracy.
  • FIG. 1 is a plan view showing a schematic configuration of a system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing a configuration of a system according to the present embodiment.
  • FIG. 3 is a perspective view showing an automatic guided vehicle and a robot according to the present embodiment.
  • FIG. 4 is an explanatory diagram for explaining the imaging posture of the robot according to the present embodiment.
  • FIG. 5 is an explanatory diagram showing an identification figure according to the present embodiment.
  • FIG. 6 is an explanatory diagram for explaining a correction amount calculation method in the present embodiment.
  • FIG. 7 is an explanatory diagram for explaining a correction amount calculation method in the present embodiment.
  • FIG. 8 is an explanatory diagram for explaining a correction amount calculation method in the present embodiment.
  • FIG. 9 is an explanatory diagram for explaining position correction in the present embodiment.
  • The system 1 of this example comprises a machine tool 10, a material stocker 20 and a product stocker 21 as peripheral devices, an automatic guided vehicle 35, a robot 25 mounted on the automatic guided vehicle 35, a camera 31 attached to the robot 25, and a control device 40 that controls the robot 25 and the automatic guided vehicle 35.
  • The machine tool 10 is provided with a spindle 11 to which a chuck 12 for gripping a workpiece W (W′) is mounted, the spindle 11 being installed along the vertical direction; it is a so-called vertical NC (numerically controlled) lathe capable of turning the workpiece W (W′).
  • A tool presetter 13 equipped with a contactor 14 and a support bar 15 supporting the contactor 14 is provided in the vicinity of the spindle 11; the support bar 15 is located on the axis of the spindle 11, and a display board 16 is provided so as to lie in a horizontal plane.
  • Fig. 4 shows the state in which the support bar 15 and the contactor 14 have advanced into the machining area; with the support bar 15 and the contactor 14 retracted into the storage area and the shutter 17 closed, the contactor 14 and the display board 16 are isolated from the machining area.
  • The identification figure of this example has a matrix structure in which a plurality of square pixels are arranged two-dimensionally, each pixel being displayed in white or black; in Fig. 5 the black pixels are shaded. Identification figures of this kind include so-called AR markers and AprilTags. If the identification figure is small, a lens may be provided over it so that an enlarged image can be captured by the camera 31 described later.
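The matrix structure described above can be modeled as a small binary grid, and one useful property of such fiducials is that an asymmetric interior pattern lets the observer recover the marker's in-plane rotation by trying the four 90-degree orientations. The sketch below is illustrative only (the pattern and grid size are hypothetical, not the patent's marker specification):

```python
# Minimal sketch of a binary identification figure (black = 1, white = 0);
# the 5x5 pattern and its border are hypothetical, not the patent's marker.

MARKER = [
    [1, 1, 1, 1, 1],
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

def rotate90(grid):
    """Rotate a square binary grid 90 degrees clockwise."""
    n = len(grid)
    return [[grid[n - 1 - c][r] for c in range(n)] for r in range(n)]

def find_rotation(observed, reference):
    """Return the number of 90-degree clockwise rotations (0-3) needed to
    bring the observed grid into coincidence with the reference, or None
    if no rotation matches (i.e. it is a different marker)."""
    g = observed
    for k in range(4):
        if g == reference:
            return k
        g = rotate90(g)
    return None
```

Because the interior pattern has no rotational symmetry, each of the four orientations is distinguishable, which is what makes a coarse in-plane rotation estimate possible from a single capture.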
  • The material stocker 20 is arranged on the left side of the machine tool 10 in Fig. 1 and is a device that stocks a plurality of materials (workpieces before machining) to be machined by the machine tool 10.
  • The product stocker 21 is arranged on the right side of the machine tool 10 in Fig. 1 and is a device that stocks a plurality of products or semi-finished products (machined workpieces W′) machined by the machine tool 10.
  • The automatic guided vehicle 35 carries the robot 25 on a mounting surface 36, which is its upper surface, and is fitted with an operation panel 37 that can be carried by an operator.
  • the operation panel 37 is equipped with an input / output unit for inputting / outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of displaying a screen.
  • The automatic guided vehicle 35 is equipped with a sensor (for example, a distance measurement sensor using laser light) that allows it to recognize its own position within the factory and, under the control of the control device 40, is configured to travel tracklessly through the factory, including the area in which the machine tool 10, the material stocker 20, and the product stocker 21 are arranged, passing through the respective work positions set for the machine tool 10, the material stocker 20, and the product stocker 21.
  • The robot 25 is an articulated robot with three arms: a first arm 26, a second arm 27, and a third arm 28. A hand 29 serving as an end effector is attached to the tip of the third arm 28, and one camera 31 is attached via a support bar 30.
  • The control device 40 comprises an operation program storage unit 41, a movement position storage unit 42, an operation posture storage unit 43, a map information storage unit 44, a reference image storage unit 45, a manual operation control unit 46, an automatic operation control unit 47, a map information generation unit 48, a position recognition unit 49, a correction amount calculation unit 50, and an input/output interface 51. The control device 40 is connected to the machine tool 10, the material stocker 20, the product stocker 21, the robot 25, the camera 31, the automatic guided vehicle 35, and the operation panel 37 through the input/output interface 51.
  • The control device 40 is composed of a computer including a CPU, RAM, ROM, and the like; the functions of the manual operation control unit 46, the automatic operation control unit 47, the map information generation unit 48, the position recognition unit 49, the correction amount calculation unit 50, and the input/output interface 51 are realized by a computer program, which carries out the processing described later.
  • The operation program storage unit 41, the movement position storage unit 42, the operation posture storage unit 43, the map information storage unit 44, and the reference image storage unit 45 are composed of an appropriate storage medium such as RAM.
  • In this example, the control device 40 is attached to the automatic guided vehicle 35 and is connected to the machine tool 10, the material stocker 20, and the product stocker 21 by appropriate communication means, and to the robot 25, the camera 31, the automatic guided vehicle 35, and the operation panel 37 by wire or wirelessly.
  • However, the present invention is not limited to this aspect, and the control device 40 may be arranged at an appropriate position other than on the automatic guided vehicle 35; in that case, the control device 40 is connected to each part by appropriate communication means.
  • The manual operation control unit 46 is a functional unit that operates the automatic guided vehicle 35, the robot 25, and the camera 31 according to operation signals input by the operator from the operation panel 37. That is, under the control of the manual operation control unit 46, the operator can manually operate the automatic guided vehicle 35, the robot 25, and the camera 31 using the operation panel 37.
  • The operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during production, and a map generation program for operating the automatic guided vehicle 35 when generating the in-factory map information described later.
  • the automatic operation program and the map generation program are, for example, input from the input / output unit provided on the operation panel 37 and stored in the operation program storage unit 41.
  • The automatic operation program includes command codes concerning the movement position, movement speed, and orientation of the automatic guided vehicle 35 as target positions to which the automatic guided vehicle 35 moves, command codes for the sequential operation of the robot 25, and command codes for the operation of the camera 31.
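The structure of such an automatic operation program can be pictured as an ordered list of command codes addressed to the vehicle, the robot, and the camera. The encoding below is purely hypothetical (the patent specifies no syntax); it only illustrates how one program can interleave commands for the three devices:

```python
# Hypothetical encoding of an automatic operation program as an ordered
# list of (command code, arguments) pairs; names and values are invented.

program = [
    ("MOVE_AGV",   {"target": "machine_tool_10", "speed": 0.5, "heading": 90}),
    ("ROBOT_POSE", {"posture": "work_start"}),
    ("ROBOT_POSE", {"posture": "imaging"}),
    ("CAMERA",     {"action": "capture_identification_figure"}),
    ("ROBOT_POSE", {"posture": "takeout_preparation"}),
]

def commands_for(code_name, prog):
    """Collect the argument sets of all command codes of one type."""
    return [args for code, args in prog if code == code_name]
```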
  • the map generation program includes a command code for running the automatic guided vehicle 35 without a track throughout the factory so that the map information generation unit 48 can generate map information.
  • The map information storage unit 44 is a functional unit that stores map information, including arrangement information of the machines, equipment, devices, and the like (devices, etc.) arranged in the factory in which the automatic guided vehicle 35 travels; this map information is generated by the map information generation unit 48.
  • The map information generation unit 48, under the control of the automatic operation control unit 47 of the control device 40 (described in detail later), executes the map generation program stored in the operation program storage unit 41, acquires spatial information inside the factory from the distance data detected by the sensor, and recognizes the planar shapes of the equipment installed in the factory. Based on, for example, pre-registered planar shapes of specific devices, it recognizes the positions, planar shapes, and the like (arrangement information) of the devices arranged in the factory, in this example the machine tool 10, the material stocker 20, and the product stocker 21.
  • The map information generation unit 48 then stores the obtained spatial information and device arrangement information in the map information storage unit 44 as map information of the factory.
  • The position recognition unit 49 is a functional unit that recognizes the position of the automatic guided vehicle 35 in the factory based on the distance data detected by the sensor and on the in-factory map information stored in the map information storage unit 44; the operation of the automatic guided vehicle 35 is controlled by the automatic operation control unit 47 based on the position recognized by this position recognition unit 49.
  • The movement position storage unit 42 is a functional unit that stores the movement positions, that is, the specific target positions, corresponding to command codes in the operation program, to which the automatic guided vehicle 35 moves; these movement positions include the work positions set for the machine tool 10, the material stocker 20, and the product stocker 21 described above. Each movement position is set, for example, by manually operating the automatic guided vehicle 35 from the operation panel 37 under the control of the manual operation control unit 46 to move it to the target position and then storing the position data recognized by the position recognition unit 49 in the movement position storage unit 42. This operation is the so-called teaching operation.
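The teaching flow for movement positions, drive the vehicle manually and then store the recognized pose under a name, can be sketched as follows (function and storage names are hypothetical stand-ins for the movement position storage unit and position recognition unit):

```python
# Minimal sketch of teaching a movement position (names hypothetical).

movement_positions = {}  # stand-in for the "movement position storage unit"

def recognize_position():
    """Stand-in for the position recognition unit: returns the pose
    (x, y, heading) currently recognized from sensor and map data."""
    return (12.5, 3.2, 90.0)

def teach_movement_position(name):
    """Store the currently recognized pose under a command-code name."""
    movement_positions[name] = recognize_position()
    return movement_positions[name]
```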
  • The operation posture storage unit 43 is a functional unit that stores data on the postures (operation postures) of the robot 25, which change sequentially as the robot 25 operates in a predetermined order, corresponding to the command codes in the operation program. The data on these operation postures are set by the teaching operation performed under the control of the manual operation control unit 46.
  • In this example, specific operation postures of the robot 25 are set for the material stocker 20, the machine tool 10, and the product stocker 21, respectively.
  • For the material stocker 20, the following are set as the take-out operation postures: the work start posture (take-out start posture); the working postures (take-out postures) for gripping an unmachined workpiece stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20; and the posture when the take-out is completed (take-out completion posture; in this example, the same as the take-out start posture).
  • For the machine tool 10, the following are set as the workpiece take-out operation postures: the work start posture before entering the machine tool 10; the posture in which the hand 29 and the camera 31 enter the machining area of the machine tool 10; the imaging posture in which the camera 31 faces the identification figure provided on the support bar 15 and images it with the camera 31 (see Fig. 4); the take-out preparation posture in which the hand 29 faces the machined workpiece gripped by the chuck 12 of the machine tool 10; the gripping posture in which the hand 29 is moved toward the chuck 12 and grips the workpiece; the removal posture in which the hand 29 is separated from the chuck 12 and the machined workpiece W′ is removed from the chuck 12; and the work completion posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10.
  • The posture of the camera 31 when it is made to face the horizontal identification figure is a posture in which its lens is substantially parallel to the identification figure.
  • Similarly, the following are set as the workpiece mounting operation postures: the posture in which the hand 29 and the camera 31 enter the machining area of the machine tool 10; the imaging posture in which the camera 31 faces the identification figure installed on the support bar 15 and images it with the camera 31 (see Fig. 4); the mounting preparation posture in which the unmachined workpiece gripped by the hand 29 faces the chuck 12 of the machine tool 10; the mounting posture in which the hand 29 is moved toward the chuck 12 so that the workpiece can be gripped by the chuck 12; the separation posture in which the hand 29 is separated from the chuck 12; and the work completion posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10.
  • For the product stocker 21, the work start posture (storage start posture) when work is started at the product stocker 21, the storage postures for storing the machined workpiece gripped by the hand 29 in the product stocker 21, and the storage completion posture when storage is completed are set.
  • The automatic operation control unit 47 is a functional unit that operates the automatic guided vehicle 35, the robot 25, and the camera 31 according to either the automatic operation program or the map generation program stored in the operation program storage unit 41, using the data stored in the movement position storage unit 42 and the operation posture storage unit 43 as necessary.
  • The reference image storage unit 45 is a functional unit that stores, as a reference image, the image obtained during the teaching operation by capturing the identification figure provided on the support bar 15 of the tool presetter 13 with the camera 31, with the automatic guided vehicle 35 positioned at the work position set for the machine tool 10 and the robot 25 in the imaging posture.
  • The correction amount calculation unit 50, under the control of the automatic operation control unit 47, calculates the correction amounts when the robot 25, automatically operated according to the automatic operation program stored in the operation program storage unit 41, takes the imaging posture and the identification figure is imaged by the camera 31; the calculation is based on the current image of the identification figure obtained during automatic operation and on the reference image captured during the teaching operation.
  • Fig. 6 shows an image of the identification figure captured by the camera 31 during the teaching operation, that is, the reference image.
  • the rectangular line shown by the solid line is the field of view of the camera 31, in other words, the outline of the reference image.
  • Fig. 7 shows the image of the current identification figure obtained during automatic operation with a solid line.
  • the rectangular line shown by the solid line is the outline of the current image
  • the rectangular line shown by the alternate long and short dash line is the outline of the reference image.
  • Fig. 7 shows that there is a discrepancy between the reference image and the current image, caused by the deviation between the imaging posture of the robot 25 during the teaching operation and the current imaging posture.
  • Here, the x-axis and y-axis are two mutually orthogonal axes parallel to the identification figure and form the coordinate system of the robot 25 during the teaching operation, and rz denotes rotation about the z-axis orthogonal to the x-axis and y-axis.
  • The x_t-axis, y_t-axis, x-axis, and y-axis are set in a horizontal plane (the same applies to the x′-axis and y′-axis described later).
  • The correction amount calculation unit 50 analyzes the current image in the same manner and, based on the figure coordinate system (the x_t-axis / y_t-axis coordinate system) set from the identification figure within the frame of the camera 31, calculates, according to the conversion formula mentioned above, the current position (x_curr, y_curr, rz_curr) of the camera 31 in the coordinate system (the x-axis / y-axis coordinate system) of the robot 25 during the teaching operation.
  • Next, the correction amount calculation unit 50 estimates, by the following formulas 1 to 3, the positional error amounts Δx, Δy and the rotational error amount Δrz between the position of the camera 31 during the teaching operation and its current position in the x-axis / y-axis coordinate system.
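Formulas 1 to 3 appear only as images in the original publication and are not reproduced in this text. A natural reading, offered here as an assumption rather than the patent's literal equations, is that each error amount is the difference between the current and taught camera coordinates:

```python
# Hedged sketch of the error estimation (formulas 1-3 are images in the
# original; taking current-minus-taught differences is an assumption).

def estimate_errors(curr, teach):
    """curr, teach: (x, y, rz) camera poses expressed in the teaching-time
    robot frame. Returns the error amounts (dx, dy, drz)."""
    dx = curr[0] - teach[0]
    dy = curr[1] - teach[1]
    drz = curr[2] - teach[2]
    return dx, dy, drz
```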
  • Here, the rotation angle rz about the z-axis can be obtained by extracting two suitable points separated by a predetermined distance from the image and calculating the angle of the line through them with respect to the coordinate axes.
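The two-point angle computation just described reduces to a single `atan2` call; comparing this angle between the reference image and the current image yields the rotation error about the z-axis (the point coordinates below are illustrative):

```python
import math

def rotation_about_z(p1, p2):
    """Angle (radians) of the line through two extracted marker points,
    measured from the x-axis; the change in this angle between the
    reference and current images gives the rotation error about z."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])
```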
  • On the other hand, the coordinate system of the current robot 25 is the x′-axis / y′-axis coordinate system and, as shown in Fig. 8, there is a translational error (t_x, t_y) between it and the x-axis / y-axis coordinate system, the coordinate system of the robot 25 at the time of teaching. Taking this into account, the current position (x′, y′) of the camera 31 in the x′-axis / y′-axis coordinate system is calculated by the following formula 4.
  • The translational errors t_x, t_y are then calculated by the following formula 5, which is a rearrangement of formula 4.
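Formulas 4 and 5 are likewise present only as images. Under a standard rigid-body reading (an assumption, not the patent's literal equations), formula 4 relates the camera position measured in the teaching frame to its position in the current robot frame through the rotation error and an unknown translation, p_teach = R(Δrz) · p_curr + t, and formula 5 solves that relation for the translation:

```python
import math

def translation_error(p_teach_frame, p_curr_frame, drz):
    """Recover the translation (tx, ty) between the teaching-time robot
    frame and the current robot frame, assuming the rigid-body relation
    p_teach = R(drz) @ p_curr + t (an assumed form of formulas 4 and 5).
    p_teach_frame: camera position measured in the teaching frame;
    p_curr_frame: the camera's nominal position in the current frame."""
    c, s = math.cos(drz), math.sin(drz)
    rx = c * p_curr_frame[0] - s * p_curr_frame[1]
    ry = s * p_curr_frame[0] + c * p_curr_frame[1]
    return p_teach_frame[0] - rx, p_teach_frame[1] - ry
```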
  • The correction amount calculation unit 50 calculates the translational error amounts t_x, t_y according to formula 5 and sets these translational error amounts t_x, t_y, together with the rotational error amount Δrz obtained as described above, as the correction amounts for the working postures.
  • Then, when the robot 25 works on the machine tool 10, the automatic operation control unit 47 corrects the position of the hand 29 of the robot 25 in each working posture, for example the take-out preparation posture, the gripping posture, and the removal posture among the workpiece take-out operation postures, and the mounting preparation posture, the mounting posture, and the separation posture among the workpiece mounting operation postures, based on the correction amounts calculated by the correction amount calculation unit 50.
  • In Fig. 9, the position (x_p, y_p) is the set position of the hand 29 in the x-axis / y-axis coordinate system, the robot coordinate system during the teaching operation: starting from the imaging posture, the hand 29 is set so as to be positioned at (x_p, y_p) with respect to the chuck 12.
  • If there were no positional deviation of the automatic guided vehicle 35 and the robot 25 positioned at the work position described above, the position of the hand 29 moved from the imaging posture in the x′-axis / y′-axis coordinate system, the robot coordinate system during the current operation, would coincide with the taught position; because such deviation exists, the target position of the hand 29 must be corrected using the error amounts estimated above.
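The patent's explicit formulas for the corrected hand position are not reproduced in this text; a consistent rigid-body reading (an assumption matching the error model sketched earlier, p_teach = R(Δrz) · p_curr + t) maps the taught target into the current robot frame by inverting that relation:

```python
import math

def corrected_target(p_set, drz, t):
    """Map a taught hand target p_set = (x, y), expressed in teaching-frame
    robot coordinates, into the current robot frame, compensating the
    estimated rotation error drz and translation error t = (tx, ty):
    p_curr = R(-drz) @ (p_set - t). This inversion is an assumption, not
    the patent's literal correction formula."""
    dx, dy = p_set[0] - t[0], p_set[1] - t[1]
    c, s = math.cos(-drz), math.sin(-drz)
    return (c * dx - s * dy, s * dx + c * dy)
```

With zero rotation error this reduces to simply subtracting the translation error from the taught target, which matches the intuition that the hand must be commanded relative to where the vehicle actually stopped.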
  • In the system 1 configured as described above, under the control of the automatic operation control unit 47 of the control device 40, the automatic operation program stored in the operation program storage unit 41 is executed, whereby, for example, the automatic guided vehicle 35 and the robot 25 operate as follows.
  • First, the automatic guided vehicle 35 moves to the work position set for the machine tool 10, and the robot 25 takes the work start posture of the workpiece take-out operation described above.
  • At this time, it is assumed that the machine tool 10 has completed the predetermined machining, that the door cover has been opened so that the robot 25 can enter the machining area, and that, upon receiving a command from the automatic operation control unit 47, the support bar 15 of the tool presetter 13 has been advanced into the machining area.
  • Next, the robot 25 shifts to the imaging posture, and the identification figure provided on the support bar 15 is imaged by the camera 31. When the identification figure has been imaged in this way, the correction amount calculation unit 50, based on the image of the identification figure and the reference image stored in the reference image storage unit 45, estimates, according to formulas 1 to 3 above, the positional error amounts Δx, Δy and the rotational error amount Δrz between the imaging posture of the robot 25 during the teaching operation and the current imaging posture, and, based on each estimated error amount, calculates, according to formulas 4 and 5 above, the correction amounts for the translational error t_x, t_y and for the rotational error Δrz.
  • Next, using the correction amounts calculated by the correction amount calculation unit 50, the automatic operation control unit 47 corrects the subsequent work take-out operation postures of the robot 25, that is, the above-mentioned take-out preparation posture, gripping posture, removal posture and work completion posture: the position of the hand 29 is corrected according to Equation 9 and its rotation position around the axis is corrected. The machined workpiece gripped by the chuck 12 of the machine tool 10 is then gripped by the hand 29 and taken out from the machine tool 10.
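Equation 9 itself is not reproduced in this excerpt, but the correction described above — shifting each taught hand position by the estimated translational error and rotating it by the estimated rotation error about the vertical axis — amounts to applying a planar rigid-body transform. The following is a minimal sketch of one common form of such a correction; the function name and the choice of rotating about the robot-coordinate origin are illustrative assumptions, not details from the patent.

```python
import math

def correct_pose(taught_x, taught_y, taught_theta, dx, dy, dtheta):
    """Apply a planar rigid-body correction (translation dx, dy and
    rotation dtheta about the vertical z-axis) to a taught hand pose.

    Here the rotation is applied about the origin of the robot
    coordinate system; a real controller might instead rotate about
    the camera or identification-figure frame.
    """
    c, s = math.cos(dtheta), math.sin(dtheta)
    # Rotate the taught position by dtheta, then translate by (dx, dy).
    x = c * taught_x - s * taught_y + dx
    y = s * taught_x + c * taught_y + dy
    theta = taught_theta + dtheta  # hand orientation about the z-axis
    return x, y, theta
```

With zero estimated error the taught pose is returned unchanged, which matches the case where the automatic guided vehicle is positioned exactly as it was during teaching.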
  • At this time, the chuck 12 is opened by transmitting a chuck opening command from the automatic operation control unit 47 to the machine tool 10.
  • Next, the automatic operation control unit 47 moves the automatic guided vehicle 35 to the working position set for the product stocker 21, and causes the robot 25 to sequentially take the storage start posture for starting work in the product stocker 21, each storage posture for storing the processed workpiece gripped by the hand 29 in the product stocker 21, and the storage completion posture when the storage is completed, thereby storing the processed workpiece gripped by the hand 29 in the product stocker 21.
  • Next, the automatic operation control unit 47 moves the automatic guided vehicle 35 to the working position set for the material stocker 20, and causes the robot 25 to sequentially take the take-out postures for the material concerned and the take-out completion posture when the take-out is completed, so that the hand 29 grips the pre-machining workpiece.
  • Next, the automatic operation control unit 47 again moves the automatic guided vehicle 35 to the working position set for the machine tool 10, and causes the robot 25 to prepare to start the workpiece mounting operation described above. Then, the robot 25 is moved to the imaging posture, and the identification figure provided on the support bar 15 is imaged by the camera 31. When the identification figure is imaged by the camera 31 in this way, the correction amount calculation unit 50 estimates, based on the image of the identification figure and the reference image stored in the reference image storage unit 45 and according to the above Equations 1 to 3, the position error amounts Δx, Δy and the rotation error amount Δθz between the imaging posture during the teaching operation of the robot 25 and the current imaging posture, and, based on each estimated error amount, calculates according to the above Equations 4 to 15 the translational error correction amounts and the rotation error correction amount Δθz for the subsequent workpiece mounting motion postures of the robot 25.
  • Next, using the correction amounts calculated by the correction amount calculation unit 50, the automatic operation control unit 47 corrects the subsequent workpiece mounting operation postures of the robot 25, that is, the above-mentioned mounting preparation posture, mounting posture, separation posture and work completion posture: the position of the hand 29 is corrected according to Equation 9 and its rotation position around the axis is corrected. The robot 25 then attaches the pre-machining workpiece gripped by the hand 29 to the chuck 12 of the machine tool 10.
  • After the pre-machining workpiece has been attached to the chuck 12 of the machine tool 10, the automatic operation control unit 47 causes the robot 25 to move out of the machine.
  • the automatic operation control unit 47 sends a machining start command to the machine tool 10 to cause the machine tool 10 to perform the machining operation.
  • At this time, the automatic operation control unit 47 sends a chuck closing command to the machine tool 10 to close the chuck 12, so that the pre-machining workpiece is gripped by the chuck 12.
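The take-out portion of the cycle just described (move the carriage, image the identification figure, apply the correction to each taught posture, open the chuck) can be sketched as a short orchestration. Every callable and name below is an illustrative assumption for exposition, not an interface defined in the patent.

```python
def workpiece_removal_sequence(move_to, take_posture, estimate_error, open_chuck):
    """Sketch of the work take-out sequence, with the device actions
    passed in as callables. Returns the list of corrected postures
    commanded, in order."""
    move_to("machine_tool_work_position")      # position the carriage
    take_posture("imaging", correction=None)   # shift to the imaging posture
    correction = estimate_error()              # (dx, dy, dtheta) from the figure
    commanded = []
    # Each subsequent taught posture is executed with the same correction.
    for posture in ("takeout_preparation", "gripping", "removal", "work_complete"):
        take_posture(posture, correction=correction)
        commanded.append(posture)
    open_chuck()                               # chuck opening command to the machine
    return commanded
```

Because the error is estimated once per approach, all postures of that approach share one correction, which is what allows the correction to be completed in a single imaging operation.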
  • In this way, since the working postures of the robot 25 are corrected using the identification figure placed in the machining area of the machine tool 10 where the robot 25 actually works, the working postures can be corrected accurately; as a result, the robot 25 can accurately execute even work that requires high motion accuracy.
  • Further, since the robot 25 operating according to the operation program executes this correction in a single operation when the identification figure is imaged by the camera 31, accurate correction can be performed in a shorter time than with conventional methods.
  • Further, since the identification figure has a matrix structure in which a plurality of pixels are arranged two-dimensionally, the position error amounts Δx, Δy and the rotation error amount Δθz can be estimated with high accuracy and high repeatability.
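Equations 1 to 3 are not reproduced in this excerpt, but one common way to estimate Δx, Δy and Δθz from such a matrix figure is a least-squares rigid fit (2-D Kabsch algorithm) between the figure's cell corners as seen in the reference image and in the current image. The sketch below assumes the point correspondences have already been extracted; it is a generic technique, not necessarily the patent's formulas.

```python
import numpy as np

def estimate_planar_error(ref_pts, cur_pts):
    """Estimate the rigid transform (dx, dy, dtheta) that maps reference
    marker points to currently observed points, in the least-squares
    sense (2-D Kabsch). ref_pts, cur_pts: (N, 2) arrays of corresponding
    point coordinates."""
    ref_c = ref_pts.mean(axis=0)
    cur_c = cur_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (ref_pts - ref_c).T @ (cur_pts - cur_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    dtheta = np.arctan2(R[1, 0], R[0, 0])
    dx, dy = cur_c - R @ ref_c
    return dx, dy, dtheta
```

Averaging over the many corners of a two-dimensional matrix pattern is what gives the high accuracy and repeatability the passage above refers to: the more correspondences, the smaller the influence of per-corner detection noise on the fitted transform.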
  • The correction amount calculation unit 50 may also be configured to estimate the amount of position error in the z-axis direction and calculate a corresponding correction amount, and the automatic operation control unit 47 may be configured to correct the z-axis position in each working posture of the robot 25 based on the calculated correction amount. The amount of position error in the z-axis direction can be calculated, for example, from the ratio between the size of the reference image and the size of the current image.
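As a concrete reading of that remark, under a pinhole-camera assumption (not spelled out in the text) the apparent size of a planar figure is inversely proportional to its distance from the camera, so the size ratio between the reference and current images yields the z-axis error directly. A minimal sketch:

```python
def z_error_from_scale(z_ref, ref_size_px, cur_size_px):
    """Estimate the z-axis position error from the apparent size of the
    identification figure. Under a pinhole camera model,
        size_px ~ focal_length * real_size / z,
    so the current camera-to-figure distance follows from the size ratio.
    z_ref: taught camera-to-figure distance; sizes are in pixels."""
    z_cur = z_ref * ref_size_px / cur_size_px
    # Positive result: the camera is farther from the figure than at
    # teaching time (the figure appears smaller).
    return z_cur - z_ref
```

For example, if the figure spanned 50 px at teaching time and 40 px now, with a taught distance of 100 mm, the estimated current distance is 125 mm, i.e. a z-axis error of 25 mm.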
  • In the above embodiment, an embodiment using the automatic guided vehicle 35 has been illustrated, but the present invention is not limited to this; the transport device may be one that can be pushed and moved by a person, like a general trolley. In such an embodiment, the robot 25 is mounted on this transport device, the transport device is manually moved to the working position of the machine tool 10, and the robot 25 is made to attach and detach workpieces to and from the machine tool 10.
  • In the above embodiment, a vertical lathe was illustrated as the machine tool, but the machine tool is not limited to this; any previously known machine tool can be applied, including horizontal lathes, vertical and horizontal machining centers, and multi-tasking machines having a tool spindle and a workpiece spindle.
  • In the above embodiment, the x-axis and y-axis of the robot coordinate system are set in the horizontal plane and the z-axis is set in the vertical direction, but the direction of the coordinate axes can be set arbitrarily.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

  • The present invention comprises: a machine tool 10; a robot 25 having a camera 31; and a transport device 35 on which the robot 25 is mounted, an identification figure being disposed inside a machining area of the machine tool 10.
PCT/IB2019/001004 2019-09-11 2019-09-11 Système et machine-outil Ceased WO2021048579A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
PCT/IB2019/001004 WO2021048579A1 (fr) 2019-09-11 2019-09-11 Système et machine-outil
US17/642,573 US12358148B2 (en) 2019-09-11 2020-09-10 Robot-mounted moving device, system, and machine tool
CN202080064275.5A CN114728414A (zh) 2019-09-11 2020-09-10 装有机器人的移动装置、系统和机床
JP2021526426A JP7482364B2 (ja) 2019-09-11 2020-09-10 ロボット搭載移動装置及びシステム
EP20864229.8A EP4013578A4 (fr) 2019-09-11 2020-09-10 Dispositif mobile monté sur robot, système et machine-outil
PCT/US2020/050073 WO2021050646A1 (fr) 2019-09-11 2020-09-10 Dispositif mobile monté sur robot, système et machine-outil
JP2024059506A JP2024096756A (ja) 2019-09-11 2024-04-02 ロボット搭載移動装置、及びその制御方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2019/001004 WO2021048579A1 (fr) 2019-09-11 2019-09-11 Système et machine-outil

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
USPCT/US2019/050641 Continuation-In-Part 2019-09-11 2019-09-11

Related Child Applications (2)

Application Number Title Priority Date Filing Date
USPCT/US2019/050641 Continuation-In-Part 2019-09-11 2019-09-11
US17/642,573 Continuation-In-Part US12358148B2 (en) 2019-09-11 2020-09-10 Robot-mounted moving device, system, and machine tool

Publications (1)

Publication Number Publication Date
WO2021048579A1 true WO2021048579A1 (fr) 2021-03-18

Family

ID=74866149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/001004 Ceased WO2021048579A1 (fr) 2019-09-11 2019-09-11 Système et machine-outil

Country Status (2)

Country Link
JP (2) JP7482364B2 (fr)
WO (1) WO2021048579A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (zh) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 一种推车式机器人上下料自动定位方法

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024068115A (ja) * 2022-11-04 2024-05-17 Dmg森精機株式会社 ロボット搭載移動装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5572102A (en) * 1995-02-28 1996-11-05 Budd Canada Inc. Method and apparatus for vision control of welding robots
US20090096148A1 (en) * 2007-10-10 2009-04-16 Denso Corporation Workpiece grasper and workpiece transfer apparatus using the same
US20160059419A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Robot apparatus and method for controlling robot apparatus
US20160082598A1 (en) * 2013-05-17 2016-03-24 Loxin 2002, S.L. Head and automated mechanized method with vision
CN109927012A (zh) * 2019-04-08 2019-06-25 清华大学 移动抓取机器人和自动取货方法

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3998741B2 (ja) * 1995-07-12 2007-10-31 ファナック株式会社 ロボットの移動制御方法
JPH11156764A (ja) * 1997-11-28 1999-06-15 Denso Corp 移動ロボット装置
JP2010162635A (ja) 2009-01-14 2010-07-29 Fanuc Ltd 自走式ロボットの位置および姿勢の補正方法
DE102009031459A1 (de) * 2009-03-25 2010-09-30 Mvt Montage- Und Vorrichtungstechnik Gmbh Anlage zur Montage und/oder Bearbeitung von Werkstücken
JP2015095020A (ja) * 2013-11-11 2015-05-18 株式会社デンソーアイティーラボラトリ 画像付二次元コード生成装置、画像付二次元コード生成方法、及び画像付二次元コード生成プログラム
JP6196563B2 (ja) 2014-02-18 2017-09-13 株式会社Ihi ロボットの遠隔操作システムと方法
JP6235664B2 (ja) 2015-09-14 2017-11-22 ファナック株式会社 ロボットの機構パラメータを校正するために使用される計測装置
JP2017071033A (ja) 2015-10-09 2017-04-13 キヤノン株式会社 作業用基準物体、作業用基準物体の製造方法、ロボットアームの調整方法、ビジョンシステム、ロボット装置、及び指標用部材
JP6664830B2 (ja) 2015-10-13 2020-03-13 富士電機株式会社 製造システム
DE102015120058B3 (de) * 2015-11-19 2017-03-02 Kuka Roboter Gmbh Ankoppeleinrichtung und Verfahren zum Ankoppeln einer mobilen Prozesseinrichtung
JP2017208032A (ja) * 2016-05-20 2017-11-24 株式会社テララコード研究所 真贋判定方法
JP6490037B2 (ja) 2016-10-04 2019-03-27 ファナック株式会社 移動可能な台車に支持されたロボットを備えるロボットシステム
CN109937386B (zh) 2016-11-16 2022-04-26 株式会社牧野铣床制作所 机床系统
JP6538751B2 (ja) * 2017-05-18 2019-07-03 ファナック株式会社 プログラミング装置及びロボット制御方法
CN107689061A (zh) 2017-07-11 2018-02-13 西北工业大学 用于室内移动机器人定位的规则图形码及定位方法
JP6669713B2 (ja) 2017-11-28 2020-03-18 ファナック株式会社 ロボットおよびロボットシステム

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5572102A (en) * 1995-02-28 1996-11-05 Budd Canada Inc. Method and apparatus for vision control of welding robots
US20090096148A1 (en) * 2007-10-10 2009-04-16 Denso Corporation Workpiece grasper and workpiece transfer apparatus using the same
US20160082598A1 (en) * 2013-05-17 2016-03-24 Loxin 2002, S.L. Head and automated mechanized method with vision
US20160059419A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Robot apparatus and method for controlling robot apparatus
CN109927012A (zh) * 2019-04-08 2019-06-25 清华大学 移动抓取机器人和自动取货方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (zh) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 一种推车式机器人上下料自动定位方法
CN116175256B (zh) * 2023-04-04 2024-04-30 杭州纳志机器人科技有限公司 一种推车式机器人上下料自动定位方法

Also Published As

Publication number Publication date
JP2022530589A (ja) 2022-06-30
JP7482364B2 (ja) 2024-05-14
JP2024096756A (ja) 2024-07-17

Similar Documents

Publication Publication Date Title
JP6785931B1 (ja) 生産システム
US12358148B2 (en) Robot-mounted moving device, system, and machine tool
CN111182178B (zh) 摄像装置以及机床
JP7133604B2 (ja) ロボット搭載移動装置及びシステムの位置決め制御方法
CN111470309A (zh) 跟随机器人及作业机器人系统
JP2024096756A (ja) ロボット搭載移動装置、及びその制御方法
US12304076B2 (en) Setting method using teaching operation
WO2022091767A1 (fr) Procédé de traitement d'images, dispositif de traitement d'images, dispositif de transfert de type monté sur un robot et système
JP7093881B1 (ja) システム、及び自動搬送車
JP6832408B1 (ja) 生産システム
JP6906724B1 (ja) 工作機械
JP6937444B1 (ja) ロボットシステムの位置決め精度測定方法
JP7015949B1 (ja) 貼着位置測定装置及びこれを備えた工作機械
JP2024068115A (ja) ロボット搭載移動装置
JPH04269147A (ja) ワーク供給装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19945112

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19945112

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP