
WO2021111868A1 - Coordinate system alignment method, alignment system, and alignment device for robot - Google Patents

Coordinate system alignment method, alignment system, and alignment device for robot

Info

Publication number
WO2021111868A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
space coordinate
measuring instrument
operating space
robot
Prior art date
Legal status
Ceased
Application number
PCT/JP2020/042870
Other languages
French (fr)
Japanese (ja)
Inventor
勇政 坪井
Current Assignee
Sick KK
Original Assignee
Sick KK
Priority date
Filing date
Publication date
Application filed by Sick KK filed Critical Sick KK
Priority to EP20896987.3A priority Critical patent/EP4070923A4/en
Priority to JP2021562554A priority patent/JP7273185B2/en
Priority to US17/781,141 priority patent/US12290924B2/en
Publication of WO2021111868A1 publication Critical patent/WO2021111868A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/26 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01B11/27 Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39024 Calibration of manipulator
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39045 Camera on end effector detects reference pattern

Definitions

  • The present invention relates to a technique for aligning the coordinate system that serves as the reference for controlling the motion of a robot, which performs a predetermined operation on an object, with the coordinate system of a three-dimensional measuring instrument attached to the robot.
  • An industrial robot is used on the factory production line to improve the efficiency of product assembly and processing.
  • An industrial robot includes an arm whose tip can be moved three-dimensionally, and a jig attached to the tip to perform a predetermined treatment on an object to be processed. Further, a three-dimensional measuring instrument is attached to the tip of the arm, and for example, the object to be processed is three-dimensionally measured by an optical cutting method to specify the position and orientation thereof.
  • The coordinate system of an industrial robot is set in advance by the manufacturer, for example at the time of manufacturing, and the operation of the arm is controlled based on that coordinate system.
  • The three-dimensional measuring instrument, on the other hand, outputs measurement results based on its own coordinate system. Therefore, in order to move the jig to the target position on the object to be processed, whose position and orientation have been specified by the three-dimensional measuring instrument, the coordinate system of the robot and the coordinate system of the three-dimensional measuring instrument must be aligned (this is also called calibration).
  • The problem to be solved by the present invention is to provide a technique that allows the robot coordinate system to be aligned easily even by a person who is not an expert.
  • The first aspect of the present invention, made to solve the above problems, is a method of aligning the robot coordinate system, which is the coordinate system of a robot that moves an operating point three-dimensionally, with the coordinate system of a three-dimensional measuring instrument that can execute the optical cutting method and whose position and orientation with respect to the operating point are invariant. The method includes: a step of moving the operating point to each of a first reference point located at the origin of the operating space coordinate system, which is the coordinate system of the operating space of the operating point, and a second reference point and a third reference point located respectively on two mutually orthogonal straight lines passing through the first reference point, and teaching the position of each reference point, thereby determining the relationship between the operating space coordinate system and the robot coordinate system; a step of irradiating a rectangular parallelepiped reference object, fixed so that each of its sides is parallel to one of the three orthogonal axes of the operating space coordinate system, with sheet-shaped slit light from the three-dimensional measuring instrument and acquiring the projection lines formed on the reference object by the slit light; a step of obtaining the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection lines; and a step of moving the three-dimensional measuring instrument so that its posture falls within a predetermined reference posture range.
  • the above posture means an inclination or orientation with respect to a reference such as an axis of a coordinate system or a predetermined surface.
  • A three-dimensional measuring instrument capable of performing the optical cutting method refers to a measuring instrument that irradiates an object with slit light whose attitude with respect to the coordinate system of the measuring instrument is known, and that can acquire information on the three-dimensional shape of the object from the data of the projection lines formed on each surface of the object by the slit light. Further, the operating space refers to the space in which the operating point of the robot operates.
  • The operating point of the robot is moved to each of the first to third reference points and the position of each reference point is taught, whereby the relationship between the robot coordinate system and the operating space coordinate system is determined. That is, the first reference point and the second reference point define the first axis of the operating space coordinate system, the first reference point and the third reference point define the second axis, and the axis orthogonal to these two axes is defined as the third axis.
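  • The construction just described can be sketched as follows, assuming the taught positions of the reference points are available as coordinate vectors in the robot coordinate system; the function name and the numeric values are illustrative only and not part of the embodiments.

    import numpy as np

    def frame_from_reference_points(a, b, c):
        # a: first reference point (origin), b: second reference point on the first axis,
        # c: third reference point on the second axis, all given in robot coordinates.
        a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
        x = (b - a) / np.linalg.norm(b - a)        # first axis of the operating space coordinate system
        y = c - a
        y -= x * np.dot(x, y)                      # remove any small non-orthogonality from teaching
        y /= np.linalg.norm(y)                     # second axis
        z = np.cross(x, y)                         # third axis, orthogonal to the first two
        T = np.eye(4)
        T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, a
        return T                                   # maps operating-space coordinates to robot coordinates

    # Illustrative taught positions (robot coordinates, mm)
    T_robot_from_space = frame_from_reference_points([100, 50, 0], [200, 50, 0], [100, 150, 0])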
  • A rectangular parallelepiped reference object, fixed so that each of its sides is parallel to one of the three orthogonal axes of the operating space coordinate system, is irradiated with slit light from the three-dimensional measuring instrument, and the projection lines formed on each surface of the reference object by the slit light are acquired.
  • The posture of the three-dimensional measuring instrument with respect to the reference object is obtained based on the acquired profile of the projection lines.
  • The profile of the projection lines includes, for example, the length of the projection line on the upper surface of the reference object and the distance between that projection line and the projection line on the surface of the plate-like member (the surface on which the reference object is placed).
  • The three-dimensional measuring instrument is then moved so that its posture falls within a predetermined reference posture range. A specific method for obtaining the posture of the three-dimensional measuring instrument from the profile of the projection lines will be described later.
  • In the following, a case will be described in which the reference object is arranged on the plane containing two of the axes of the operating space coordinate system (referred to here as the X axis and the Y axis) so as to be parallel to the Z axis, the sheet-shaped slit light has a width in the X' axis direction of the measuring instrument coordinate system and is irradiated in the Z' axis direction, and the reference posture range is set so that the posture of the three-dimensional measuring instrument becomes substantially parallel to the reference object (that is, so that the three axes of the measuring instrument coordinate system become substantially parallel to the corresponding axes of the operating space coordinate system).
  • In this case, the length of the projection line of the slit light on the upper surface of the reference object becomes a predetermined value based on the height of the reference object, and these projection lines are parallel to the X' axis of the three-dimensional measuring instrument.
  • If the measuring instrument coordinate system is tilted with respect to the operating space coordinate system around one of the in-plane axes (defined here as the Y axis), the inclination, with respect to the X' axis of the three-dimensional measuring instrument, of the projection lines on the upper surface of the reference object and on the surface of the plate-shaped member (the plane on which the reference object is fixed) becomes larger. In this case, a projection line of the slit light may also appear on the side surface of the reference object.
  • Further, the greater the tilt of the measuring instrument coordinate system with respect to the operating space coordinate system around the other horizontal axis (defined here as the X axis), the greater the distance between the projection line on the upper surface of the reference object and the projection line on the surface of the plate-like member. From this information, therefore, the posture of the three-dimensional measuring instrument with respect to the reference object (in this example, the inclination of each axis of the measuring instrument coordinate system with respect to the corresponding axis of the operating space coordinate system) can be obtained.
  • The three-dimensional measuring instrument is then moved so that all of these inclinations fall below a threshold value.
  • This can be done, for example, by the user operating the controller while visually checking the posture of the three-dimensional measuring instrument with respect to the reference object, moving the measuring instrument so that it becomes parallel to the reference object, and confirming that all of the inclinations at that position are below the above threshold.
  • In this way, the posture of the three-dimensional measuring instrument can be set within the reference posture range (in this example, the state in which the directions in which the three sides of the reference object extend are parallel to the three axes of the measuring instrument coordinate system).
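  • The profile quantities referred to above can be extracted, for example, from the endpoint coordinates of the detected projection lines. The following is a minimal sketch under the assumption that each line has already been reduced to two endpoints given as (x', z') coordinates in the measuring instrument coordinate system; the point format and function name are illustrative, and the embodiments described later compute the angles α, β, and γ instead.

    import math

    def projection_line_profile(top_p1, top_p2, plate_p1, plate_p2):
        # Each argument is an (x', z') endpoint of a detected projection line.
        (x1, z1), (x2, z2) = top_p1, top_p2
        length = math.hypot(x2 - x1, z2 - z1)                         # length of the line on the upper surface
        inclination_deg = math.degrees(math.atan2(z2 - z1, x2 - x1))  # tilt with respect to the X' axis
        plate_z = (plate_p1[1] + plate_p2[1]) / 2.0
        spacing = abs((z1 + z2) / 2.0 - plate_z)                      # spacing between the two lines
        return length, inclination_deg, spacing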
  • Since the alignment method according to the first aspect of the present invention does not require matrix operations using a programming language, the robot coordinate system can be easily aligned even by a person who is not an expert.
  • In the above alignment method, for example, the position and posture of the three-dimensional measuring instrument are determined so that it assumes a predetermined reference posture with respect to a rectangular parallelepiped reference object arranged with each of its sides parallel to one of the three orthogonal axes of a single operating space coordinate system, and the object is measured three-dimensionally at that position and posture. That is, in the above alignment method, one position and orientation of the three-dimensional measuring instrument are determined with respect to one operating space coordinate system, and the three-dimensional measurement data of the object are acquired at that position and orientation.
  • However, an object to be assembled or processed on a factory production line has a three-dimensional shape, and if its three-dimensional measurement data are acquired at only one position and posture, there are shadowed portions for which no information on the shape of the object can be obtained. It is therefore necessary to acquire information on the shape of the object from a plurality of directions.
  • Therefore, the above alignment method is preferably configured so that: a standard operating space coordinate system and a reference operating space coordinate system are set as the operating space coordinate systems; the operating point is moved between the origin of the standard operating space coordinate system and the origin of the reference operating space coordinate system to obtain the translational displacement and the rotational displacement; a transformation matrix for converting the reference operating space coordinate system into the standard operating space coordinate system is obtained based on these displacements; in each of the two coordinate systems, the three-dimensional measuring instrument is moved so that its posture falls within the predetermined reference posture range and three-dimensional measurement data of the object are acquired; and the three-dimensional measurement data of the object acquired in the reference operating space coordinate system are converted into three-dimensional measurement data in the standard operating space coordinate system using the transformation matrix and integrated with the three-dimensional measurement data of the object acquired in the standard operating space coordinate system.
  • In this configuration, a standard operating space coordinate system and a reference operating space coordinate system are set. The operating point of the robot is then moved between the origins of the two coordinate systems, the translational displacement and the rotational displacement due to that movement are acquired, and a transformation matrix for converting the reference operating space coordinate system into the standard operating space coordinate system is obtained based on these displacements. After that, three-dimensional measurement data of the object are acquired by the three-dimensional measuring instrument in each of the standard operating space coordinate system and the reference operating space coordinate system, and the data acquired in the reference operating space coordinate system are converted into three-dimensional measurement data in the standard operating space coordinate system using the above transformation matrix and integrated.
  • As a result, the object can be measured three-dimensionally at a plurality of positions and postures, and shape information of the object without shadowed portions can be obtained.
  • The movable range of the operating point of a robot is determined by the length of its arm, and if the object is large relative to the robot, the arm of a single robot cannot reach the far side of the object, so it may not be possible to acquire three-dimensional measurement data of part of the object.
  • Further, a configuration may be adopted in which, for each of a plurality of robots, a plurality of operating space coordinate systems are set, one of which is shared with an operating space coordinate system set for at least one other robot. One of all the operating space coordinate systems set for the plurality of robots is taken as the standard operating space coordinate system, and the others are taken as reference operating space coordinate systems. Transformation matrices for converting the reference operating space coordinate systems into the standard operating space coordinate system are created; three-dimensional measurement data of the object are acquired in each of the plurality of operating space coordinate systems set for each of the plurality of robots; and the data acquired in the reference operating space coordinate systems are converted into three-dimensional measurement data in the standard operating space coordinate system and integrated.
  • In this configuration, since the three-dimensional measurement data of the object are acquired using a plurality of robots, three-dimensional shape information without shadowed portions can be obtained even when the object is larger than the robots.
  • A fourth aspect of the present invention, made to solve the above problems, is a system for aligning the robot coordinate system, which is the coordinate system of a robot that moves an operating point three-dimensionally, with the coordinate system of a three-dimensional measuring instrument that can execute the optical cutting method and whose position and orientation with respect to the operating point are invariant. The system includes: an operating space coordinate system setting unit that determines the relationship between the operating space coordinate system and the robot coordinate system by moving the operating point to each reference point and teaching the position of each reference point; a projection line acquisition unit that irradiates a rectangular parallelepiped reference object, fixed so that each of its sides is parallel to one of the three orthogonal axes of the operating space coordinate system, with sheet-shaped slit light from the three-dimensional measuring instrument and acquires the projection lines formed on the reference object by the slit light; a measuring instrument posture calculation unit that obtains the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection lines; and a measuring instrument moving unit for moving the three-dimensional measuring instrument.
  • In the alignment system having the fourth aspect described above, the operating space coordinate system setting unit may set a standard operating space coordinate system and a reference operating space coordinate system as the operating space coordinate systems, and the system may further include: a displacement acquisition unit that moves the operating point between the origin of the standard operating space coordinate system and the origin of the reference operating space coordinate system to acquire the translational displacement and the rotational displacement; a transformation matrix calculation unit that obtains a transformation matrix for converting the reference operating space coordinate system into the standard operating space coordinate system based on the translational displacement and the rotational displacement; a three-dimensional measurement data acquisition unit that acquires three-dimensional measurement data of the object after the three-dimensional measuring instrument has been moved by the measuring instrument moving unit so that its posture falls within the predetermined reference posture range; and a three-dimensional measurement data integration unit that converts the three-dimensional measurement data of the object acquired in the reference operating space coordinate system into three-dimensional measurement data in the standard operating space coordinate system using the transformation matrix and integrates them with the three-dimensional measurement data of the object acquired in the standard operating space coordinate system.
  • The alignment system according to the sixth aspect of the present invention has the configuration of the fifth aspect and is further equipped with a plurality of robots. The operating space coordinate system setting unit sets, for each of the plurality of robots, a plurality of operating space coordinate systems, one of which is shared with an operating space coordinate system set for at least one other robot, and sets one of the operating space coordinate systems set for the plurality of robots as the standard operating space coordinate system and the others as reference operating space coordinate systems. The transformation matrix calculation unit creates transformation matrices for converting the reference operating space coordinate systems into the standard operating space coordinate system among the coordinate systems set for the plurality of robots. The three-dimensional measurement data acquisition unit acquires three-dimensional measurement data of the object in each of the plurality of operating space coordinate systems set for each of the plurality of robots, and the three-dimensional measurement data integration unit can convert the three-dimensional measurement data of the object acquired in the reference operating space coordinate systems into three-dimensional measurement data in the standard operating space coordinate system and integrate them.
  • A seventh aspect of the present invention, made to solve the above problems, is an alignment device for the robot coordinate system, which is the coordinate system of a robot that moves an operating point three-dimensionally, and the operating space coordinate system of the operating point associated with it. The device includes: a projection line acquisition unit that irradiates a rectangular parallelepiped reference object, fixed so that each of its sides is parallel to one of the three orthogonal axes of the operating space coordinate system, with sheet-shaped slit light from the three-dimensional measuring instrument and acquires the projection lines formed on the reference object by the slit light; and a measuring instrument posture calculation unit that obtains the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection lines.
  • According to the present invention, the robot coordinate system can be easily aligned even by a non-expert.
  • A schematic configuration diagram of a robot system including a first embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
  • A schematic configuration diagram of a robot system including a second embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
  • A schematic configuration diagram of a robot system including a third embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
  • A flowchart relating to the third embodiment of the coordinate system alignment method for a robot according to the present invention.
  • A diagram explaining the relationship between the operating space coordinate systems in the third embodiment.
  • A diagram explaining the relationship between the arrangement of the robots and the operating space coordinate systems in a modification of the third embodiment.
  • FIG. 1 is a configuration diagram of a main part of a robot system 1 including a coordinate system alignment system and device for a robot according to the first embodiment (hereinafter, referred to as “alignment system” and “alignment device”).
  • the robot system 1 of the first embodiment includes a robot 2 and a controller 3 for operating the robot 2.
  • The robot 2 includes an arm portion 21 provided with an operating point 20 at its tip, linear motion mechanisms 22a to 22c for moving the arm portion 21 three-dimensionally, and rotation mechanisms 23a to 23c.
  • the linear motion mechanisms 22a to 22c and the rotation mechanisms 23a to 23c operate under the control of the controller 3.
  • the controller 3 has an operating space coordinate system setting unit 31 as a functional block for determining the relationship between the operating space coordinate system and the robot coordinate system by teaching described later.
  • a three-dimensional sensor 4 is detachably attached in the vicinity of the operating point 20 of the arm portion 21.
  • the three-dimensional sensor 4 includes a slit light irradiation unit 41 that irradiates an object with slit light, and an area sensor 42 that captures a projection line of the slit light onto the object.
  • A measuring instrument coordinate system is defined for these components. As shown in FIG. 3, the slit light irradiation unit 41 is arranged tilted by an angle θ1 with respect to the X'-Z' plane of the measuring instrument coordinate system, and the area sensor 42 is arranged on the opposite side of the X'-Z' plane from the slit light irradiation unit 41, at an angle θ2 with respect to the same plane.
  • Although θ1 is 0 in the first embodiment, this is not an essential requirement of the present invention; as long as the angle θ1 is known, it can be determined how the projection line on the object is captured by the area sensor 42, and the processing described below can therefore be performed by appropriate arithmetic processing based on that known angle.
  • That is, in the first embodiment, the slit light irradiation unit 41 is arranged on the X'-Z' plane, and the slit light is emitted along the X'-Z' plane (slit light having a width in the X' axis direction is emitted toward the object in the Z' axis direction).
  • ⁇ 2 is set to an appropriate angle in the range of, for example, 30 ° to 40 °.
  • Lens aberrations and depth aberrations (aberrations caused by perspective within the angle of view when the camera is oriented obliquely) arising from the lens included in the optical system of the three-dimensional sensor 4 are detected and corrected. As a result, accurate measurement is performed in the optical cutting method described later, and the alignment accuracy of the coordinate systems is improved.
  • the three-dimensional sensor 4 is further provided with a control / processing unit 43.
  • the control / processing unit 43 includes a projection line acquisition unit 432, a measuring instrument posture calculation unit 433, and a coordinate system shift unit 434 as functional blocks.
  • The control / processing unit 43 of the first embodiment is configured as an arithmetic processing mechanism incorporated inside the three-dimensional sensor 4. Further, the control / processing unit 43 is connected, via a predetermined communication interface 44, to an input unit 45 through which the user gives appropriate input instructions and to a display unit 46 for displaying measurement results and the like.
  • the control / processing unit 43 may also be configured by a portable terminal or the like provided separately from the three-dimensional sensor 4 and capable of communicating with the three-dimensional sensor 4.
  • the alignment jig 5 includes a plate-shaped member 51 and a block 52 fixed on the plate-shaped member 51.
  • The plate-shaped member 51 is fixed in a position where its upper surface coincides with the plane containing two axes (the X axis and the Y axis) of the operating space coordinate system, which is the coordinate system of the space in which the operating point 20 of the robot 2 operates.
  • A reference point A is set at the position on the upper surface of the plate-shaped member 51 corresponding to the origin of the operating space coordinate system, a reference point B is set at a predetermined position on the X axis of the operating space coordinate system, and a reference point C is set at a predetermined position on the Y axis of the operating space coordinate system. That is, the reference points A and B define the X axis direction of the operating space coordinate system, and the reference points A and C define the Y axis of the operating space coordinate system.
  • the size of the block 52 is W (mm) in the X-axis direction, L (mm) in the Y-axis direction, and H (mm) in the Z-axis direction.
  • the length of the block 52 in the X-axis direction is preferably as long as possible within the field of view of the area sensor 42. As a result, the projection line of the slit light reflected on the upper surface of the block 52 becomes longer when the process described later is performed, and the accuracy when determining the posture of the three-dimensional sensor 4 with respect to the block 52 becomes higher.
  • the length of the block 52 in the Y-axis direction is, for example, 10 to 100 mm, and may be appropriately determined in consideration of the size of the robot and the like.
  • This also makes it easier to visually recognize whether the projection line of the slit light is parallel to one side of the block 52 (the side parallel to the X axis of the operating space coordinate system) when the block 52 is irradiated with the slit light. If the block 52 is made too short in the Y axis direction, the projection line of the slit light tends to cross one side of the block 52 (the side parallel to the X axis of the operating space coordinate system), and it becomes difficult to obtain a projection line crossing the upper surface of the block 52.
  • Conventionally, a jig with as sharp a tip as possible has been used in order to position the operating point accurately, but a sharp-tipped jig easily causes diffuse reflection of light, and it was sometimes difficult to capture an image when confirming the positional relationship between the operating point and the tip of the jig.
  • In the present embodiment, since the coordinate systems are aligned by the method and device described later, it is not necessary to use a jig with a sharp tip; a rectangular parallelepiped block 52 can be used instead, so diffuse reflection does not occur and the projection lines and the like described later are imaged easily and accurately.
  • the surface of the block 52 is subjected to a matte treatment to suppress diffused reflection of light.
  • As the matte treatment, for example, a block made of matte alumite (anodized aluminum with a matte finish) can be preferably used.
  • the block 52 is arranged parallel to the three axes of the operating space coordinate system so that the center of the bottom surface of the block 52 is located at a predetermined coordinate in the operating space coordinate system. In this way, the alignment jig is installed (step 1. see FIG. 5).
  • the controller 3 moves the arm portion 21, moves the operating point 20 to the reference point A, and teaches the robot 2 the position of the reference point A.
  • the operating points 20 are also moved to the reference points B and C, and their positions are taught to the robot 2 (step 2).
  • the user may operate the controller 3 by himself / herself, or the operating point 20 may be moved to each reference point by the operating space coordinate system setting unit 31.
  • The operating space coordinate system setting unit 31 registers the coordinates of the reference points A, B, and C in the operating space coordinate system in the robot 2, and determines the relationship between the operating space coordinate system and the robot coordinate system registered in advance in the robot 2 (for example, at the time of shipment). In this way, the operating space coordinate system is set in the robot 2 (step 3).
  • the arm portion 21 is moved by the controller 3 and the three-dimensional sensor 4 is positioned above the block 52 as shown in FIG. 5 (step 4).
  • the robot 2 other than the arm portion 21 is not shown.
  • the slit light irradiation unit 41 irradiates the block 52 with slit light (step 5), and the area sensor 42 acquires a projection line to the block 52 (step 6).
  • The slit light in the first embodiment has a width in the direction of one axis (the X' axis) of the measuring instrument coordinate system, which is the reference for the output signal of the three-dimensional sensor 4, and is emitted in the direction of another axis (the Z' axis) of the measuring instrument coordinate system. That is, it is sheet-shaped slit light along the X'-Z' plane of the measuring instrument coordinate system (see FIG. 6).
  • The three-dimensional sensor 4 of the first embodiment corrects lens aberrations and depth aberrations (aberrations caused by perspective within the angle of view when the camera is oriented obliquely) arising from the lens included in its optical system. Specifically, the positions of the two-dimensionally arranged pixels on which the slit light is incident are identified, and those positions are then corrected by calibration. The profile of the projection line is acquired from the corrected pixel positions.
  • The three sides of the block 52 are arranged parallel to the three axes of the operating space coordinate system, and the slit light is sheet-like light along the X'-Z' plane. Therefore, if the X, Y, and Z axes of the operating space coordinate system and the X', Y', and Z' axes of the measuring instrument coordinate system are parallel to each other, the projection line on the upper surface of the block 52 has the same length W as the side of the block 52 parallel to the X axis, and the distance between the projection line on the upper surface of the block 52 and the projection line on the plate-shaped member 51 is determined by the height H of the block 52 (and by the angle at which the area sensor 42 of the three-dimensional sensor 4 captures the block and the plate-shaped member 51).
  • FIG. 6 shows a state in which the jig 5 (plate-shaped member 51 and block 52) is irradiated with slit light from an oblique direction; the projection lines are drawn with these angles exaggerated relative to their actual values.
  • FIG. 7 is a display example created based on the projection line measured by the area sensor 42.
  • The area sensor 42 is arranged at a position tilted by the angle θ2 with respect to the X'-Z' plane and photographs the projection lines from diagonally above. Therefore, the distance between the projection line on the upper surface of the block 52 and the projection line on the plate-shaped member 51 as captured by the area sensor 42 is shorter than the actual distance. The projection line acquisition unit 432 therefore corrects this distance based on the arrangement of the slit light irradiation unit 41 and the area sensor 42 (the angles θ1 and θ2 with respect to the X'-Z' plane) and the actually measured distance, and obtains the measured value h. The length of the projection line on the upper surface of the block 52, on the other hand, is used as the measured value w as it is. Then, as shown in FIG. 7, the measured values w and h are displayed on the display unit 46.
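  • A greatly simplified version of this correction can be sketched as follows, assuming a pinhole camera, θ1 = 0, and no lens distortion, so that the spacing seen by the area sensor is simply foreshortened by sin θ2; the correction actually performed by the projection line acquisition unit 432 is based on the full arrangement of the slit light irradiation unit 41 and the area sensor 42 and on calibration.

    import math

    def corrected_profile(w_image_mm, d_image_mm, theta2_deg=35.0):
        # d_image_mm: spacing between the projection line on the block top and the one
        # on the plate-shaped member as it appears to the area sensor. Viewing the
        # slit plane at angle theta2 foreshortens this spacing, so dividing by
        # sin(theta2) gives an estimate of the true spacing h (simplified assumption).
        theta2 = math.radians(theta2_deg)
        h = d_image_mm / math.sin(theta2)
        w = w_image_mm            # the line length on the upper surface is used as-is
        return w, h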
  • Further, the angle formed by the straight line that passes through one end of the projection line on the upper surface of the block 52 (the bending point of the projection line when the projection line on the side surface of the block 52 is also photographed) and is parallel to the Z' axis of the measuring instrument coordinate system, and the straight line that passes through the same end and is parallel to the Z axis of the operating space coordinate system, is defined as α; the angle formed by one direction (the horizontal direction) of the angle of view of the area sensor 42 and the projection line captured by the area sensor 42 is defined as β; and the angle formed by the projection line on the upper surface of the block 52 and the straight line that passes through one end of that projection line and is parallel to the X axis of the operating space coordinate system is defined as γ.
  • the measuring instrument posture calculation unit 433 calculates the above angles ⁇ , ⁇ , and ⁇ from the projection line profile as described above (step 7), and displays each angle on the screen of the display unit 46.
  • The measuring instrument posture calculation unit 433 determines whether all of these values are equal to or less than a predetermined value (step 8).
  • The predetermined value is set to a value at which each axis of the operating space coordinate system and the corresponding axis of the measuring instrument coordinate system can be regarded as substantially parallel to each other.
  • the predetermined value is, for example, 0.2 degrees. If the angles ⁇ , ⁇ , and ⁇ are all equal to or less than the predetermined values (YES in step 8), the process proceeds to step 10 described later.
  • step 8 if any of the angles exceeds a predetermined value (NO in step 8), the user adjusts the posture of the three-dimensional sensor 4 using the controller 3 (step 9). Then, returning to step 5, the block 52 is irradiated with slit light again, the angles ⁇ , ⁇ , and ⁇ are calculated and displayed on the display unit 46, and it is determined whether all of them are equal to or less than a predetermined value ( Steps 5-8). These processes are repeated until the angles ⁇ , ⁇ , and ⁇ are all equal to or less than a predetermined value. In the first embodiment, each time the angles ⁇ , ⁇ , and ⁇ are calculated, those values are displayed on the display unit 46.
  • By checking the angles α, β, and γ, the user can intuitively grasp how far the measuring instrument coordinate system deviates from the operating space coordinate system. The user can also decide, while watching how these values change and considering the accuracy required for the work to be performed using the robot 2, to what extent the two coordinate systems need to be matched.
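  • The check performed in steps 7 to 9 can be summarized by a small loop such as the following; measure_angles and prompt_user_to_adjust are hypothetical stand-ins for the measurement by the three-dimensional sensor 4 (steps 5 to 7) and the manual adjustment via the controller 3 (step 9), and the 0.2 degree threshold is the example value given above.

    ANGLE_THRESHOLD_DEG = 0.2   # example threshold from the text

    def posture_within_reference_range(alpha_deg, beta_deg, gamma_deg,
                                       threshold_deg=ANGLE_THRESHOLD_DEG):
        # Step 8: all posture angles must be at or below the threshold.
        return all(abs(a) <= threshold_deg for a in (alpha_deg, beta_deg, gamma_deg))

    def align_posture(measure_angles, prompt_user_to_adjust):
        while True:
            alpha, beta, gamma = measure_angles()        # steps 5-7
            print(f"alpha={alpha:.3f}, beta={beta:.3f}, gamma={gamma:.3f}")
            if posture_within_reference_range(alpha, beta, gamma):   # step 8
                return
            prompt_user_to_adjust()                      # step 9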
  • Next, the coordinate system shift unit 434 scans the slit light from the three-dimensional sensor in the Y' axis direction (which is substantially the same as the Y axis, since the operating space coordinate system and the measuring instrument coordinate system are now parallel) to acquire three-dimensional measurement data of the entire block 52, and obtains, in the measuring instrument coordinate system, the coordinate position of a predetermined position of the block 52 (for example, the position of the bottom surface of the block 52 on the X-Y plane of the operating space coordinate system) (step 10).
  • This is compared with the coordinate position of that predetermined position of the block 52 in the operating space coordinate system to obtain the difference (shift amount) (step 11). Finally, the measuring instrument coordinate system is matched to the operating space coordinate system by shifting the measuring instrument coordinate system by this shift amount.
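  • Once the axes of the two coordinate systems have been made parallel, steps 10 and 11 amount to a simple vector difference; the following sketch uses hypothetical positions in millimetres.

    import numpy as np

    def shift_amount(block_pos_instrument, block_pos_operating_space):
        # Steps 10-11: difference between the block's known position in the operating
        # space coordinate system and its measured position in the measuring
        # instrument coordinate system (axes already made parallel).
        return np.asarray(block_pos_operating_space, dtype=float) - np.asarray(block_pos_instrument, dtype=float)

    def instrument_to_operating_space(point_instrument, shift):
        # Convert a measured point to operating space coordinates by applying the shift.
        return np.asarray(point_instrument, dtype=float) + shift

    # Hypothetical example values (mm)
    shift = shift_amount([12.3, -4.1, 0.0], [50.0, 100.0, 0.0])
    p = instrument_to_operating_space([20.0, 30.0, 15.0], shift)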
  • the robot system 100 of the second embodiment includes a control / processing device 6 in addition to the configuration of the alignment system 1 of the first embodiment.
  • the control / processing device 6 is provided so as to be able to communicate with the robot 2, the three-dimensional sensor 4, and the controller 3.
  • The control / processing device 6 includes, as functional blocks, a displacement acquisition unit 621, a transformation matrix creation unit 622, a three-dimensional measurement data acquisition unit 623, and a three-dimensional measurement data integration unit 624.
  • the substance of the control / processing device 6 is, for example, a general personal computer, and each of the above functional blocks is embodied by executing the pre-installed three-dimensional measurement data processing program 62.
  • In the second embodiment, the control / processing unit 43 of the three-dimensional sensor 4 and the control / processing device 6 are provided separately, but a configuration in which part or all of the functions of the control / processing unit 43 of the three-dimensional sensor 4 are incorporated into the control / processing device 6 can also be adopted.
  • In the first embodiment, one position and orientation of the three-dimensional sensor 4 are determined for one operating space coordinate system, and three-dimensional measurement data of the object are acquired at that position and orientation. Therefore, when the object has a three-dimensional shape, the portion on the far side of the object from the three-dimensional sensor 4 is shadowed and cannot be measured three-dimensionally.
  • In the second embodiment, the user first sets a plurality of operating space coordinate systems corresponding to the postures of the robot 2 by the same procedure as in the first embodiment. These operating space coordinate systems are set so that the portion that is shadowed in the three-dimensional measurement of the first embodiment can also be measured three-dimensionally. Then, one of these operating space coordinate systems is used as the standard operating space coordinate system, and the other operating space coordinate systems are used as reference operating space coordinate systems (step 21).
  • In the second embodiment, two reference operating space coordinate systems (a first reference operating space coordinate system and a second reference operating space coordinate system) are set. In the following, the standard operating space coordinate system is referred to as Σ A, the first reference operating space coordinate system as Σ B, and the second reference operating space coordinate system as Σ C. The number of reference operating space coordinate systems may be one, or three or more.
  • Next, the position and posture of the three-dimensional sensor 4 in each operating space coordinate system are determined by the same procedure as in the first embodiment (step 22).
  • A T B is a homogeneous transformation matrix, and can be represented by the following 4 × 4 matrix, which includes the rotation transformation B R A and the translation B P A as elements:

        A T B = | B R A   B P A |
                | 0 0 0     1   |
  • First, the translational displacement and the rotational displacement, expressed in the standard operating space coordinate system Σ A, of the operating point 20 when it is located at the origin of the first reference operating space coordinate system Σ B are acquired.
  • The transformation matrix creation unit 622 creates a homogeneous transformation matrix A T B based on these translational and rotational displacements (step 24). This homogeneous transformation matrix A T B transforms coordinates in the first reference operating space coordinate system Σ B into coordinates in the standard operating space coordinate system Σ A.
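  • A sketch of how such a homogeneous transformation matrix can be built from the acquired displacements and applied to measured points is given below, assuming the rotational displacement is available as a 3 × 3 rotation matrix R and the translational displacement as a 3-element vector p; the function names are illustrative.

    import numpy as np

    def homogeneous_transform(R, p):
        # 4x4 matrix with the rotation R in the upper-left block and the
        # translation p in the upper-right column, bottom row (0 0 0 1).
        T = np.eye(4)
        T[:3, :3] = np.asarray(R, dtype=float)
        T[:3, 3] = np.asarray(p, dtype=float)
        return T

    def transform_points(T, points):
        # Apply the transform to an (N, 3) array of points measured in the source frame.
        pts = np.asarray(points, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])
        return (homog @ T.T)[:, :3]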
  • Similarly, a homogeneous transformation matrix A T C for converting the second reference operating space coordinate system Σ C into the standard operating space coordinate system Σ A is obtained.
  • The three-dimensional measurement data acquisition unit 623 acquires three-dimensional measurement data of the object in each of the standard operating space coordinate system Σ A, the first reference operating space coordinate system Σ B, and the second reference operating space coordinate system Σ C (step 25). The three-dimensional measurement data integration unit 624 then converts the three-dimensional measurement data of the object acquired in the first reference operating space coordinate system Σ B and the second reference operating space coordinate system Σ C into three-dimensional measurement data in the standard operating space coordinate system Σ A using the homogeneous transformation matrices A T B and A T C, respectively (step 26, FIG. 11).
  • The three-dimensional measurement data integration unit 624 integrates the three-dimensional measurement data of the object thus converted into the standard operating space coordinate system Σ A with the three-dimensional measurement data of the object acquired directly in the standard operating space coordinate system Σ A (step 27), and displays the result on the screen of the display unit 66 (step 28).
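  • Continuing the sketch above, the conversion and integration of steps 26 and 27 then reduce to applying A T B and A T C to the respective data sets and concatenating the results with the data acquired in Σ A; the arrays and matrices below are placeholders.

    import numpy as np

    cloud_A = np.zeros((0, 3))        # data acquired in Sigma_A
    cloud_B = np.zeros((0, 3))        # data acquired in Sigma_B
    cloud_C = np.zeros((0, 3))        # data acquired in Sigma_C
    T_AB = np.eye(4)                  # placeholder homogeneous transform Sigma_B -> Sigma_A
    T_AC = np.eye(4)                  # placeholder homogeneous transform Sigma_C -> Sigma_A

    merged = np.vstack([cloud_A,
                        transform_points(T_AB, cloud_B),     # step 26
                        transform_points(T_AC, cloud_C)])    # step 26
    # 'merged' is the integrated data in Sigma_A (step 27)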
  • The user can confirm the shape of the object from an arbitrary direction by rotating the three-dimensional measurement data of the object displayed on the screen of the display unit 66 through an appropriate operation via the input unit 65.
  • FIG. 12 shows the result of integrating the three-dimensional measurement data of the object acquired by the three-dimensional sensor 4 at the positions and orientations corresponding to the three different operating space coordinate systems.
  • FIG. 12 (a) is an optical image of the object
  • FIG. 12 (b) is the result of integrating the three-dimensional measurement data of the object acquired in three different coordinate systems.
  • As described above, in the second embodiment, the operating point 20 of the robot 2 is moved between the origin of the standard operating space coordinate system and the origin of each reference operating space coordinate system to acquire the translational and rotational displacements, and the transformation matrices for converting the reference operating space coordinate systems into the standard operating space coordinate system are obtained based on these displacements.
  • As methods of integrating three-dimensional measurement data acquired in different coordinate systems, the techniques described in Patent Documents 2 to 4 have been conventionally proposed.
  • With those techniques, however, the resolution of the three-dimensional measurement data may decrease due to, for example, shear deformation or scaling.
  • In the second embodiment, the operating space coordinate systems are set using the alignment jig 5 as in the first embodiment. Therefore, unlike Patent Documents 2 to 4, shear deformation and scaling do not occur, and the three-dimensional measurement data can be integrated easily while maintaining high resolution and accuracy.
  • the robot system 200 of the third embodiment includes a plurality of robots 2a and 2b, controllers 3a and 3b that control the operations of the robots 2a and 2b, and a control / processing device 7.
  • the robots 2a and 2b and the controllers 3a and 3b have the same configurations as the robots 2 and the controller 3 in the first and second embodiments.
  • the control / processing device 7 has the same configuration and functional blocks as the control / processing device 6 in the second embodiment.
  • the configuration is provided with two robots for the sake of simplicity, but a system including three or more robots 2 may be used.
  • The movable range of the operating point 20 of the robot 2 in the second embodiment is determined by the lengths of the linear motion mechanisms 22a to 22c that move the arm portion 21, and if the object is larger than the robot 2, the arm of a single robot 2 may not reach the far side of the object, so it may not be possible to acquire three-dimensional measurement data of part of the object. Further, with a system using only one robot 2, three-dimensional measurement data of the object cannot be obtained from a plurality of directions at the same time.
  • three-dimensional measurement data of the object is acquired by using a plurality of robots 2.
  • the operation of each part and the flow of data processing in the third embodiment will be described with reference to the flowchart of FIG. Descriptions of operations and processes common to those of the first embodiment or the second embodiment will be omitted as appropriate.
  • the connected motion space coordinate system ⁇ A and the reference motion space coordinate system ⁇ B are set.
  • the connected motion space coordinate system ⁇ A and the reference motion space coordinate system ⁇ C are set. That is, a connected motion space coordinate system common to the plurality of robots 2 is set, and a unique reference motion space coordinate system is set for each robot 2 (step 31; see FIG. 15).
  • the connected operating space coordinate system ⁇ A corresponds to the reference operating space coordinate system in the third and sixth aspects of the present invention.
  • the connected operating space coordinate system ⁇ A and the reference operating space coordinate system ⁇ B may be the same coordinate system.
  • step 32 the position and orientation of the three-dimensional sensor 4 in each operating space coordinate system are determined by the same procedure as in the first embodiment (step 32).
  • The homogeneous transformation matrices for converting the reference operating space coordinate systems Σ B and Σ C set for the robots 2a and 2b into the connected operating space coordinate system Σ A are obtained by the same procedure as in the second embodiment (step 33).
  • Next, three-dimensional measurement data of the object are acquired in the reference operating space coordinate systems Σ B and Σ C by the robots 2a and 2b, respectively (step 34). Then, the three-dimensional measurement data of the object in the reference operating space coordinate systems Σ B and Σ C are converted into three-dimensional measurement data in the connected operating space coordinate system Σ A by the homogeneous transformation matrices A T B and A T C, respectively (step 35; FIG. 15).
  • In this way, the three-dimensional measurement data of the object acquired by the three-dimensional sensors 4 attached to the different robots 2a and 2b can be integrated as three-dimensional measurement data of the object in the connected operating space coordinate system Σ A (step 36).
  • the integrated data is displayed on the screen of the display unit 66 (step 37).
  • the third embodiment since a plurality of robots 2a and 2b are used, it is possible to acquire the three-dimensional measurement data of the entire large object without being restricted by the length of the arm of the robot 2.
  • three-dimensional measurement data of the object can be obtained from a plurality of directions at the same time.
  • connection operation space coordinate system ⁇ A is set for the two robots 2, but when the object to be transported is measured by the three-dimensional sensors 4 attached to many robots 2, all the robots. It may be difficult to set the connection operation space coordinate system common to 2.
  • the connection operation space coordinate system common to 2.
  • FIG. 16 shows a robot system 300 of a modified example of the third embodiment.
  • the object 9 conveyed by the belt conveyor 8 is monitored.
  • A plurality of robots 2a, 2b, ... are arranged along the transport direction of the object 9 on one side of the belt conveyor 8, and a plurality of robots 2e, 2f, ... are arranged on the other side.
  • the connected operating space coordinate system ⁇ A is set for the robot 2a. Further, the connected operation space coordinate system ⁇ A and the reference operation space coordinate system ⁇ B are set in the robot 2b. For the robot 2c and below, the connected motion space coordinate system and the reference motion space coordinate system common to the adjacent robots 2 are set. Further, the connected operating space coordinate system ⁇ A and the reference operating space coordinate system ⁇ E are set in the robot 2e located on the opposite side of the robot 2a with the belt conveyor 8 in between. For the robot 2f and below, the connected motion space coordinate system and the reference motion space coordinate system common to the adjacent robots 2 are set.
  • These coordinate systems can be converted into one another using the corresponding homogeneous transformation matrices.
  • the three-dimensional measurement data acquired in the reference motion space coordinate system ⁇ F of the robot 2f is converted in order by two homogeneous transformation matrices E T F and A T E to obtain the reference motion space coordinates. It can be converted into 3D measurement data of system ⁇ A.
  • In this way, the three-dimensional measurement data of the object 9 acquired in all the reference operating space coordinate systems are converted into three-dimensional measurement data in the connected operating space coordinate system Σ A using one or more homogeneous transformation matrices and combined. This makes it possible to monitor, over a wide range, an object 9 that moves due to transportation or the like.
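  • Under the same assumptions as the earlier sketches (4 × 4 homogeneous matrices and point clouds as N × 3 arrays), converting data from a distant robot's reference operating space coordinate system into the connected operating space coordinate system Σ A is a matter of composing the matrices along the chain; the matrices below are placeholders.

    import numpy as np

    T_AE = np.eye(4)    # placeholder: Sigma_E -> Sigma_A
    T_EF = np.eye(4)    # placeholder: Sigma_F -> Sigma_E

    # Composing the chain gives the matrix that maps Sigma_F directly into Sigma_A.
    T_AF = T_AE @ T_EF

    def to_connected_frame(points_F, T_AF):
        # Convert an (N, 3) cloud measured in Sigma_F into the connected frame Sigma_A.
        pts = np.asarray(points_F, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])
        return (homog @ T_AF.T)[:, :3]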
  • In the above embodiments, the jig 5 was used to obtain the shift amount between the two coordinate systems, and the measuring instrument coordinate system was then shifted to match the operating space coordinate system, but these processes are not essential.
  • For example, instead of shifting the coordinate system, the process of adding (or subtracting) the above shift amount to the position acquired in the measuring instrument coordinate system to convert it into a position in the operating space coordinate system may be performed each time the robot is actually operated.
  • Alternatively, another jig may be placed, a predetermined position of that jig may be specified in each of the operating space coordinate system and the measuring instrument coordinate system, and the shift amount may be obtained from the difference between the coordinates of those positions.
  • The jig used to obtain the shift amount does not necessarily have to be a rectangular parallelepiped; a jig of any appropriate shape capable of defining the predetermined position can be used.
  • For example, when it is difficult to accurately measure the shape of the edge portions in acquiring the entire profile of the rectangular parallelepiped block 52, it is preferable to use a cylindrical block.
  • The jig used for calculating the shift amount is not limited to a block; any jig whose center coordinates on the X-Y plane and whose height from that center are known, and whose three-dimensional measurement data can be measured by the three-dimensional sensor 4, can be used as appropriate.
  • As described in the second and third embodiments, when integrating three-dimensional measurement data of an object obtained by the optical cutting method using a measuring instrument such as a three-dimensional sensor attached to a robot, the operator can integrate the three-dimensional measurement data acquired in different coordinate systems through a simple procedure, without the complicated calculations or programs required conventionally.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

Provided is a system for aligning a robot coordinate system with a measuring instrument coordinate system. The system is provided with: an operation space coordinate system setting unit 31 that determines a relationship between an operation space coordinate system and the robot coordinate system through teaching of the positions of a first reference point A positioned at the origin of the operation space coordinate system, and a second reference point B and a third reference point C on two lines through the first reference point; a projection line obtaining unit 432 that obtains a projection line by emitting slit light from a three-dimensional measuring instrument 4 toward a reference object 52 fixed, in such a way that all its sides are respectively parallel to the three orthogonal axes defining the operation space coordinate system, to a plate-like member 51 arranged parallel to the plane on which two of those three orthogonal axes lie; a measuring instrument attitude calculation unit 433 that obtains the attitude of the three-dimensional measuring instrument 4 relative to the reference object 52 on the basis of a profile of the projection line; and a measuring instrument moving unit 3 that moves the three-dimensional measuring instrument 4.

Description

Coordinate system alignment method, alignment system, and alignment device for robot

The present invention relates to a technique for aligning a coordinate system that serves as a reference for controlling the motion of a robot that performs a predetermined operation on an object with the coordinate system of a three-dimensional measuring instrument attached to and used with the robot.

On factory production lines, industrial robots are used to improve the efficiency of product assembly and processing. An industrial robot includes an arm whose tip can be moved three-dimensionally and a jig attached to that tip for performing a predetermined process on a workpiece. A three-dimensional measuring instrument is also attached to the tip of the arm; it measures the workpiece three-dimensionally, for example by the light-section method (optical cutting method), to determine its position and orientation.

An industrial robot has a coordinate system set in advance by the manufacturer, for example at the time of manufacture, and the motion of the arm is controlled on the basis of that coordinate system. The three-dimensional measuring instrument, on the other hand, outputs measurement results based on its own coordinate system. Therefore, in order to move the jig to the position of the processing target on the workpiece whose position and orientation have been determined by the three-dimensional measuring instrument, the coordinate system of the robot and the coordinate system of the measuring instrument must be made consistent (also referred to as alignment or calibration).

Conventionally, affine transformation by matrix calculation has been used to align the two coordinate systems (for example, Patent Document 1). In the alignment method described in Patent Document 1, the three-dimensional measuring instrument attached to the tip of the arm is moved to three positions that do not lie on the same straight line, and the same point on the object is measured at each position to obtain its coordinate values in the measuring instrument coordinate system. Matrices are then formed from the coordinate values of the three positions in the robot coordinate system and the coordinate values acquired at each position in the measuring instrument coordinate system, and the equations obtained from these matrices are solved to obtain a transformation matrix that converts the measuring instrument coordinate system into the robot coordinate system.
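
For orientation only, the sketch below shows one common way a rigid transformation could be estimated from point correspondences, here with a Kabsch-style least-squares fit. It is not the procedure of Patent Document 1 itself, and the sample coordinates are invented.

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= R @ src + t,
    from corresponding (N, 3) point sets, via a Kabsch-style SVD fit."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Invented correspondences: the same target point seen from three sensor poses,
# expressed in the measuring instrument frame (src) and the robot frame (dst).
src = np.array([[0.0, 0.0, 100.0], [10.0, 0.0, 100.0], [0.0, 10.0, 100.0]])
dst = np.array([[200.0, 50.0, 30.0], [200.0, 60.0, 30.0], [190.0, 50.0, 30.0]])
R, t = fit_rigid_transform(src, dst)
print(R, t)
```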

Patent Document 1: Japanese Unexamined Patent Publication No. H8-132373
Patent Document 2: Japanese Unexamined Patent Publication No. 2019-184340
Patent Document 3: Japanese Unexamined Patent Publication No. H7-239219
Patent Document 4: Japanese Unexamined Patent Publication No. 2010-91540

In the alignment method described in Patent Document 1, obtaining the transformation matrix requires expanding the matrix expressions corresponding to each of the three positions and solving complicated equations. Moreover, the development environment (programming language) for the programs that control robot motion differs from manufacturer to manufacturer. As a result, alignment has been difficult to perform unless carried out by an expert who understands the various programming languages and can perform the matrix calculations.

The problem to be solved by the present invention is to provide a technique that allows the alignment of a robot coordinate system to be performed easily, even by a non-expert.

A first aspect of the present invention, made to solve the above problem, is a method for aligning a robot coordinate system, which is the coordinate system of a robot for moving an operating point three-dimensionally, with a measuring instrument coordinate system, which is the coordinate system of a three-dimensional measuring instrument that can execute the light-section method and whose position and attitude relative to the operating point are invariant, the method comprising the steps of:
determining the relationship between an operating space coordinate system, which is the coordinate system of the space in which the operating point operates, and the robot coordinate system by moving the operating point to a first reference point located at the origin of the operating space coordinate system and to a second reference point and a third reference point located respectively on two straight lines that intersect orthogonally at the first reference point, and teaching the position of each reference point;
irradiating a rectangular parallelepiped reference object, fixed on the surface of a plate-shaped member arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system so that each of its sides is parallel to one of the three orthogonal axes, with sheet-shaped slit light from the three-dimensional measuring instrument, and acquiring the projection line cast on the reference object by the slit light;
obtaining the attitude of the three-dimensional measuring instrument relative to the reference object on the basis of the profile of the projection line; and
moving the three-dimensional measuring instrument so that its attitude falls within a predetermined reference attitude range.

The term "attitude" here means the inclination or orientation with respect to a reference such as an axis of a coordinate system or a predetermined plane. A three-dimensional measuring instrument capable of executing the light-section method is a measuring instrument that irradiates an object with slit light whose attitude with respect to the measuring instrument coordinate system is known, acquires data on the projection lines cast by the slit light on the surfaces of the object, and can thereby acquire information on the three-dimensional shape of the object. The operating space is the space in which the operating point of the robot moves.

In the alignment method according to the first aspect of the present invention, the operating point of the robot is first moved to the first reference point located at the origin of the operating space coordinate system and to the second and third reference points located on the two straight lines that intersect orthogonally at the first reference point, and the position of each reference point is taught, thereby determining the relationship between the robot coordinate system and the operating space coordinate system. That is, the first axis of the operating space coordinate system is defined by the first and second reference points, the second axis is defined by the first and third reference points, and the third axis is defined as the axis orthogonal to these two.
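
A minimal sketch of how the three taught points could define such a frame, assuming the taught coordinates are available as vectors in the robot coordinate system (the point values below are invented):

```python
import numpy as np

def frame_from_reference_points(a, b, c):
    """Build an orthonormal operating space frame from taught points:
    a = origin, b = a point on the first axis, c = a point on the second axis.
    Returns the origin and a 3x3 matrix whose columns are the axis directions."""
    x = (b - a) / np.linalg.norm(b - a)          # first axis
    y_raw = c - a
    y = y_raw - (y_raw @ x) * x                  # remove any small non-orthogonality
    y = y / np.linalg.norm(y)                    # second axis
    z = np.cross(x, y)                           # third axis, orthogonal to both
    return a, np.column_stack([x, y, z])

# Invented taught positions in the robot coordinate system (mm).
A = np.array([300.0, 100.0, 0.0])
B = np.array([400.0, 100.0, 0.0])
C = np.array([300.0, 250.0, 0.0])
origin, axes = frame_from_reference_points(A, B, C)
print(origin, axes)
```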

Next, sheet-shaped slit light is emitted from the three-dimensional measuring instrument toward the rectangular parallelepiped reference object, which is fixed on the surface of the plate-shaped member arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system, with each of its sides parallel to one of those axes, and the projection lines cast by the slit light on the surfaces of the reference object are acquired. The attitude of the measuring instrument relative to the reference object is then obtained on the basis of the profile of the acquired projection lines. This profile includes the length of the projection line on the upper surface of the reference object, the distance between the projection line on the upper surface of the reference object and the projection line on the surface of the plate-shaped member (the surface on which the reference object is placed), and the inclination of those projection lines with respect to the coordinate axes of the measuring instrument. The measuring instrument is then moved so that its attitude falls within a predetermined reference attitude range. A specific method for obtaining the attitude of the measuring instrument from the profile of the projection lines is described later.

Here, for ease of understanding, consider a specific example in which the reference object is arranged on the plane containing two axes of the operating space coordinate system (call them the X axis and Y axis) so that it is also parallel to the Z axis, the sheet-shaped slit light has its width in the X' axis direction of the measuring instrument coordinate system and is emitted in the Z' axis direction, and the reference attitude range is set so that the attitude of the measuring instrument is substantially parallel to the reference object (that is, in this example, the three axes of the robot coordinate system and the measuring instrument coordinate system are substantially parallel to each other). In this case, if the three axes of the measuring instrument coordinate system are parallel to the three axes of the operating space coordinate system (and therefore parallel to the three sides of the reference object), the length of the slit light projected on the upper surface of the reference object coincides with the width of that upper surface. The distance between the projection line on the upper surface of the reference object and the projection line on the surface of the plate-shaped member (the surface on which the reference object is placed) takes a predetermined value determined by the height of the reference object. Furthermore, these projection lines are parallel to the X' axis of the measuring instrument.

On the other hand, the more the measuring instrument coordinate system is rotated about the Z axis of the operating space coordinate system, the longer the slit light projected on the upper surface of the reference object becomes. If the measuring instrument coordinate system is tilted relative to the operating space coordinate system about one axis in that plane (call it the Y axis), the inclination of the projection lines on the upper surface of the reference object and on the surface of the plate-shaped member (the surface to which the reference object is fixed) with respect to the X' axis of the measuring instrument becomes larger, and a projection line of the slit light may also appear on a side surface of the reference object. Furthermore, the more the measuring instrument coordinate system is tilted relative to the operating space coordinate system about the other horizontal axis (call it the X axis), the longer the distance between the projection line on the upper surface of the reference object and the projection line on the surface of the plate-shaped member becomes. From this information, the attitude of the measuring instrument relative to the reference object (in this example, the inclination of the axes of the measuring instrument coordinate system with respect to each of the three axes of the operating space coordinate system) can be obtained.

If any of these inclinations is larger than a predetermined threshold, the measuring instrument is moved so that all of them fall below the threshold. This can be done, for example, by the user operating the controller while visually checking the attitude of the measuring instrument relative to the reference object, moving the instrument so that it becomes parallel to the reference object, and confirming at that position that all the inclinations are below the threshold. In this way, the attitude of the measuring instrument can be brought within the reference attitude range (in this example, the three axes of the measuring instrument coordinate system become parallel to the directions in which the three sides of the reference object extend).

As described above, the alignment method according to the first aspect of the present invention requires no matrix calculations using a programming language, so even a non-expert can easily align the robot coordinate system.

In the above alignment method, for example, the attitude of the three-dimensional measuring instrument is determined so that it assumes a predetermined reference attitude with respect to a rectangular parallelepiped reference object arranged with each side parallel to one of the three orthogonal axes of a single operating space coordinate system, and the object is measured three-dimensionally at that position and attitude. In other words, the above alignment method determines one position and attitude of the measuring instrument for one operating space coordinate system and acquires the three-dimensional measurement data of the object at that position and attitude. Objects that are assembled or processed on factory production lines, however, usually have a three-dimensional shape, so if the measurement data is acquired from only one position and attitude, some portions are shadowed and no shape information can be obtained for them. It is therefore desirable to acquire information on the shape of the object from a plurality of directions.

Therefore, the alignment method according to a second aspect of the present invention is preferably configured, in the above first aspect, to:
set a base operating space coordinate system and a reference operating space coordinate system as the operating space coordinate system;
move the operating point between the origin of the base operating space coordinate system and the origin of the reference operating space coordinate system to acquire the translational displacement and rotational displacement;
obtain, on the basis of the translational displacement and rotational displacement, a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system;
in each of the base operating space coordinate system and the reference operating space coordinate system, move the three-dimensional measuring instrument so that its attitude falls within the predetermined reference attitude range and acquire three-dimensional measurement data of the object; and
convert the three-dimensional measurement data of the object acquired in the reference operating space coordinate system into three-dimensional measurement data in the base operating space coordinate system using the transformation matrix and integrate it with the three-dimensional measurement data of the object acquired in the base operating space coordinate system.

In the alignment method of the second aspect, a base operating space coordinate system and a reference operating space coordinate system are set. The operating point of the robot is moved between the origins of the two coordinate systems, the translational displacement and rotational displacement caused by the movement are acquired, and a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system is obtained from these displacements. Three-dimensional measurement data of the object is then acquired with the measuring instrument in each of the base and reference operating space coordinate systems, and the data acquired in the reference operating space coordinate system is converted into data in the base operating space coordinate system using the transformation matrix and integrated. With this alignment method, the object is measured three-dimensionally from a plurality of positions and attitudes, so shape information of the object without shadowed portions can be obtained.
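
A hedged sketch of this step follows: building a homogeneous transformation matrix from a translational displacement and a rotational displacement (here assumed to be given as Z-Y-X Euler angles, which is an assumption, since robot controllers report rotation in various conventions) and using it to merge two point clouds.

```python
import numpy as np

def transform_from_displacement(translation, rz, ry, rx):
    """4x4 homogeneous matrix from a translation vector and Z-Y-X Euler angles
    (radians) describing how the reference frame is displaced from the base frame."""
    cz, sz = np.cos(rz), np.sin(rz)
    cy, sy = np.cos(ry), np.sin(ry)
    cx, sx = np.cos(rx), np.sin(rx)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = translation
    return T

def integrate(points_base, points_ref, T_base_from_ref):
    """Convert the reference-frame cloud into the base frame and stack the clouds."""
    homog = np.hstack([points_ref, np.ones((len(points_ref), 1))])
    converted = (T_base_from_ref @ homog.T).T[:, :3]
    return np.vstack([points_base, converted])

# Invented displacement and data.
T = transform_from_displacement([600.0, 0.0, 0.0], np.deg2rad(180), 0.0, 0.0)
cloud_base = np.array([[100.0, 0.0, 50.0]])
cloud_ref = np.array([[100.0, 0.0, 50.0]])
print(integrate(cloud_base, cloud_ref, T))
```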

The movable range of the operating point of a robot is determined by the length of the arm. If the object is large relative to the robot, the arm of a single robot may not reach the far side of the object, and it may not be possible to acquire three-dimensional measurement data for part of the object.

Therefore, the alignment method according to a third aspect of the present invention may be configured, in the above second aspect, to:
set, for each of a plurality of robots, a plurality of operating space coordinate systems, one of which is shared with an operating space coordinate system set for at least one other robot;
designate one of all the operating space coordinate systems set for the plurality of robots as the base operating space coordinate system and the others as reference operating space coordinate systems;
create transformation matrices that convert the reference operating space coordinate systems, among the coordinate systems set for the plurality of robots, into the base operating space coordinate system;
acquire three-dimensional measurement data of the object in each of the plurality of operating space coordinate systems set for each of the plurality of robots; and
convert the three-dimensional measurement data of the object acquired in the reference operating space coordinate systems into three-dimensional measurement data in the base operating space coordinate system and integrate it.

In the alignment method of the third aspect, three-dimensional measurement data of the object is acquired using a plurality of robots, so three-dimensional shape information without shadowed portions can be obtained even when the object is large relative to the robots.
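
One way to picture the role of the shared operating space coordinate system: if the transform from a shared frame into the base frame ΣA is known on one robot's side, and another robot knows the transform from its own frame into that same shared frame, the two can be composed so that the second robot's data also lands in ΣA. The sketch below only illustrates this composition under simplifying assumptions (pure translations, invented frame names and values).

```python
import numpy as np

def translation(dx, dy, dz):
    """Homogeneous matrix for a pure translation (rotations omitted for brevity)."""
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

# Transform from the shared operating space frame into the base frame SigmaA.
T_A_from_shared = translation(800.0, 0.0, 0.0)
# Transform from the second robot's own reference frame into the shared frame.
T_shared_from_rob2 = translation(0.0, 500.0, 0.0)

# Composition: data measured in the second robot's frame can be expressed in SigmaA.
T_A_from_rob2 = T_A_from_shared @ T_shared_from_rob2

point_rob2 = np.array([10.0, 20.0, 30.0, 1.0])   # homogeneous point from robot 2
print((T_A_from_rob2 @ point_rob2)[:3])
```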

A fourth aspect of the present invention, made to solve the above problem, is a system for aligning a robot coordinate system, which is the coordinate system of a robot for moving an operating point three-dimensionally, with a measuring instrument coordinate system, which is the coordinate system of a three-dimensional measuring instrument that can execute the light-section method and whose position and attitude relative to the operating point are invariant, the system comprising:
an operating space coordinate system setting unit that determines the relationship between an operating space coordinate system, which is the coordinate system of the space in which the operating point operates, and the robot coordinate system by moving the operating point to a first reference point located at the origin of the operating space coordinate system and to a second reference point and a third reference point located respectively on two straight lines that pass through and are orthogonal at the first reference point, and teaching the position of each reference point;
a projection line acquisition unit that irradiates a rectangular parallelepiped reference object, fixed on the surface of a plate-shaped member arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system so that each of its sides is parallel to one of the three orthogonal axes, with sheet-shaped slit light from the three-dimensional measuring instrument, and acquires the projection line cast on the reference object by the slit light;
a measuring instrument attitude calculation unit that obtains the attitude of the three-dimensional measuring instrument relative to the reference object on the basis of the profile of the projection line; and
a measuring instrument moving unit that moves the three-dimensional measuring instrument.

In the alignment system according to a fifth aspect of the present invention, in the above fourth aspect,
the operating space coordinate system setting unit sets a base operating space coordinate system and a reference operating space coordinate system as the operating space coordinate system, and
the system may further comprise:
a displacement acquisition unit that moves the operating point between the origin of the base operating space coordinate system and the origin of the reference operating space coordinate system to acquire the translational displacement and rotational displacement;
a transformation matrix calculation unit that obtains, on the basis of the translational displacement and rotational displacement, a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system;
a three-dimensional measurement data acquisition unit that, in each of the base operating space coordinate system and the reference operating space coordinate system, causes the measuring instrument moving unit to move the three-dimensional measuring instrument so that its attitude falls within the predetermined reference attitude range, and acquires three-dimensional measurement data of the object; and
a three-dimensional measurement data integration unit that converts the three-dimensional measurement data of the object acquired in the reference operating space coordinate system into three-dimensional measurement data in the base operating space coordinate system using the transformation matrix and integrates it with the three-dimensional measurement data of the object acquired in the base operating space coordinate system.

The alignment system according to a sixth aspect of the present invention may, in the above fifth aspect, further comprise a plurality of robots, wherein:
the operating space coordinate system setting unit sets, for each of the plurality of robots, a plurality of operating space coordinate systems, one of which is shared with an operating space coordinate system set for at least one other robot, and designates one of the operating space coordinate systems set for the plurality of robots as the base operating space coordinate system and the others as reference operating space coordinate systems;
the transformation matrix calculation unit creates transformation matrices that convert the reference operating space coordinate systems, among the coordinate systems set for the plurality of robots, into the base operating space coordinate system;
the three-dimensional measurement data integration unit acquires three-dimensional measurement data of the object in each of the plurality of operating space coordinate systems set for each of the plurality of robots; and
the three-dimensional measurement data integration unit converts the three-dimensional measurement data of the object acquired in the reference operating space coordinate systems into three-dimensional measurement data in the base operating space coordinate system and integrates it.

A seventh aspect of the present invention, made to solve the above problem, is a device used for aligning an operating space coordinate system, which is the coordinate system of the space in which an operating point operates and which is associated in advance with a robot coordinate system, that is, the coordinate system of a robot for moving the operating point three-dimensionally, with a measuring instrument coordinate system, which is the coordinate system of a three-dimensional measuring instrument that can execute the light-section method and whose position and attitude relative to the operating point are invariant, the device comprising:
a projection line acquisition unit that irradiates a rectangular parallelepiped reference object, fixed on the surface of a plate-shaped member arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system so that each of its sides is parallel to one of the three orthogonal axes, with sheet-shaped slit light from the three-dimensional measuring instrument, and acquires the projection line cast on the reference object by the slit light; and
a measuring instrument attitude calculation unit that obtains the attitude of the three-dimensional measuring instrument relative to the reference object on the basis of the profile of the projection line.

By using the coordinate system alignment method, alignment system, or alignment device for a robot according to the present invention, the robot coordinate system can be aligned easily even by a non-expert.

FIG. 1 is a schematic configuration diagram of a robot system including a first embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
FIG. 2 is a diagram explaining the configuration of the three-dimensional sensor of the first embodiment.
FIG. 3 is a diagram explaining the arrangement of the slit light irradiation unit and the area sensor of the three-dimensional sensor.
FIG. 4 is a flowchart of a first embodiment of the coordinate system alignment method for a robot according to the present invention.
FIG. 5 is a diagram explaining the alignment jig used in the first embodiment.
FIG. 6 is a diagram explaining the projection lines cast on the block by the slit light in the first embodiment.
FIG. 7 is an example of the projection line profile obtained in the first embodiment.
FIG. 8 is a schematic configuration diagram of a robot system including a second embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
FIG. 9 is a diagram explaining the configuration of the control/processing units in the robot system of the second embodiment.
FIG. 10 is a flowchart of a second embodiment of the coordinate system alignment method for a robot according to the present invention.
FIG. 11 is a diagram explaining the relationship between the operating space coordinate systems in the second embodiment.
FIG. 12 shows the result of integrating the three-dimensional measurement data of an object using the robot system of the second embodiment.
FIG. 13 is a schematic configuration diagram of a robot system including a third embodiment of the coordinate system alignment device and alignment system for a robot according to the present invention.
FIG. 14 is a flowchart of a third embodiment of the coordinate system alignment method for a robot according to the present invention.
FIG. 15 is a diagram explaining the relationship between the operating space coordinate systems in the third embodiment.
FIG. 16 is a diagram explaining the arrangement of the robots and the relationship between the operating space coordinate systems in a modification of the third embodiment.

<First Embodiment>
 A first embodiment of the coordinate system alignment method, alignment system, and alignment device for a robot according to the present invention is described below with reference to the drawings. FIG. 1 is a configuration diagram of the main parts of a robot system 1 including the coordinate system alignment system and device for a robot of the first embodiment (hereinafter referred to as the "alignment system" and the "alignment device").

The robot system 1 of the first embodiment includes a robot 2 and a controller 3 for operating the robot 2.

The robot 2 includes an arm portion 21 with an operating point 20 at its tip, and linear motion mechanisms 22a to 22c and rotation mechanisms 23a to 23c for moving the arm portion 21 three-dimensionally. The linear motion mechanisms 22a to 22c and the rotation mechanisms 23a to 23c operate under the control of the controller 3. The controller 3 has, as a functional block, an operating space coordinate system setting unit 31 for determining the relationship between the operating space coordinate system and the robot coordinate system by the teaching described later.

A three-dimensional sensor 4 is detachably attached near the operating point 20 of the arm portion 21. As shown in FIG. 2, the three-dimensional sensor 4 includes a slit light irradiation unit 41 that irradiates an object with slit light and an area sensor 42 that captures the projection line of the slit light on the object. A measuring instrument coordinate system is defined for these components; as shown in FIG. 3, the slit light irradiation unit 41 is arranged at an angle θ1 with respect to the X'-Z' plane of the measuring instrument coordinate system, and the area sensor 42 is arranged on the opposite side of the X'-Z' plane from the slit light irradiation unit 41, inclined at an angle θ2 with respect to the same plane. In the first embodiment θ1 is 0, but this is not an essential requirement of the present invention; as long as the angle θ1 is known, how the projection line on the object is captured by the area sensor 42 can be determined, and the processing described below can be performed by appropriate calculations based on that known angle.

In the first embodiment, the slit light irradiation unit 41 is arranged on the X'-Z' plane and irradiates the object with slit light along the X'-Z' plane (slit light having its width in the X' axis direction, emitted toward the Z' axis direction). θ2 is set to an appropriate angle, for example in the range of 30° to 40°. In the first embodiment, the three-dimensional sensor 4 used is one that corrects the aberrations caused by the lenses included in its optical system and depth aberrations (perspective aberrations that occur within the angle of view when the camera is directed obliquely). As a result, accurate measurement is performed in the light-section method described later, and the alignment accuracy of the coordinate systems is improved.

The three-dimensional sensor 4 is further provided with a control/processing unit 43. In addition to a storage unit 431, the control/processing unit 43 includes, as functional blocks, a projection line acquisition unit 432, a measuring instrument attitude calculation unit 433, and a coordinate system shift unit 434. The control/processing unit 43 of the first embodiment is configured as an arithmetic processing mechanism built into the three-dimensional sensor 4. The control/processing unit 43 is connected, via a predetermined communication interface 44, to an input unit 45 with which the user gives appropriate input instructions and to a display unit 46 for displaying measurement results and the like. The control/processing unit 43 can also be configured as, for example, a portable terminal provided separately from the three-dimensional sensor 4 and capable of communicating with it.

Next, the procedure of the alignment method of the first embodiment is described with reference to the flowchart of FIG. 4.

First, a jig for alignment is prepared. The alignment jig 5 includes a plate-shaped member 51 and a block 52 fixed on the plate-shaped member 51. As shown in FIG. 5, the plate-shaped member 51 is fixed so that its upper surface coincides with the plane containing two axes (the X axis and Y axis) of the operating space coordinate system, which is the coordinate system of the space in which the operating point 20 of the robot 2 is operated. On the upper surface of the plate-shaped member 51, a reference point A is set at the position corresponding to the origin of the operating space coordinate system, a reference point B is set at a predetermined position on the X axis of the operating space coordinate system, and a reference point C is set at a predetermined position on the Y axis of the operating space coordinate system. That is, the reference points A and B define the X axis direction of the operating space coordinate system, and the reference points A and C define its Y axis.

The block 52 has a length of W (mm) in the X axis direction, L (mm) in the Y axis direction, and H (mm) in the Z axis direction. The length of the block 52 in the X axis direction is preferably as long as possible within the field of view of the area sensor 42. This lengthens the projection line of the slit light cast on the upper surface of the block 52 during the processing described later and improves the accuracy with which the attitude of the three-dimensional sensor 4 relative to the block 52 is obtained.

On the other hand, the length of the block 52 in the Y axis direction is, for example, 10 to 100 mm, and may be determined as appropriate in consideration of the size of the robot and so on. Making the block 52 longer in the Y axis direction makes it easier to see whether the projection line of the slit light is parallel to one side of the block 52 (the side parallel to the X axis of the operating space coordinate system) when the block 52 is irradiated with the slit light. If the block 52 is made too short in the Y axis direction, the projection line of the slit light crosses that side of the block 52, making it difficult to obtain a projection line that traverses the upper surface of the block 52.

In conventional coordinate system alignment methods, a jig with as sharp a tip as possible was sometimes used in order to position the operating point accurately, but a sharp-tipped jig tends to cause diffuse reflection of light, which can make it difficult to capture images for confirming the positional relationship between the operating point and the tip of the jig. In contrast, the first embodiment performs coordinate system alignment by the method and device described below, so there is no need to use a sharp-tipped jig; the rectangular parallelepiped block 52 can be used instead, diffuse reflection does not occur, and the projection lines described later are captured easily and accurately.

The surface of the block 52 is matte-finished to suppress irregular reflection of light. For example, a block made of matte anodized aluminum can be suitably used as the block 52. The block 52 is arranged parallel to the three axes of the operating space coordinate system so that the center of its bottom surface is located at predetermined coordinates in the operating space coordinate system. The alignment jig is thus installed (step 1; see FIG. 5).

Next, the arm portion 21 is moved by the controller 3 to bring the operating point 20 to the reference point A, and the position of the reference point A is taught to the robot 2. Similarly, the operating point 20 is moved to the reference points B and C, and their positions are taught to the robot 2 (step 2). At this time, the user may operate the controller 3 directly, or the operating space coordinate system setting unit 31 may move the operating point 20 to each reference point. The operating space coordinate system setting unit 31 then registers the coordinates of the reference points A, B, and C in the operating space coordinate system in the robot 2 and determines the relationship between the robot coordinate system registered in the robot 2 in advance (for example, at shipment) and the operating space coordinate system. In this way, the operating space coordinate system is set in the robot 2 (step 3).

After setting the operating space coordinate system in the robot 2, the arm portion 21 is moved by the controller 3 so that the three-dimensional sensor 4 is positioned above the block 52, as shown in FIG. 5 (step 4). In FIG. 5, parts of the robot 2 other than the arm portion 21 are omitted. Next, the slit light irradiation unit 41 irradiates the block 52 with slit light (step 5), and the area sensor 42 acquires the projection lines on the block 52 (step 6).

The slit light in the first embodiment is light having a width in the direction of one axis (the X' axis) of the measuring instrument coordinate system, which serves as the reference for the output signals of the three-dimensional sensor 4, and is emitted in the direction of another axis (the Z' axis) of that coordinate system; that is, it is sheet-shaped slit light along the X'-Z' plane of the measuring instrument coordinate system (see FIG. 6). The three-dimensional sensor 4 of the first embodiment corrects the aberrations caused by the lenses included in its optical system and depth aberrations (perspective aberrations that occur within the angle of view when the camera is directed obliquely). Specifically, it identifies at which positions of the two-dimensionally arrayed pixels the slit light is incident, then corrects those positions by calibration, and acquires the profile of the projection line from the corrected pixel positions.

As described above, the three sides of the block 52 are arranged parallel to the three axes of the operating space coordinate system, and the slit light is sheet-shaped light along the X'-Z' plane. Therefore, if the X, Y, and Z axes of the operating space coordinate system are parallel to the X', Y', and Z' axes of the measuring instrument coordinate system, respectively, the projection line on the upper surface of the block 52 has the same length W as the side of the block 52 parallel to the X axis. The distance between the projection line on the upper surface of the block 52 and the projection line on the plate-shaped member 51 corresponds to the height H of the block 52 (and to the angle at which the area sensor 42 of the three-dimensional sensor 4 views the block and the plate-shaped member 51).

On the other hand, when the measuring instrument coordinate system is tilted with respect to the operating space coordinate system, projection lines with lengths and distances different from the above appear. The projection lines that appear in such a case are described with reference to FIGS. 6 and 7. FIG. 6 shows a state in which the slit light is emitted obliquely with respect to the jig 5 (the plate-shaped member 51 and the block 52). In FIG. 6, the angles α, β, and γ are drawn larger than they actually are so that they are easy to see. FIG. 7 is a display example created on the basis of the projection lines measured by the area sensor 42.

In the first embodiment, the area sensor 42 is arranged at a position inclined by the angle θ2 with respect to the X'-Z' plane and photographs the projection lines from obliquely above. Therefore, the distance between the projection line on the upper surface of the block 52 and the projection line on the plate-shaped member 51 as captured by the area sensor 42 is shorter than the actual distance. The projection line acquisition unit 432 therefore corrects this distance on the basis of the arrangement of the slit light irradiation unit 41 and the area sensor 42 (the angles θ1 and θ2 with respect to the X'-Z' plane) and the actually measured distance, to obtain the measured value h. The length of the projection line on the upper surface of the block 52 is used as the measured value w as it is. The measured values w and h are then displayed on the display unit 46 as shown in FIG. 7.
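
The patent does not spell out the correction formula, so the following is only a rough sketch under an assumed simplified model: the separation between the two projection lines is taken to appear foreshortened by sin(θ2) when viewed by the obliquely mounted area sensor, with θ1 assumed to be 0 as in this embodiment. The function name and the model itself are illustrative assumptions, not the sensor's actual calibration.

```python
import math

def corrected_separation(apparent_separation_mm, theta2_deg):
    """Assumed model: the physical distance between the projection line on the
    block top and the one on the plate appears shortened by sin(theta2) in the
    area sensor image; undo that foreshortening to obtain the measured value h."""
    return apparent_separation_mm / math.sin(math.radians(theta2_deg))

# Illustrative numbers: an apparent separation of 29 mm seen at theta2 = 35 degrees.
h = corrected_separation(29.0, 35.0)
w = 100.2  # the top-face projection line length is used as measured (value invented)
print(round(h, 2), w)
```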

Here, as shown in FIGS. 6 and 7, the following angles are defined: α is the angle between a straight line that passes through one end of the projection line on the upper surface of the block 52 (the bending point of the projection line when the projection line on a side surface of the block 52 is also captured) and is parallel to the Z' axis of the measuring instrument coordinate system, and a straight line that passes through that end and is parallel to the Z axis of the operating space coordinate system; β is the angle between one direction (the horizontal direction) of the angle of view of the area sensor 42 and the projection line captured by the area sensor 42; and γ is the angle between the projection line on the upper surface of the block 52 and a straight line that passes through one end of that projection line and is parallel to the X axis of the operating space coordinate system.

With the angles α, β, and γ defined as above, α and γ are obtained from the length W of the block 52 in the X axis direction, its length H in the Z axis direction, and the measured values w and h, by the following equations:
 α = cos⁻¹(H/h) … (1)
 γ = cos⁻¹(W/w) … (2)
The angle β is obtained, as described above, as the angle between the horizontal direction of the angle of view of the area sensor 42 and the projection line on the plate-shaped member 51.
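
A minimal sketch of this calculation and of the subsequent threshold check (step 8), assuming W, H, w, and h are available in consistent units and that β is measured separately from the image. The 0.2° threshold and variable names follow the text; everything else is illustrative.

```python
import math

def attitude_angles(W, H, w, h):
    """Angles alpha and gamma (degrees) from the block dimensions W, H and the
    measured projection-line values w, h, per equations (1) and (2)."""
    alpha = math.degrees(math.acos(H / h))   # tilt seen in the height direction
    gamma = math.degrees(math.acos(W / w))   # rotation seen along the top edge
    return alpha, gamma

def within_reference_attitude(alpha, beta, gamma, threshold_deg=0.2):
    """Step 8: true when all three angles are at or below the threshold."""
    return max(alpha, beta, gamma) <= threshold_deg

# Illustrative values: a 100 mm wide, 50 mm tall block, with measurements
# slightly longer than the block dimensions because of a residual tilt.
alpha, gamma = attitude_angles(W=100.0, H=50.0, w=100.3, h=50.2)
beta = 0.15  # assumed to come from the image of the line on the plate member
print(alpha, gamma, within_reference_attitude(alpha, beta, gamma))
```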

The measuring instrument attitude calculation unit 433 calculates the angles α, β, and γ from the profile of the projection lines as described above (step 7) and displays each angle on the screen of the display unit 46.

After calculating the angles α, β, and γ, the measuring instrument attitude calculation unit 433 determines whether all of these values are equal to or less than a predetermined value (step 8). This predetermined value is set to a value at which each axis of the operating space coordinate system and each axis of the measuring instrument coordinate system can be regarded as substantially parallel to each other, for example 0.2 degrees. If the angles α, β, and γ are all equal to or less than the predetermined value (YES in step 8), the process proceeds to step 10 described later.

On the other hand, if any of the angles exceeds the predetermined value (NO in step 8), the user adjusts the attitude of the three-dimensional sensor 4 using the controller 3 (step 9). The process then returns to step 5: the block 52 is again irradiated with slit light, the angles α, β, and γ are calculated and displayed on the display unit 46, and it is determined whether they are all equal to or less than the predetermined value (steps 5 to 8). These steps are repeated until the angles α, β, and γ are all equal to or less than the predetermined value. In the first embodiment, the values of α, β, and γ are displayed on the display unit 46 each time they are calculated, so the user can check them and intuitively grasp how far the measuring instrument coordinates deviate from the operating space coordinates. While watching how the values change, the user can also judge how closely the two coordinate systems need to be matched, for example by considering the accuracy required for the work to be performed with the robot 2.

When the angles α, β, and γ are all equal to or less than the predetermined value (YES in step 8), the coordinate system shift unit 434 scans the slit light from the three-dimensional sensor in the Y' axis direction (substantially the same as the Y axis, since the operating space coordinate system and the measuring instrument coordinate system are now parallel to each other) to acquire three-dimensional measurement data of the entire block 52, and obtains the coordinate position, in the measuring instrument coordinate system, of a predetermined position of the block 52 (for example, the center of its bottom surface, which lies on the X-Y plane of the operating space coordinate system) (step 10). The difference (shift amount) is then obtained by comparing this with the coordinate position of that predetermined position of the block 52 in the operating space coordinate system (step 11). Finally, the measuring instrument coordinate system is shifted by the shift amount, thereby bringing the measuring instrument coordinate system into coincidence with the operating space coordinate system.
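
A rough sketch of steps 10 and 11, under the assumption that after the attitude adjustment the two coordinate systems differ only by a translation; the way the block center is estimated from the scanned point cloud is a simplified stand-in for whatever feature extraction the actual sensor performs, and all values are invented.

```python
import numpy as np

def block_bottom_center(scan_points, block_height):
    """Crude estimate of the block's bottom-face center from a scanned
    (N, 3) cloud of its top face: average the top-face points and subtract
    the known block height along Z."""
    top = scan_points.mean(axis=0)
    return top - np.array([0.0, 0.0, block_height])

def shift_amount(center_in_instrument, center_in_operating_space):
    """Step 11: translation that maps measuring instrument coordinates onto
    operating space coordinates."""
    return center_in_operating_space - center_in_instrument

def to_operating_space(point_in_instrument, shift):
    """Apply the shift to any subsequently measured point."""
    return point_in_instrument + shift

# Invented scan of the block's top face and the known jig placement from step 1.
scan = np.array([[12.0, 4.0, 50.0], [18.0, 4.0, 50.0],
                 [12.0, 9.0, 50.0], [18.0, 9.0, 50.0]])
center_meas = block_bottom_center(scan, block_height=50.0)
center_known = np.array([100.0, 200.0, 0.0])
shift = shift_amount(center_meas, center_known)
print(shift, to_operating_space(np.array([20.0, 5.0, 30.0]), shift))
```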

<Second Embodiment>
 Next, the second embodiment will be described. As shown in FIG. 8, the robot system 100 of the second embodiment includes a control/processing device 6 in addition to the configuration of the alignment system 1 of the first embodiment. The control/processing device 6 is provided so as to be able to communicate with the robot 2, the three-dimensional sensor 4, and the controller 3.

 As shown in FIG. 9, the control/processing device 6 includes, in addition to a storage unit 61, a displacement acquisition unit 621, a transformation matrix creation unit 622, a three-dimensional measurement data acquisition unit 623, and a three-dimensional measurement data integration unit 624 as functional blocks. The control/processing device 6 is embodied by, for example, a general personal computer, and each of these functional blocks is realized by executing a pre-installed three-dimensional measurement data processing program 62. Although the configuration described here includes both the control/processing unit 43 of the three-dimensional sensor 4 and the control/processing device 6, part or all of the functions of the control/processing unit 43 of the three-dimensional sensor 4 may instead be incorporated into the control/processing device 6.

 The operation of each part and the flow of data processing in the second embodiment will be described with reference to the flowchart of FIG. 10. Description of operations and processes common to the first embodiment is omitted as appropriate.

 In the first embodiment, one position and posture of the three-dimensional sensor 4 are determined for one operating space coordinate system, and three-dimensional measurement data of the object is acquired at that position and posture. Therefore, when the object has a three-dimensional shape, the portion of the object on the side opposite the three-dimensional sensor 4 is shadowed and cannot be measured three-dimensionally.

 In the second embodiment, therefore, the user first sets a plurality of operating space coordinate systems corresponding to postures of the robot 2, following the same procedure as in the first embodiment. These operating space coordinate systems are set so that the portions that would be shadowed in the three-dimensional measurement of the first embodiment can also be measured. One of these operating space coordinate systems is then designated the base operating space coordinate system, and the others are designated reference operating space coordinate systems (step 21). The following description takes as an example the case where two reference operating space coordinate systems (a first reference operating space coordinate system and a second reference operating space coordinate system) are set. Hereinafter, the base operating space coordinate system is denoted ΣA, the first reference operating space coordinate system ΣB, and the second reference operating space coordinate system ΣC. The number of reference operating space coordinate systems may be one, or may be three or more.

 Next, the position and posture of the three-dimensional sensor 4 are determined for each of the base operating space coordinate system ΣA, the first reference operating space coordinate system ΣB, and the second reference operating space coordinate system ΣC, following the same procedure as in the first embodiment (step 22).

 For a point in the three-dimensional space in which the base operating space coordinate system and the reference operating space coordinate systems are set, the relationship between the position vector Br of the point expressed in the coordinate system ΣB and its position vector Ar expressed in the coordinate system ΣA is given by the following equation.

${}^{A}\boldsymbol{r} = {}^{A}T_{B}\,{}^{B}\boldsymbol{r}$

 Here, ATB is a homogeneous transformation matrix, which can be written as the following matrix containing the rotational transformation BRA and the translation BPA as elements.

${}^{A}T_{B} = \begin{bmatrix} {}^{B}R_{A} & {}^{B}P_{A} \\ \mathbf{0}^{T} & 1 \end{bmatrix}$
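 To make the role of the homogeneous transformation matrix concrete, the sketch below assembles a 4×4 matrix from a rotation matrix and a translation vector and applies it to a point expressed in ΣB to obtain its coordinates in ΣA. This is a generic illustration in Python/NumPy under the usual homogeneous-coordinate convention, not code from the disclosure; the function names and example values are assumptions.

```python
import numpy as np

def homogeneous_transform(rotation_3x3, translation_3):
    """Assemble a 4x4 homogeneous transformation matrix from R and p."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = translation_3
    return T

def transform_point(T, point_3):
    """Map a 3D point with the homogeneous transform T."""
    p_h = np.append(point_3, 1.0)   # homogeneous coordinates [x, y, z, 1]
    return (T @ p_h)[:3]

# Example: Sigma_B is rotated 90 degrees about Z and offset by (100, 0, 50)
# with respect to Sigma_A (assumed values).
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
A_T_B = homogeneous_transform(R, [100.0, 0.0, 50.0])

point_in_B = np.array([10.0, 0.0, 0.0])
print(transform_point(A_T_B, point_in_B))  # coordinates of the same point in Sigma_A
```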

 The matrix elements of this homogeneous transformation matrix can be obtained by determining the position and posture of the origin of the coordinate system ΣB in the coordinate system ΣA.

 Specifically, the displacement acquisition unit 621 sets the operating point 20 of the robot 2 at the origin of the base operating space coordinate system ΣA (xA = 0, yA = 0, zA = 0, rxA = 0, ryA = 0, rzA = 0) and then moves the operating point 20 to the origin of the reference operating space coordinate system ΣB (xB = 0, yB = 0, zB = 0, rxB = 0, ryB = 0, rzB = 0) (step 23). The translational displacement and rotational displacement in the base operating space coordinate system ΣA when the operating point 20 is at the origin of the reference operating space coordinate system ΣB are then acquired; they can be read out from the controller 3 that controls the operation of the robot 2. The transformation matrix creation unit 622 creates the homogeneous transformation matrix ATB from these translational and rotational displacements (step 24). This homogeneous transformation matrix ATB converts coordinates in the reference operating space coordinate system ΣB into coordinates in the base operating space coordinate system ΣA.
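 The sketch below shows one way steps 23 and 24 could be realized in software: the translational displacement (x, y, z) and rotational displacement (rx, ry, rz) read back from the controller are turned into the homogeneous transformation matrix ATB. The Z-Y-X Euler-angle convention and all numeric values are assumptions for illustration; the disclosure does not prescribe a specific rotation convention or controller interface.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def displacement_to_transform(x, y, z, rx, ry, rz):
    """Build a 4x4 homogeneous transform from a translation and Z-Y-X Euler angles (radians)."""
    T = np.eye(4)
    T[:3, :3] = rot_z(rz) @ rot_y(ry) @ rot_x(rx)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical displacement read back from the controller when the operating
# point sits at the origin of Sigma_B, expressed in Sigma_A (assumed values).
x, y, z = 300.0, -120.0, 0.0             # millimetres
rx, ry, rz = 0.0, 0.0, np.deg2rad(45.0)  # radians
A_T_B = displacement_to_transform(x, y, z, rx, ry, rz)
print(A_T_B)
```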

 By performing the same processing for the second reference operating space coordinate system ΣC, a homogeneous transformation matrix ATC that converts the second reference operating space coordinate system ΣC into the base operating space coordinate system ΣA is obtained.

 Next, the three-dimensional measurement data acquisition unit 623 acquires three-dimensional measurement data of the object in each of the base operating space coordinate system ΣA, the first reference operating space coordinate system ΣB, and the second reference operating space coordinate system ΣC (step 25). The three-dimensional measurement data integration unit 624 then converts the three-dimensional measurement data of the object acquired in the first reference operating space coordinate system ΣB and in the second reference operating space coordinate system ΣC into three-dimensional measurement data in the base operating space coordinate system ΣA using the homogeneous transformation matrices ATB and ATC, respectively (step 26; FIG. 11). The three-dimensional measurement data integration unit 624 integrates the data converted in this way with the three-dimensional measurement data of the object acquired directly in the base operating space coordinate system ΣA (step 27) and displays the result on the screen of the display unit 66 (step 28). By appropriate operations through the input unit 65, the user can rotate the displayed three-dimensional measurement data and check the shape of the object from any direction.
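 Steps 25 to 27 amount to applying each homogeneous transform to its point cloud and concatenating the results. The sketch below assumes the point clouds are plain N×3 NumPy arrays and that A_T_B and A_T_C have already been built as in the previous sketch; the function and variable names are illustrative.

```python
import numpy as np

def transform_cloud(T, cloud_xyz):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homogeneous = np.hstack([cloud_xyz, np.ones((cloud_xyz.shape[0], 1))])
    return (homogeneous @ T.T)[:, :3]

def integrate_clouds(cloud_A, cloud_B, cloud_C, A_T_B, A_T_C):
    """Express the Sigma_B and Sigma_C measurements in Sigma_A and merge all three."""
    cloud_B_in_A = transform_cloud(A_T_B, cloud_B)
    cloud_C_in_A = transform_cloud(A_T_C, cloud_C)
    return np.vstack([cloud_A, cloud_B_in_A, cloud_C_in_A])
```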

 FIG. 12 shows the result of integrating three-dimensional measurement data of an object acquired by the three-dimensional sensor 4 at the positions and postures corresponding to three different operating space coordinate systems. FIG. 12(a) is an optical image of the object, and FIG. 12(b) is the result of integrating the three-dimensional measurement data acquired in the three coordinate systems. In the second embodiment, the operating point 20 of the robot 2 is simply moved between the origin of the base operating space coordinate system and the origin of each reference operating space coordinate system to acquire the translational and rotational displacements, and a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system is obtained from those displacements. The three sets of three-dimensional measurement data acquired by the three-dimensional sensor 4 at the positions and postures corresponding to the different operating space coordinate systems can thus be integrated easily.

 Techniques for processing three-dimensional measurement data have been proposed, for example, in Patent Documents 2 to 4. In these methods, however, the resolution of the three-dimensional measurement data may deteriorate, for example through shear deformation or scaling. These techniques also require complicated calculations to obtain the elements of the homogeneous transformation matrix when integrating data; for example, the coordinate values of a plurality of positions must be determined separately from the operating space coordinate system of the robot, and the homogeneous transformation matrix derived from them. It is difficult for an operator to perform such advanced arithmetic processing at a work site, such as a factory, where robot systems are widely used.

 In the second embodiment, by contrast, the operating space coordinate systems are set using the alignment jig 5, as in the first embodiment. Shear deformation and scaling such as those in Patent Documents 2 to 4 therefore do not occur, and the three-dimensional measurement data can be integrated simply while maintaining high resolution and accuracy.

<Third Embodiment>
 The third embodiment will now be described. As shown in FIG. 13, the robot system 200 of the third embodiment includes a plurality of robots 2a and 2b, controllers 3a and 3b that control the operations of the robots 2a and 2b, and a control/processing device 7. The robots 2a and 2b and the controllers 3a and 3b have the same configuration as the robot 2 and the controller 3 in the first and second embodiments, and the control/processing device 7 has the same configuration and functional blocks as the control/processing device 6 in the second embodiment. For simplicity of explanation the system is described here with two robots, but a system including three or more robots 2 may also be used.

 The movable range of the operating point 20 of the robot 2 in the second embodiment is determined by the lengths of the linear motion mechanisms 22a to 22c that move the arm portion 21. If the object is large relative to the robot 2, the arm of a single robot 2 may not reach the far side of the object, and three-dimensional measurement data of part of the object may therefore not be obtainable. In addition, a system using only one robot 2 cannot obtain three-dimensional measurement data of the object from a plurality of directions at the same time.

 In the third embodiment, therefore, three-dimensional measurement data of the object is acquired using a plurality of robots 2. The operation of each part and the flow of data processing in the third embodiment will be described with reference to the flowchart of FIG. 14. Description of operations and processes common to the first or second embodiment is omitted as appropriate.

 First, a connected operating space coordinate system ΣA and a reference operating space coordinate system ΣB are set for the robot 2a, and the connected operating space coordinate system ΣA and a reference operating space coordinate system ΣC are set for the robot 2b. In other words, a connected operating space coordinate system common to the plurality of robots 2 is set, and a reference operating space coordinate system unique to each robot 2 is set (step 31; see FIG. 15). The connected operating space coordinate system ΣA corresponds to the base operating space coordinate system in the third and sixth aspects of the present invention. For either one of the robots, the connected operating space coordinate system ΣA and the reference operating space coordinate system ΣB may be the same coordinate system.

 Next, the position and posture of the three-dimensional sensor 4 in each operating space coordinate system are determined by the same procedure as in the first embodiment (step 32).

 Homogeneous transformation matrices that convert the reference operating space coordinate systems ΣB and ΣC set for the robots 2a and 2b into the connected operating space coordinate system ΣA are then obtained by the same procedure as in the second embodiment (step 33).

 After that, three-dimensional measurement data of the object is acquired in the reference operating space coordinate systems ΣB and ΣC for the robots 2a and 2b, respectively (step 34). The three-dimensional measurement data of the object in the reference operating space coordinate systems ΣB and ΣC is then converted into three-dimensional measurement data in the connected operating space coordinate system ΣA using the homogeneous transformation matrices ATB and ATC, respectively (step 35; FIG. 15). In this way, the three-dimensional measurement data of the object acquired by the three-dimensional sensors 4 attached to the different robots 2a and 2b can be integrated in the coordinate system ΣA (step 36). The integrated data is displayed on the screen of the display unit 66 (step 37).

 Because the third embodiment uses a plurality of robots 2a and 2b, three-dimensional measurement data of an entire large object can be acquired without being limited by the arm length of a single robot 2. Three-dimensional measurement data of the object can also be obtained from a plurality of directions at the same time.

 In the above description, the connected operating space coordinate system ΣA common to the two robots 2 was set. However, when an object being conveyed is measured by three-dimensional sensors 4 attached to a large number of robots 2, it may be difficult to set a connected operating space coordinate system common to all of the robots 2. In such a case, a plurality of operating space coordinate systems may be set for each of the robots 2, one of which is shared with an operating space coordinate system set for at least one other robot.

 FIG. 16 shows a robot system 300 according to a modification of the third embodiment. In this robot system 300, an object 9 conveyed by a belt conveyor 8 is monitored. A plurality of robots 2a, 2b, ... are arranged along the conveying direction of the object 9 on one side of the belt conveyor 8, and a plurality of robots 2e, 2f, ... are arranged on the other side.

 Only the base operating space coordinate system ΣA is set for the robot 2a. For the robot 2b, the connected operating space coordinate system ΣA and a reference operating space coordinate system ΣB are set. For the robot 2c and subsequent robots, a connected operating space coordinate system shared with the adjacently arranged robot 2 and a reference operating space coordinate system are set. For the robot 2e, located on the opposite side of the belt conveyor 8 from the robot 2a, the connected operating space coordinate system ΣA and a reference operating space coordinate system ΣE are set. For the robot 2f and subsequent robots, a connected operating space coordinate system shared with the adjacently arranged robot 2 and a reference operating space coordinate system are set.

 As explained in the third embodiment, these coordinate systems can be converted into one another using the corresponding homogeneous transformation matrices. For example, as shown in FIG. 16, the three-dimensional measurement data acquired in the reference operating space coordinate system ΣF of the robot 2f can be converted into data in the base operating space coordinate system ΣA by applying the two homogeneous transformation matrices ETF and ATE in sequence. In this way, in the system 300 of the modification, the three-dimensional measurement data of the object 9 acquired in all of the reference operating space coordinate systems is converted into three-dimensional measurement data in the base operating space coordinate system ΣA using one or more homogeneous transformation matrices, and the converted data sets are combined. This makes it possible to monitor the object 9, which moves by conveyance or the like, over a wide range.
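 The chained conversion from ΣF to ΣA is simply a matrix product of the intermediate homogeneous transforms. The following sketch, under the same assumptions as the earlier ones (4×4 NumPy transforms, N×3 point clouds), maps data measured by robot 2f into ΣA; the variable names mirror the notation ETF and ATE used above.

```python
import numpy as np

# E_T_F : Sigma_F -> Sigma_E,  A_T_E : Sigma_E -> Sigma_A
# (both assumed to have been built from controller displacements as in step 24).
def chain_to_base(cloud_in_F, E_T_F, A_T_E):
    """Map a point cloud measured in Sigma_F into the base system Sigma_A."""
    A_T_F = A_T_E @ E_T_F   # composed transform Sigma_F -> Sigma_A
    homogeneous = np.hstack([cloud_in_F, np.ones((cloud_in_F.shape[0], 1))])
    return (homogeneous @ A_T_F.T)[:, :3]
```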

 The specific configurations and numerical values described in the above embodiments are merely examples and may be modified as appropriate within the spirit of the present invention.

 In the above embodiments, the jig 5 was used to determine the shift amount between the two coordinate systems, and the processing went as far as shifting the measuring instrument coordinate system so that the two coordinate systems coincide, but these steps are not essential. For example, when the robot is actually operated, a position acquired in the measuring instrument coordinate system may instead be converted into a position in the operating space coordinate system by adding (or subtracting) the shift amount each time the robot is operated. Alternatively, after the three axes of the operating space coordinate system and the three axes of the measuring instrument coordinate system have been made parallel to each other, another jig may be placed, and the shift amount may be obtained from the difference between the coordinates of a predetermined position on that jig in the operating space coordinate system and in the measuring instrument coordinate system. The jig used to obtain the shift amount does not necessarily have to be a rectangular parallelepiped; a jig of any appropriate shape that can define the predetermined position can be used. For example, if it is difficult to accurately measure the shape of the edge portions when acquiring the entire profile of the rectangular parallelepiped block 52, a cylindrical block may be used. The jig used to calculate the shift amount is also not limited to a block; any appropriate object whose predetermined position (for example, a center located on the X-Y plane and its height above that plane) has known coordinates and whose three-dimensional measurement data can be acquired by the three-dimensional sensor 4 may be used.

 At present, because of the shrinking working population, manufacturing sites urgently need robots that can take the place of human labor. However, integrating a system that combines a robot and a three-dimensional measuring instrument requires advanced knowledge and is not always easy for workers on site. In particular, in industries such as the automobile industry, where robots made by various manufacturers are used, operability differs from manufacturer to manufacturer, so it may be necessary to dispatch specialized engineers familiar with each robot's operation; securing and training such personnel is difficult. For example, while a robot is in use, an unexpected collision or the like may shift the mounting position of the three-dimensional measuring instrument. Because prompt recovery is required on site, an alignment technique that can be applied, without advanced knowledge, to robots manufactured by various robot makers is needed, and the present invention can suitably be used for this purpose.

 With the spread of industrial robots and the increasing sophistication of sensing technologies such as the cameras used to control them, there is a growing need, for purposes such as visual inspection and reverse engineering, to acquire accurate data on three-dimensional shapes of all kinds, covering not only the exterior but also the interior of an object. The objects themselves are also becoming larger, and there is a demand for scanning entire trains or automobiles. However, in systems that simply extend conventional techniques, processes such as arithmetic processing become complicated, and it is becoming difficult for anyone other than engineers well versed not only in robots but also in programming for three-dimensional measurement to handle them.
 In the second and third embodiments described above, when three-dimensional measurement data of an object obtained by the light-section method using a measuring instrument such as a three-dimensional sensor attached to a robot is integrated, the operator can integrate three-dimensional measurement data acquired in different coordinate systems through a simple procedure, without the complicated computation programs required conventionally.

1, 100, 200, 300…Robot system
2…Robot
 20…Operating point
 21…Arm portion
 22a~22c…Linear motion mechanism
 23a~23c…Rotation mechanism
3…Controller
 31…Operating space coordinate system setting unit
4…Three-dimensional sensor
 41…Slit light irradiation unit
 42…Area sensor
 43…Control/processing unit
  431…Storage unit
  432…Projection line acquisition unit
  433…Measuring instrument posture calculation unit
 44…Communication interface
 45…Input unit
 46…Display unit
5…Jig
 51…Plate-shaped member
 52…Block
6, 7…Control/processing device
 61…Storage unit
 62…Three-dimensional measurement data processing program
  621…Displacement acquisition unit
  622…Transformation matrix creation unit
  623…Three-dimensional measurement data acquisition unit
  624…Three-dimensional measurement data integration unit
 65…Input unit
 66…Display unit

Claims (14)

1. A coordinate system alignment method for a robot, for aligning a robot coordinate system, which is a coordinate system of a robot for moving an operating point three-dimensionally, with a measuring instrument coordinate system, which is a coordinate system of a three-dimensional measuring instrument capable of executing a light-section method and whose position and posture relative to the operating point are invariant, the method comprising the steps of:
 determining the relationship between an operating space coordinate system, which is a coordinate system of an operating space of the operating point, and the robot coordinate system by moving the operating point to a first reference point located at the origin of the operating space coordinate system and to a second reference point and a third reference point located respectively on two straight lines orthogonal to each other at the first reference point, and teaching the position of each reference point;
 irradiating, from the three-dimensional measuring instrument, a sheet-shaped slit light onto a rectangular parallelepiped reference object fixed on the surface of a plate-shaped member, which is arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system, with its sides parallel to the respective three orthogonal axes, and acquiring a projection line cast by the slit light on the reference object;
 obtaining the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection line; and
 moving the three-dimensional measuring instrument so that the posture of the three-dimensional measuring instrument falls within a predetermined reference posture range.
2. The coordinate system alignment method for a robot according to claim 1, further comprising, after moving the three-dimensional measuring instrument so that the posture of the three-dimensional measuring instrument falls within the predetermined reference posture range, the steps of:
 comparing the coordinates of a predetermined position of the reference object in the operating space coordinate system with its coordinates in the measuring instrument coordinate system to obtain a difference; and
 shifting the measuring instrument coordinate system by the difference.
3. The coordinate system alignment method for a robot according to claim 1 or 2, wherein, in the operating space coordinate system, the first reference point is the origin, and the second reference point and the third reference point are points on axes of the coordinate system.
4. The coordinate system alignment method for a robot according to any one of claims 1 to 3, wherein the reference object is arranged so that three sides of the reference object are parallel to the three axes of the operating space coordinate system.
5. The coordinate system alignment method for a robot according to any one of claims 1 to 4, wherein the slit light is light that has a width in the direction of one axis of the measuring instrument coordinate system and is emitted in the direction of another axis of the measuring instrument coordinate system.
6. The coordinate system alignment method for a robot according to any one of claims 1 to 5, comprising:
 setting a base operating space coordinate system and a reference operating space coordinate system as the operating space coordinate system;
 moving the operating point between the origin of the base operating space coordinate system and the origin of the reference operating space coordinate system to acquire a translational displacement and a rotational displacement;
 obtaining, based on the translational displacement and the rotational displacement, a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system;
 acquiring three-dimensional measurement data of an object in each of the base operating space coordinate system and the reference operating space coordinate system by moving the three-dimensional measuring instrument so that the posture of the three-dimensional measuring instrument falls within the predetermined reference posture range; and
 converting the three-dimensional measurement data of the object acquired in the reference operating space coordinate system into three-dimensional measurement data in the base operating space coordinate system using the transformation matrix, and integrating it with the three-dimensional measurement data of the object acquired in the base operating space coordinate system.
7. The coordinate system alignment method for a robot according to claim 6, comprising:
 setting, for each of a plurality of robots, a plurality of said operating space coordinate systems, one of which is shared with an operating space coordinate system set for at least one other robot;
 designating one of all the operating space coordinate systems set for the plurality of robots as the base operating space coordinate system and the others as reference operating space coordinate systems;
 creating, among the coordinate systems set for the plurality of robots, transformation matrices that convert the reference operating space coordinate systems into the base operating space coordinate system;
 acquiring three-dimensional measurement data of the object in each of the plurality of operating space coordinate systems set for each of the plurality of robots; and
 converting the three-dimensional measurement data of the object acquired in the reference operating space coordinate systems into three-dimensional measurement data in the base operating space coordinate system and integrating the data.
8. The coordinate system alignment system for a robot according to claim 7, wherein the base operating space coordinate system is set for all of the plurality of robots.
9. A coordinate system alignment system for a robot, for aligning a robot coordinate system, which is a coordinate system of a robot for moving an operating point three-dimensionally, with a measuring instrument coordinate system, which is a coordinate system of a three-dimensional measuring instrument capable of executing a light-section method and whose position and posture relative to the operating point are invariant, the system comprising:
 an operating space coordinate system setting unit that determines the relationship between an operating space coordinate system, which is a coordinate system of an operating space of the operating point, and the robot coordinate system by moving the operating point to a first reference point located at the origin of the operating space coordinate system and to a second reference point and a third reference point located respectively on two straight lines that pass through the first reference point and are orthogonal to each other, and teaching the position of each reference point;
 a projection line acquisition unit that irradiates, from the three-dimensional measuring instrument, a sheet-shaped slit light onto a rectangular parallelepiped reference object fixed on the surface of a plate-shaped member, which is arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system, with its sides parallel to the respective three orthogonal axes, and acquires a projection line cast by the slit light on the reference object;
 a measuring instrument posture calculation unit that obtains the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection line; and
 a measuring instrument moving unit that moves the three-dimensional measuring instrument.
10. The coordinate system alignment system for a robot according to any one of claims 1 to 5, wherein the operating space coordinate system setting unit sets, as the operating space coordinate system, a base operating space coordinate system and a reference operating space coordinate system, the system further comprising:
 a displacement acquisition unit that moves the operating point between the origin of the base operating space coordinate system and the origin of the reference operating space coordinate system to acquire a translational displacement and a rotational displacement;
 a transformation matrix calculation unit that obtains, based on the translational displacement and the rotational displacement, a transformation matrix that converts the reference operating space coordinate system into the base operating space coordinate system;
 a three-dimensional measurement data acquisition unit that acquires three-dimensional measurement data of an object in each of the base operating space coordinate system and the reference operating space coordinate system while the measuring instrument moving unit moves the three-dimensional measuring instrument so that the posture of the three-dimensional measuring instrument falls within the predetermined reference posture range; and
 a three-dimensional measurement data integration unit that converts the three-dimensional measurement data of the object acquired in the reference operating space coordinate system into three-dimensional measurement data in the base operating space coordinate system using the transformation matrix and integrates it with the three-dimensional measurement data of the object acquired in the base operating space coordinate system.
11. The coordinate system alignment system for a robot according to claim 6, further comprising a plurality of robots, wherein
 the operating space coordinate system setting unit sets, for each of the plurality of robots, a plurality of said operating space coordinate systems, one of which is shared with an operating space coordinate system set for at least one other robot, and designates one of the operating space coordinate systems set for the plurality of robots as the base operating space coordinate system and the others as reference operating space coordinate systems,
 the transformation matrix calculation unit creates, among the coordinate systems set for the plurality of robots, transformation matrices that convert the reference operating space coordinate systems into the base operating space coordinate system,
 the three-dimensional measurement data integration unit acquires three-dimensional measurement data of the object in each of the plurality of operating space coordinate systems set for each of the plurality of robots, and
 the three-dimensional measurement data integration unit converts the three-dimensional measurement data of the object acquired in the reference operating space coordinate systems into three-dimensional measurement data in the base operating space coordinate system and integrates the data.
12. A coordinate system alignment device for a robot, used to align an operating space coordinate system, which is a coordinate system of an operating space of an operating point and is associated in advance with a robot coordinate system, which is a coordinate system of a robot for moving the operating point three-dimensionally, with a measuring instrument coordinate system, which is a coordinate system of a three-dimensional measuring instrument capable of executing a light-section method and whose position and posture relative to the operating point are invariant, the device comprising:
 a projection line acquisition unit that irradiates, from the three-dimensional measuring instrument, a sheet-shaped slit light onto a rectangular parallelepiped reference object fixed on the surface of a plate-shaped member, which is arranged parallel to the plane containing two of the three orthogonal axes defining the operating space coordinate system, with its sides parallel to the respective three orthogonal axes, and acquires a projection line cast by the slit light on the reference object; and
 a measuring instrument posture calculation unit that obtains the posture of the three-dimensional measuring instrument with respect to the reference object based on the profile of the projection line.
13. The coordinate system alignment device for a robot according to claim 7, further comprising a coordinate system shift unit that compares the coordinates of a predetermined position of the reference object in the operating space coordinate system with its coordinates in the measuring instrument coordinate system to obtain a difference, and shifts the measuring instrument coordinate system by the difference.
14. The coordinate system alignment device for a robot according to claim 7 or 8, further comprising a display unit that displays the posture of the three-dimensional measuring instrument with respect to the reference object each time the posture of the three-dimensional measuring instrument is obtained by the measuring instrument posture calculation unit.
PCT/JP2020/042870 2019-12-02 2020-11-17 Coordinate system alignment method, alignment system, and alignment device for robot Ceased WO2021111868A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP20896987.3A EP4070923A4 (en) 2019-12-02 2020-11-17 Coordinate system alignment method, alignment system, and alignment device for robot
JP2021562554A JP7273185B2 (en) 2019-12-02 2020-11-17 COORDINATE SYSTEM ALIGNMENT METHOD, ALIGNMENT SYSTEM AND ALIGNMENT APPARATUS FOR ROBOT
US17/781,141 US12290924B2 (en) 2019-12-02 2020-11-17 Coordinate system alignment method, alignment system, and alignment device for robot

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019217775A JP6670974B1 (en) 2019-12-02 2019-12-02 Robot coordinate system alignment method, alignment system, and alignment device
JP2019-217775 2019-12-02

Publications (1)

Publication Number Publication Date
WO2021111868A1 true WO2021111868A1 (en) 2021-06-10

Family

ID=70000783

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/042870 Ceased WO2021111868A1 (en) 2019-12-02 2020-11-17 Coordinate system alignment method, alignment system, and alignment device for robot

Country Status (4)

Country Link
US (1) US12290924B2 (en)
EP (1) EP4070923A4 (en)
JP (2) JP6670974B1 (en)
WO (1) WO2021111868A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7277340B2 (en) * 2019-11-15 2023-05-18 川崎重工業株式会社 Master-slave system, control method and control device
EP4131153A1 (en) * 2020-03-30 2023-02-08 NEC Corporation Photographing system, photographing method, and non-transitory computer-readable medium storing photographing program
EP3916394B1 (en) * 2020-05-29 2025-07-02 Roche Diagnostics GmbH Module for an automated laboratory system
CN112873264B (en) * 2021-03-18 2024-02-23 中国工程物理研究院机械制造工艺研究所 Industrial robot joint structure, robot control system and method
CN114485427B (en) * 2022-01-20 2023-09-22 上汽大众汽车有限公司 A measurement reference construction method and system for vehicle body size measurement
CN119610094A (en) * 2024-12-03 2025-03-14 珠海格力智能装备有限公司 Robot anti-collision method, device and robot control system
CN119704191B (en) * 2024-12-31 2025-08-01 广东工业大学 Calibration method, device and equipment for external actuator of robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06137840A (en) * 1992-10-22 1994-05-20 Toyota Central Res & Dev Lab Inc Vision sensor automatic calibration device
JPH07239219A (en) 1990-04-30 1995-09-12 Korea Mach Res Inst Non-contact tire end face contour shape measuring method and device
JPH08132373A (en) 1994-11-08 1996-05-28 Fanuc Ltd Coordinate system coupling method in robot-sensor system
JPH1133962A (en) * 1997-07-18 1999-02-09 Yaskawa Electric Corp Calibration of robot three-dimensional position sensor. Method and device.
JP2010091540A (en) 2008-10-13 2010-04-22 Toyota Motor Corp System and method for three-dimensional measurement
JP2018034272A (en) * 2016-09-01 2018-03-08 アイシン精機株式会社 Palletizing device
JP2019184340A (en) 2018-04-05 2019-10-24 オムロン株式会社 Information processing apparatus, information processing method, and program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6044308A (en) * 1997-06-13 2000-03-28 Huissoon; Jan Paul Method and device for robot tool frame calibration
JP2002172575A (en) * 2000-12-07 2002-06-18 Fanuc Ltd Teaching device
CN103620341B (en) * 2011-06-20 2016-01-20 株式会社安川电机 3 dimension form measuring instrument and robot systems
US20160243703A1 (en) * 2015-02-19 2016-08-25 Isios Gmbh Arrangement and method for the model-based calibration of a robot in a working space
EP3147086B1 (en) 2015-09-22 2020-11-04 Airbus Defence and Space GmbH Automation of robot operations in aircraft construction
CN111504183B (en) 2020-04-22 2021-03-09 无锡中车时代智能装备有限公司 Calibration method for relative position of linear laser three-dimensional measurement sensor and robot

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07239219A (en) 1990-04-30 1995-09-12 Korea Mach Res Inst Non-contact tire end face contour shape measuring method and device
JPH06137840A (en) * 1992-10-22 1994-05-20 Toyota Central Res & Dev Lab Inc Vision sensor automatic calibration device
JPH08132373A (en) 1994-11-08 1996-05-28 Fanuc Ltd Coordinate system coupling method in robot-sensor system
JPH1133962A (en) * 1997-07-18 1999-02-09 Yaskawa Electric Corp Calibration of robot three-dimensional position sensor. Method and device.
JP2010091540A (en) 2008-10-13 2010-04-22 Toyota Motor Corp System and method for three-dimensional measurement
JP2018034272A (en) * 2016-09-01 2018-03-08 アイシン精機株式会社 Palletizing device
JP2019184340A (en) 2018-04-05 2019-10-24 オムロン株式会社 Information processing apparatus, information processing method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4070923A4

Also Published As

Publication number Publication date
US20220410375A1 (en) 2022-12-29
JP6670974B1 (en) 2020-03-25
EP4070923A1 (en) 2022-10-12
US12290924B2 (en) 2025-05-06
EP4070923A4 (en) 2024-01-03
JP2021088003A (en) 2021-06-10
JPWO2021111868A1 (en) 2021-06-10
JP7273185B2 (en) 2023-05-12

Similar Documents

Publication Publication Date Title
JP7273185B2 (en) COORDINATE SYSTEM ALIGNMENT METHOD, ALIGNMENT SYSTEM AND ALIGNMENT APPARATUS FOR ROBOT
US11911914B2 (en) System and method for automatic hand-eye calibration of vision system for robot motion
JP4191080B2 (en) Measuring device
JP4021413B2 (en) Measuring device
KR102129103B1 (en) System and method for calibration of machine vision cameras along at least three discrete planes
JP6280525B2 (en) System and method for runtime determination of camera miscalibration
JP4844453B2 (en) Robot teaching apparatus and teaching method
JP3946711B2 (en) Robot system
US10812778B1 (en) System and method for calibrating one or more 3D sensors mounted on a moving manipulator
JP2008012604A (en) Measuring apparatus and method of its calibration
JP2016185572A (en) Robot, robot controller and robot system
US20150377606A1 (en) Projection system
EP3577629B1 (en) Calibration article for a 3d vision robotic system
EP3421930A1 (en) Three-dimensional shape data and texture information generation system, photographing control program, and three-dimensional shape data and texture information generation method
US20190255706A1 (en) Simulation device that simulates operation of robot
US20240070910A1 (en) Processing method and processing device for generating cross-sectional image from three-dimensional position information acquired by visual sensor
CN110712194A (en) Object inspection device, object inspection system, and method for adjusting inspection position
JP2019063955A (en) Robot system, motion control method and motion control program
US20240066701A1 (en) Simulation device using three-dimensional position information obtained from output from vision sensor
CN115362049B (en) Device for correcting the teaching position of a robot, teaching device, robot system, teaching position correction method, and computer program
Scaria et al. Cost Effective Real Time Vision Interface for Off Line Simulation of Fanuc Robots
JP2016187851A (en) Calibration device
WO2025210276A1 (en) Method and device for determining the orienting of a surface of a workpiece and method for orienting a device at the point of work at the surface of the workpiece
US20220219328A1 (en) Method and device for creation of three dimensional tool frame
JP2025171459A (en) Method and apparatus for determining the position and attitude of an object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20896987

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021562554

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020896987

Country of ref document: EP

Effective date: 20220704

WWG Wipo information: grant in national office

Ref document number: 17781141

Country of ref document: US