
WO2025132520A1 - Procédé et dispositif pour étalonner un robot, procédé pour déterminer les coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et l'orientation d'un objet à mesurer, robot et programme d'ordinateur - Google Patents


Info

Publication number
WO2025132520A1
WO2025132520A1 (application PCT/EP2024/087018, EP2024087018W)
Authority
WO
WIPO (PCT)
Prior art keywords
robot kinematics
robot
digitizer
camera
kinematics
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/087018
Other languages
German (de)
English (en)
Inventor
Manuel WINTER
Bennet BOETTINGER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carl Zeiss GOM Metrology GmbH
Original Assignee
Carl Zeiss GOM Metrology GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carl Zeiss GOM Metrology GmbH filed Critical Carl Zeiss GOM Metrology GmbH
Publication of WO2025132520A1 publication Critical patent/WO2025132520A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37048Split beam, stripe projection on object, lines detected with cameras
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39008Fixed camera detects reference pattern held by end effector
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39045Camera on end effector detects reference pattern
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40605Two cameras, each on a different end effector to measure relative position
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40613Camera, laser scanner on end effector, hand eye manipulator, local

Definitions

  • Method and device for calibrating a robot, method for determining the 3D coordinates of a measuring object, method for determining the position and orientation of a measuring object, robot and computer program
  • the present invention relates to a robot, a method and a device for calibrating a robot with a coordinate measuring machine or as part of a coordinate measuring machine, as well as a method for determining the 3D coordinates of a measurement object and a method for determining the position and orientation of a measurement object.
  • the present invention further relates to a computer program with instructions which, when the program is executed by at least one processor, cause this at least one processor and/or further processors to carry out the methods according to the invention.
  • Robot kinematics are mechanical constructions in which one or more axes are coupled to one another.
  • the goal of robot kinematics is to bring an end effector into a specific position and orientation (pose) in relation to a reference system.
  • An axis is understood here as the movable connection between two structural components of the robot kinematics, whereby this connection can be movable in up to six degrees of freedom. Examples of this are a typical robot rotary axis that only allows rotation in one rotational degree of freedom, a typical linear axis that allows translation in one translational degree of freedom, or a spherical joint (ball joint) that allows rotation in three rotational degrees of freedom.
  • Typical robot kinematics are six-axis robots.
  • a turntable can also be considered a robot kinematics.
  • the robot axes can be equipped with an actuator or can be operated passively.
  • the reference system can, for example, lie on the floor.
  • mobile robot kinematics traveling on corresponding mobile platforms can therefore also be calibrated individually or relative to each other. In this case, the reference system can be located on the floor of the platform, which is stationary for the measurement.
  • Robot kinematics, whether parallel, serial, or coupled parallel and serial (such as an articulated arm on a delta robot), are inaccurate for metrological purposes in many respects.
  • the end effector is the end interface along the link chain of the robot kinematics or the tool attached to the end interface.
  • a measurement object can usually only be partially captured by an optical measuring system due to its limited measuring range. Therefore, the measuring system is placed in various poses in relation to the measurement object using a robot in order to capture the measurement object as completely as possible.
  • the data available locally in the reference coordinate system of the measuring system are then combined in a global coordinate system. To do this, it is necessary to know the position and orientation of the measuring system in relation to the measurement object as precisely as possible. This accuracy is directly related to the positioning accuracy of the robot or the accuracy in determining the position of the end effector. The better a robot is calibrated, the more accurately the position of the measuring system can be determined and the more accurately the measurement object as a whole can be measured.
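As a purely illustrative sketch (not taken from the application): if the calibrated forward kinematics yields the pose of the 3D digitizer in the global coordinate system as a 4x4 homogeneous matrix, the locally measured 3D data of each pose could be transferred into the global coordinate system roughly as follows; all names are placeholders.

```python
import numpy as np

def to_global(points_local: np.ndarray, T_global_digitizer: np.ndarray) -> np.ndarray:
    """Transform Nx3 points from the digitizer's local frame into the global frame.

    T_global_digitizer is the 4x4 homogeneous pose of the digitizer's reference
    coordinate system expressed in the global coordinate system, e.g. obtained
    from the calibrated forward kinematics of the robot for the current pose.
    """
    R, t = T_global_digitizer[:3, :3], T_global_digitizer[:3, 3]
    return points_local @ R.T + t

# Combining several partial measurements (one pose each) into one point cloud:
# global_cloud = np.vstack([to_global(P_j, T_j) for P_j, T_j in zip(scans, poses)])
```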
  • Measurement objects can be so large that even the reach of a robot is insufficient to capture the measurement object with the measuring system attached to the robot. In this case, the measurement object must be moved and realigned with the robot in order to be able to transfer the data before and after the measurement object is moved into a common global coordinate system. This is usually done using recognizable features that are attached to or around the measurement object and are captured before and after the movement with the measuring system. The better the robot is calibrated, the more precise the alignment.
  • In 3D coordinate metrology, 3D digitizers are used to measure 3D data.
  • the 3D data can be generated using different methods, such as triangulation with at least two cameras or one camera and a projector, time-of-flight measurement (TOF), lidar, confocal measurement, etc.
  • the 3D data can be present, for example, in the form of individual 3D points of discrete features or as a full-surface mesh of a large number of surface points.
  • 3D data can also be generated using a 2D camera.
  • a single 2D camera that cannot generate 3D data in a single pose without further prior knowledge of the measurement object, such as a distance between features of the measurement object, is not considered a 3D digitizer within the meaning of the invention.
  • 2D cameras are also referred to simply as cameras below.
  • EP 3 377 948 B1 describes a method in which, at several points in time of a scanning plan, the positions of a camera system of a robot relative to a reference coordinate network frame are obtained by comparing three-dimensional images of a scene, and the positions of the robot are obtained by using a second scanning plan, in order to determine the position of the reference coordinate network frame and the reference point of the camera system relative to the position of the robot.
  • Several equations are created that are used to solve an optimization problem. It is described that the position of the reference coordinate network frame, the tool center point of the camera system, a possible time offset, and parameters of the robot model can be optimized in the optimization process.
  • One input parameter of the optimization process is the position of the camera system, which results from a comparison of the three-dimensional images of the scene and is calculated in a step preceding the optimization process.
  • the disadvantage of the method is that inexpensive cameras have a relatively low resolution and therefore the position of the points of the reference object in the images of the cameras and thus also in space cannot be determined very precisely.
  • the object of the present invention is to provide a better method and a better device for calibrating a robot kinematics as well as a better method for determining the 3D coordinates of a measurement object and a better method for determining the position and orientation of a measurement object as well as better robot kinematics and a computer program for carrying out the better method according to the invention.
  • the invention proposes a method for calibrating at least one robot kinematics, which at least comprises:
  • the camera can also be part of the 3D digitizer for all initial substeps of the method for calibrating robot kinematics. This naturally also applies to the subsequent method claims for measuring and aligning a measurement object, as well as to the corresponding device claims.
  • A method for determining the 3D data of a measurement object, wherein at least one 3D digitizer attached to a robot kinematics or several 3D digitizers attached to robot kinematics are brought into different poses for measuring the measurement object and the 3D digitizer(s) generate local 3D data in the reference coordinate systems of the 3D digitizers, and wherein the local 3D data generated for the different poses are transferred into a global coordinate system on the basis of the calibration parameters of the robot kinematics derived with the method according to the invention.
  • a method for aligning a measurement object with respect to a robot kinematics based on recognizable features that are distributed on or around the measurement object and are captured by at least one 3D digitizer attached to a robot kinematics and/or by at least one camera attached to a robot kinematics, wherein the calibration parameters of the robot kinematics were derived by means of the method according to the invention.
  • A device for calibrating robot kinematics comprises at least the following components: at least one 3D digitizer attached to a first robot kinematics and/or at least one camera, wherein the camera is attached to this first robot kinematics or to a further second robot kinematics, and at least one reference object located in the vicinity of one robot kinematics or in the vicinity of both robot kinematics and stationary for the recordings, which has recognizable features; or at least one 3D digitizer located and fixed in the vicinity of a first robot kinematics and/or at least one camera fixed in the environment, and at least one reference object attached to the first robot kinematics, which has recognizable features; or at least one reference object attached to a first robot kinematics, which has recognizable features, and at least one 3D digitizer, wherein the 3D digitizer is attached to a second robot kinematics or to a third robot kinematics.
  • a method for calibrating robot kinematics wherein data for measuring a reference object and for deriving the calibration parameters of the robot kinematics or the robot kinematics are generated and at least part of the data is evaluated in a step which includes generating the measurement data and deriving the calibration parameters.
  • the generation of the measurement data and the derivation of the calibration parameters are carried out by an optimization process.
  • After each optimization iteration, it can be checked which data are used for measuring a reference object and which for deriving the calibration parameters, or the data can be weighted accordingly.
  • For individual data, it can be checked whether they are, for example, outliers with regard to the measurement of the reference object or the derivation of the calibration parameters. This makes it possible to detect inconsistencies between the data, which can be caused, for example, by a temperature jump during data acquisition or by an instability of the camera or 3D digitizer.
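A minimal sketch of one possible check between optimization iterations, assuming a Huber-style robust reweighting of the residuals; the constant k and all names are illustrative assumptions, not part of the application.

```python
import numpy as np

def huber_weights(residuals: np.ndarray, k: float = 1.345) -> np.ndarray:
    """Per-observation weights that damp likely outliers for the next iteration.

    Residuals far above a robust spread estimate (1.4826 * MAD) receive a
    weight < 1; such observations could equally be flagged and excluded from
    either the reference-object measurement or the parameter derivation.
    """
    med = np.median(residuals)
    sigma = 1.4826 * np.median(np.abs(residuals - med)) + 1e-12
    a = np.abs(residuals) / sigma
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))
```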
  • the data is recorded by at least one 3D digitizer attached to a first robot kinematics and/or at least one camera, the camera being attached to this first robot kinematics or to a further second robot kinematics, and at least one reference object located in the vicinity of one robot kinematics or the two robot kinematics and stationary for the recordings, which has recognizable features, being brought into different relative poses to one another by the robot kinematics or the robot kinematics, and in which data of the recognizable features of the reference object are recorded in these different relative poses by the at least one 3D digitizer and/or by the at least one camera.
  • the data is recorded by bringing a reference object attached to a robot kinematics into different relative poses by the robot kinematics to a 3D digitizer located and fixed in the environment of the robot kinematics and/or a camera located and fixed in the environment of the robot kinematics, and in these different relative poses, data of the recognizable features of the reference object are recorded.
  • data can also be recorded by placing a reference object attached to a first robot kinematic system, which has recognizable features, and at least one 3D digitizer, wherein the 3D digitizer is attached to a second robot kinematic system or to a third robot kinematic system, and/or at least one camera, wherein the camera is attached to a second robot kinematic system or to a third robot kinematic system, in different relative poses to one another, and recording data of the recognizable features of the reference object in these different relative poses.
  • the recorded data can be, on the one hand, 2D data recorded with one or more cameras, which can also be one or more cameras of the 3D digitizer. On the other hand, it can be 3D data generated by the 3D digitizer.
  • the use of a camera, especially a high-resolution camera, is advantageous for measuring the recognizable features of the reference object.
  • the use of a 3D digitizer is advantageous for determining the pose of the end effector, since depth data is particularly relevant for determining the pose.
  • It can therefore be advantageous to combine a high-quality, high-resolution camera for measuring the recognizable features of the reference object based on 2D data with a 3D digitizer for highly accurate determination of the end effector's pose based on 3D data.
  • a camera can also be used to determine the end-effector's pose, or a 3D digitizer can be used to measure the recognizable features of the reference object, or both can be combined.
  • In order to calibrate robot kinematics using a reference object, the reference object must be calibrated with high precision. This can be achieved, for example, by calibrating the reference object in advance in a calibration laboratory. If the reference object does not remain calibrated with long-term stability, regular calibration is necessary to ensure lasting accuracy.
  • the advantage of the method according to the invention is that the reference object is calibrated with high precision during the method according to the invention by generating calibration data, thus eliminating the complexity of preliminary calibration and long-term stability.
  • influencing factors such as the gravity orientation (sub-step b) or sub-step c)) of the reference object or the ambient temperature can also be taken into account. It is therefore possible, for example, to carry out the first step with different ambient temperatures and to generate the calibration data or the references of the recorded recognizable features of the at least one reference object in a coordinate system in a temperature-dependent manner, whereby it must be taken into account that sufficient recordings of the recognizable features of the at least one reference object are generated for the different ambient temperatures. Due to the existing temperature dependence of the references of the recorded recognizable features of the at least one reference object, this can lead to an improved ambient temperature-dependent calibration of the robot kinematics.
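Purely for illustration of how temperature-dependent references might be used afterwards (the application does not prescribe a specific scheme): linear interpolation of the feature references over ambient temperature, assuming calibration data were generated at several temperatures; all names are placeholders.

```python
import numpy as np

def interpolate_references(temps: np.ndarray, refs_per_temp: np.ndarray,
                           t_query: float) -> np.ndarray:
    """Interpolate reference feature coordinates over ambient temperature.

    temps         : (K,)      ambient temperatures with existing calibration data
    refs_per_temp : (K, N, 3) coordinates of the N recognizable features per temperature
    t_query       : ambient temperature at measurement time
    """
    order = np.argsort(temps)
    temps, refs = temps[order], refs_per_temp[order]
    n_features = refs.shape[1]
    return np.array([[np.interp(t_query, temps, refs[:, i, c]) for c in range(3)]
                     for i in range(n_features)])
```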
  • a lightweight reference object can be used that is mounted on a robot kinematics, thereby reducing the load on the robot kinematics, particularly compared to a 3D digitizer attached to the robot kinematics. Furthermore, if the reference object is mounted on the robot kinematics and not the 3D digitizer or the camera, cabling along the robot kinematics can be eliminated.
  • the newly determined calibration parameters also enable a more precise alignment of a measurement object to the calibrated robot kinematics.
  • the 3D digitizer is a triangulation sensor, in particular one with stripe light projection.
  • the recognizable features of the reference object are absolutely related to one another by at least one absolute measurement with the 3D digitizer, whereby the 3D digitizer itself is the scale-determining element.
  • 3D data is measured absolutely using a calibrated 3D digitizer. A measurement of the recognizable features of the reference object with the 3D digitizer can therefore be used to absolutely determine the distances, which may only be relative, between the recognizable features of the reference object. This advantageously eliminates the need for a scale-defining element on the reference object.
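A minimal sketch of the scale idea, under the assumption that reference features known only up to a common scale factor are rescaled with absolute feature-to-feature distances taken from at least one measurement of the calibrated 3D digitizer; the simple averaged ratio and all names are illustrative.

```python
import numpy as np

def apply_digitizer_scale(features_relative: np.ndarray,
                          idx_pairs: np.ndarray,
                          measured_distances: np.ndarray) -> np.ndarray:
    """Rescale feature coordinates that are only known up to a common factor.

    features_relative  : (N, 3) feature coordinates with arbitrary scale
    idx_pairs          : (M, 2) index pairs whose separation the calibrated
                         3D digitizer has measured absolutely
    measured_distances : (M,)   the absolute distances from the digitizer
    """
    d_rel = np.linalg.norm(features_relative[idx_pairs[:, 0]] -
                           features_relative[idx_pairs[:, 1]], axis=1)
    scale = np.mean(measured_distances / d_rel)  # averaged ratio as a simple estimate
    return features_relative * scale
```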
  • the calibration parameters of the 3D digitizer and/or the camera can also be adjusted, provided that a scale-defining element is detected by the 3D digitizer and/or the camera (see Luhmann, Robson, Kyle, Boehm, Close Range Photogrammetry and 3D Imaging, Second Edition, ISBN 978-3-1110-2986-3) or, if necessary, the calibration parameters of the 3D digitizer and/or the camera can also be adjusted without a scale-defining element being detected by the 3D digitizer and/or the camera, provided that one of the robot limbs and thus a part of the robot kinematics itself is the scale-defining element.
  • scale-independent calibration parameters of the 3D digitizer and/or the camera can also be adjusted without capturing a scale-defining element, provided that the scale-dependent parameters of the calibration of the 3D digitizer and/or the camera were previously captured using a scale-defining element.
  • the input parameters for the optimization process are the calibration parameters of the robot kinematics obtained from the forward kinematics, the recorded data of the 3D digitizer or the camera, if necessary with an indication as to whether these are to be used for generating the calibration data or for deriving the calibration parameters of the robot kinematics, and, if necessary, calibration parameters of the 3D digitizer or the camera.
  • Output parameters of the optimization process are the optimized calibration parameters of the robot kinematics, the generated calibration data of the recognizable Features of the reference object and, if applicable, the optimized calibration parameters of the 3D digitizer or camera.
  • a robot kinematics can be calibrated, the reference object can be measured and the 3D digitizer or the camera can be calibrated.
  • The calibration parameters of the robot kinematics are geometric and/or elastic and/or thermal and/or calibration parameters of the robot kinematics that take the configuration of the robot kinematics into account, and/or calibration parameters that take the direction of travel into account, and/or calibration parameters of the characteristic curves of, or calibration parameters for determining the error curves of, the length or angle measurement technology of the robot kinematics, and/or calibration parameters that characterize the play of individual or all axes of the robot kinematics, and/or calibration parameters that describe the dependence of the robot kinematics on other variables that can be recorded, for example, with sensors or can be calculated based on the available data.
  • the method according to the invention is not limited to these calibration parameters and can be supplemented and combined with other parameters.
  • the calibration parameters are used to determine the pose of the end effector and/or to align the end effector based on the parameters.
  • the adjusted calibration parameters resulting from the optimization process can be used to position the end effector of the robot kinematics more precisely. This is useful if the measurement of a measurement object takes place after the calibration of the robot kinematics.
  • the calibration parameters can also be used to precisely determine the pose of the end effector. This makes it possible, for example, to use inexpensive robot kinematics whose approach accuracy to a certain pose is low due to mechanical reasons, but the pose can still be precisely determined using the adjusted calibration parameters. Furthermore, it is also possible to subsequently correct the poses of a measurement.
  • the forward kinematics already includes the transformation from the global reference coordinate system to the local reference coordinate system of the 3D digitizer or camera. Thus, no additional hand-eye calibration is necessary.
  • 3D digitizers and/or cameras and/or reference objects are mounted on a plurality of robot kinematics and at least one of the robot kinematics is calibrated according to the method according to the invention.
  • a combination with two or more robot kinematics, each with a reference object is also possible.
  • the recognizable features of the reference objects are recorded by a 3D digitizer and/or camera located and fixed in the vicinity of the robot kinematics, while the reference objects are brought into different poses relative to the 3D digitizer and/or camera.
  • Another exemplary combination can be that some robot kinematics are equipped with a 3D digitizer, some robot kinematics with a camera, and one or several robot kinematics with a reference object, and these generate data, for example, after sub-step c).
  • Fig. 4 is a schematic representation of a device of the method according to the invention in a third variant
  • An exemplary device of the third variant is shown in Figure 4. This device has a reference object 7 attached to a robot kinematics 1a, as well as a 3D digitizer 2 with a measurement volume 3 and a camera 4, both attached to a robot kinematics 1b.
  • In a further second step 102 (Figure 1), at least part of the recorded data is used to generate calibration data, which relate the recorded features 8 of the reference object 7 to one another in a coordinate system, and to derive calibration parameters of the robot kinematics 1a, 1b, 1c.
  • In order to calibrate a robot kinematics 1a, 1b, 1c on the reference object 7, the reference object 7 must be measured with high precision. This can be achieved, for example, by calibrating the reference object 7 in advance in a calibration laboratory. If the reference object 7 does not remain calibrated with long-term stability, regular calibration is necessary to ensure lasting accuracy.
  • the advantage of the method according to the invention is that the reference object 7 is calibrated with high precision by generating calibration data during the method according to the invention, thus eliminating the complexity of a preliminary calibration and long-term stability.
  • the reference object 7 can be designed such that the recognizable features 8 are located on an independent body and/or a wall provided with recognizable features 8 and/or on another robot kinematics 1a, 1b, 1c and/or on one or more arbitrary members of a robot kinematics 1a, 1b, 1c and/or on a 3D digitizer 2 or a camera 4 and/or on a tool and/or on a measurement object 9 and/or around a measurement object 9.
  • the recognizable features 8 can be optical markers that themselves actively illuminate and/or are passively illuminated, and/or features 8 that are recognizable in two dimensions with a unique 2D geometry and/or features 8 that are recognizable in three dimensions with a unique 3D geometry.
  • a scale-determining element 6 can be attached to the reference object 7 in order to be able to determine the distances of the recognizable features 8 absolutely and not only relatively.
  • the scale-defining element 6 is shown in Figure 2, but could just as easily be included in Figure 3 or 4 for the device of the second or third variant of the method according to the invention.
  • the scale-defining element 6 is, for example, a stable precision body. This is calibrated with high precision, usually using a tactile coordinate measuring machine, and consists of a material that hardly expands, ideally not at all, when the temperature changes.
  • The precision bodies are usually simple forms of length scales, such as a ball bar. The advantage is that by generating the calibration data, the accuracy of the precision body is transferred to the rather complex reference object 7, without the reference object 7 itself having to be calibrated with high precision (e.g., tactilely).
  • a scale-defining element 6 can also be provided by the or one of the 3D digitizers 2, as shown in Figures 3, 4, 6 or Figure 7, in that the 3D digitizer 2 provides at least one recording of 3D data.
  • An additional scale-defining element 6, as shown in Figure 2, is therefore not absolutely necessary.
  • the additional scale-defining element 6 could also be omitted in Figure 2, i.e., the device for the first variant of the method according to the invention, if the 3D digitizer 2 is used as the scale-defining element 6.
  • the 3D digitizer 2 can be the scale-defining element 6, since it absolutely measures the distances between the recognizable features 8 of the reference object 7 through at least one 3D data recording based on the calibration of the 3D digitizer 2.
  • the data recorded to generate the calibration data can come from the 3D digitizer 2 and/or the camera 4.
  • a high-resolution digital camera is used to generate the calibration data, wherein said camera is brought into many different relative poses to the reference object 7 in order to record images of the recognizable features 8 of a reference object 7.
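For illustration of the type of 2D observation such camera images provide, a generic pinhole projection without distortion is sketched below; the camera model and parameter names are assumptions and not taken from the application.

```python
import numpy as np

def project_pinhole(X_cam: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Project Nx3 points given in the camera frame to Nx2 pixel coordinates.

    Image observations of the recognizable features 8, collected from many
    relative poses, are the kind of input from which calibration data of the
    reference object 7 can be adjusted photogrammetrically.
    """
    x = X_cam[:, 0] / X_cam[:, 2]
    y = X_cam[:, 1] / X_cam[:, 2]
    return np.stack([fx * x + cx, fy * y + cy], axis=1)
```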
  • the data used to generate the measurement data and to derive the calibration parameters can be different, identical, or partially identical. Furthermore, the chronological order in which the data are recorded is irrelevant.
  • the calibration parameters of the 3D digitizer 2 and/or the camera 4 can also be adjusted, provided that a scale-defining element 6 is detected by the 3D digitizer 2 and/or the camera 4 (see Luhmann, Robson, Kyle, Boehm, Close Range Photogrammetry and 3D Imaging, Second Edition, ISBN 978-3-1110-2986-3) or, if necessary, the calibration parameters of the 3D digitizer 2 and/or the camera 4 can be adjusted without a scale-defining element 6 being detected by the 3D digitizer 2 and/or the camera 4, provided that one of the robot links and thus one of the robot kinematics 1a, 1b, 1c itself is the scale-defining element 6.
  • scale-independent calibration parameters of the 3D digitizer 2 and/or the camera 4 can also be adjusted without a scale-defining element 6 being detected, provided that the scale-dependent parameters of the calibration of the 3D digitizer 2 and/or the camera 4 were adjusted using a scale-defining element.
  • The input parameters for the optimization process are the calibration parameters of the robot kinematics 1a, 1b, 1c obtained from the forward kinematics, the recorded data of the 3D digitizer 2 or of the camera 4, if necessary with an indication of whether these are to be used for generating the calibration data or for deriving the calibration parameters of the robot kinematics 1a, 1b, 1c, and, if required, calibration parameters of the 3D digitizer 2 or the camera 4.
  • Output parameters of the optimization process are the optimized calibration parameters of the robot kinematics 1a, 1b, 1c, the generated calibration data of the recognizable features 8 of the reference object 7 and, if required, the optimized calibration parameters of the 3D digitizer 2 or of the camera 4.
  • the robot kinematics 1a, 1b, 1c can be calibrated, the reference object 7 can be measured and the 3D digitizer 2 or the camera 4 can be calibrated.
  • P_LCS,ij contains the coordinates of the i-th recognizable feature 8 of the reference object 7 for the j-th pose of the robot kinematics 1a in the local reference coordinate system DCS of the 3D digitizer 2 or CCS of the camera 4.
  • The forward kinematics is described by the axis values J_j belonging to the j-th pose.
  • Axis values can be, for example, an angle or a displacement between structural components of the robot kinematics 1a.
  • The coordinates P_GCS,ij transformed into a global reference coordinate system, for example RCS, are determined by applying a transformation function F_p, which can be obtained, for example, by executing several transformations for the individual robot links in succession.
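As a hedged sketch of what such a transformation function F_p, composed of successive per-link transforms, could look like: a Denavit-Hartenberg parameterization is one common choice (an assumption for illustration, not prescribed by the application); appending the flange-to-digitizer transform reflects the statement that the forward kinematics already maps into the local reference coordinate system of the 3D digitizer or camera.

```python
import numpy as np

def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Homogeneous 4x4 transform of one link in the Denavit-Hartenberg convention."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(axis_values, dh_params, T_flange_digitizer):
    """Chain the per-link transforms for the axis values J_j of one pose.

    Assumes rotary axes (a prismatic axis would add its value to d instead).
    The final flange-to-digitizer transform is part of the chain, so the result
    maps the digitizer frame DCS directly into the global reference frame.
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(axis_values, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T @ T_flange_digitizer
```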
  • This results in an optimization problem (2), where Q_GCS,i(q) are the coordinates of the i-th recognizable feature of the recognizable features 8 of the reference object 7; these can also be adjusted using the parameters q.
  • the parameters q can, for example, contain the coordinates of the features 8 of the reference object 7.
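A minimal sketch of how optimization problem (2) could be set up as a least-squares adjustment over the robot calibration parameters p and the feature parameters q, here with scipy.optimize.least_squares; the residual form, the parameterization and all names are assumptions for illustration, not the exact formulation of the application.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, n_p, observations, forward_map):
    """Stacked residuals comparing transformed observations with the references.

    x            : concatenation of robot calibration parameters p and feature parameters q
    observations : list of (J_j, i, P_LCS_ij) tuples - axis values of the pose,
                   feature index, and feature coordinates in the local frame
    forward_map  : callable(p, J_j, P_LCS_ij) -> P_GCS_ij, i.e. the role of F_p
    """
    p, q = x[:n_p], x[n_p:].reshape(-1, 3)      # q: adjustable reference coordinates
    res = [forward_map(p, J_j, P_lcs) - q[i] for J_j, i, P_lcs in observations]
    return np.concatenate(res)

# result = least_squares(residuals, x0, args=(n_p, observations, forward_map))
# result.x would then contain the optimized calibration parameters p and the
# generated calibration data q of the recognizable features.
```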
  • The calibration parameters of the robot kinematics 1a, 1b, 1c can be, for example, geometric and/or elastic and/or thermal and/or calibration parameters of the at least one robot kinematics 1a, 1b, 1c, which take into account the configuration of the at least one robot kinematics 1a, 1b, 1c, and/or calibration parameters which take into account the direction of travel, and/or calibration parameters of the characteristic curve of, or calibration parameters for determining the error curve of, the length or angle measuring technology of the at least one robot kinematics 1a, 1b, 1c, and/or calibration parameters that characterize the play of individual or all axes of the at least one robot kinematics 1a, 1b, 1c, and/or calibration parameters that describe the dependence of the robot kinematics 1a, 1b, 1c on other variables that can be recorded, for example, with sensors or calculated based on the available data.
  • The distance and angular position of the two coordinate systems RCS1 and RCS2 or RCS3 and RCS4 relative to each other can also be regarded as a parameter of the robot kinematics 1a, 1b, 1c.
  • the method according to the invention is not limited to these calibration parameters and can be supplemented and combined with other parameters.
  • the calibration parameters resulting from the optimization procedure can be used to determine the pose of the end effector and/or to align the end effector based on the parameters.
  • the adjusted calibration parameters resulting from the optimization process can be used to position the end effector of the robot kinematics 1a, 1b, 1c more precisely. This is useful if the measurement of a measurement object 9 takes place after the calibration of the robot kinematics 1a, 1b, 1c.
  • the calibration parameters can also be used to precisely determine a pose of the end effector.
  • The 3D digitizer 2 can be brought into different poses for measuring the measuring object 9, the 3D digitizer 2 generating local 3D data in the reference coordinate system of the 3D digitizer 2, and the local 3D data generated for the various poses being transferred into a global coordinate system, for example RCS, based on the robot kinematics calibrated according to the method according to the invention.
  • The 3D data of a measuring object 9, of which a 3D digitizer 2 captures only partial areas 10 with its measuring volume 3, can thus be determined more accurately overall by precisely transferring the local 3D data into the global coordinate system RCS.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a method for calibrating a robot kinematics, in which, in a first step, a 3D digitizer and/or a camera are brought into different relative poses with respect to one another and with respect to a reference object by at least one robot kinematics, and in these poses data of recognizable features of the reference object are recorded by the 3D digitizer and/or the camera, and in which, in a second step, calibration data are generated from at least part of the recorded data, which relate the recorded features of the at least one reference object to one another in a coordinate system, and calibration parameters for the at least one robot kinematics are derived from at least part of the recorded data. The invention also relates to a robot kinematics and to a device for calibrating a robot kinematics with a coordinate measuring machine or as part of a coordinate measuring machine, as well as to a method for determining the 3D coordinates of a measurement object and a method for determining the position and orientation of a measurement object. The invention further relates to a computer program comprising instructions which, when the program is executed by at least one processor, cause this at least one processor and/or further processors to carry out the methods according to the invention.
PCT/EP2024/087018 2023-12-22 2024-12-18 Procédé et dispositif pour étalonner un robot, procédé pour déterminer les coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et l'orientation d'un objet v mesurer, robot et programme d'ordinateur Pending WO2025132520A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102023136450.2A DE102023136450A1 (de) 2023-12-22 2023-12-22 Verfahren und Vorrichtung zum Kalibrieren eines Roboters, Verfahren zum Bestimmen der 3D-Koordinaten eines Messobjekts, Verfahren zum Bestimmen der Position und Orientierung eines Messobjekts, Roboter und Computerprogramm
DE102023136450.2 2023-12-22

Publications (1)

Publication Number Publication Date
WO2025132520A1 true WO2025132520A1 (fr) 2025-06-26

Family

ID=94129677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/087018 Pending WO2025132520A1 (fr) 2023-12-22 2024-12-18 Procédé et dispositif pour étalonner un robot, procédé pour déterminer les coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et l'orientation d'un objet v mesurer, robot et programme d'ordinateur

Country Status (2)

Country Link
DE (1) DE102023136450A1 (fr)
WO (1) WO2025132520A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2489977B1 (fr) 2011-02-16 2018-11-21 Carl Zeiss Optotechnik GmbH Procédé destiné à la détermination des coordonnées 3-D d'un objets et d'étalonnage d'un robot industriel
EP3377948B1 (fr) 2015-11-16 2021-01-06 ABB Schweiz AG Facilitation de positionnement de robot

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"Dynamic Photogrammetric Calibration of Industrial Robots", SPIE'S 42ND ANNUAL MEETING, SAN DIEGO, 27.7. - 1.8., vol. 3174, 1997
LI WEI ET AL: "Simultaneous Robot-World and Hand-Eye Calibration without a Calibration Object", SENSORS, vol. 18, no. 11, 15 November 2018 (2018-11-15), CH, pages 3949, XP093266932, ISSN: 1424-8220, DOI: 10.3390/s18113949 *
NISHAD GOTHOSKAR ET AL: "Learning a generative model for robot control using visual feedback", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 10 March 2020 (2020-03-10), XP081618170 *
SEE ALSO LUHMANN, ROBSON, KYLE, BOEHM; NOCEDAL, WRIGHT: "Numerical Optimization" (cited for the solution of the optimization problem)
TABB AMY ET AL: "Solving the robot-world hand-eye(s) calibration problem with iterative methods", MACHINE VISION AND APPLICATIONS, SPRINGER VERLAG, DE, vol. 28, no. 5, 2 May 2017 (2017-05-02), pages 569 - 590, XP036288276, ISSN: 0932-8092, [retrieved on 20170502], DOI: 10.1007/S00138-017-0841-7 *
ZHUANG H ET AL: "ON VISION-BASED ROBOT CALIBRATION", SOUTHCON /94. CONFERENCE RECORD. ORLANDO, MAR. 29 - 31, 1994; [SOUTHCON], NEW YORK, IEEE, US, 29 March 1994 (1994-03-29), pages 104 - 109, XP000544398, ISBN: 978-0-7803-9989-1 *

Also Published As

Publication number Publication date
DE102023136450A1 (de) 2025-06-26

Similar Documents

Publication Publication Date Title
DE112011101730B4 (de) System und Verfahren zur robusten Kalibrierung zwischen einem Bildverarbeitungssystem und einem Roboter
EP2227356B1 (fr) Procédé et système destinés au positionnement très précis d'au moins un objet dans une position finale dans la pièce
EP2435217B1 (fr) Procédé et système destinés au positionnement très précis d'au moins un objet dans une position finale dans l' espace
EP1604789A2 (fr) Méthode et dispositif pour améliorer la précision du positionnement d'un manipulateur
DE102016114337A1 (de) System und verfahren zum verknüpfen von koordinatenräumen maschinellen sehens in einer umgebung angeleiteten zusammenbaus
EP1285224A1 (fr) Procede et dispositif pour determiner la forme en trois dimensions d'un objet
EP2269783A1 (fr) Procédé de calibrage pour un système de mesure
DE112019005484T5 (de) Automatische Kalibrierung für ein Kamera-Robotersystem mit Werkzeugoffsets
DE202019105838U1 (de) Anordnung mit einem Koordinatenmessgerät oder Mikroskop
DE102021209178A1 (de) Verfahren und Vorrichtung zum Bestimmen von Relativposen und zur Kalibrierung bei einem Koordinatenmessgerät oder Roboter
EP3491367B1 (fr) Procédé et dispositif d'étalonnage d'un système radiographique
EP3418680A1 (fr) Système et procédé de mesure de positionnement
DE102019102927B4 (de) Verfahren und Vorrichtung zum Bestimmen von dimensionalen und/oder geometrischen Eigenschaften eines Messobjekts
WO2009018894A1 (fr) Procédé et dispositif de détermination de données géométriques d'un objet mesuré
EP1915239B1 (fr) Procédé pour générer l'image d'un environnement
EP4225537A1 (fr) Procédé d'étalonnage pour l'étalonnage automatisé d'une caméra par rapport à un robot médical, et système d'assistance chirurgicale
WO2025132520A1 (fr) Procédé et dispositif pour étalonner un robot, procédé pour déterminer les coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et l'orientation d'un objet v mesurer, robot et programme d'ordinateur
EP1471401B1 (fr) Procédé pour mesurer le système de coordonnées d'une camera robot par rapport au système de coordonnées du robot ou vice versa
WO2025132540A1 (fr) Procédé et dispositif pour étalonner un robot, procédé pour déterminer des coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et de l'orientation d'un objet à mesurer, robot et programme informatique
WO2025132505A1 (fr) Procédé et dispositif pour étalonner un robot, procédé pour déterminer les coordonnées 3d d'un objet à mesurer, procédé pour déterminer la position et l'orientation d'un objet à mesurer, robot et programme informatique
DE102023105674B4 (de) Verfahren und Anordnung zur Kompensation nicht-geometrischer Fehlereinflüsse auf eine Roboterabsolutgenauigkeit mittels eines Laser-Sensor-Systems
DE102019108426A1 (de) Vorrichtung und Verfahren zum dreidimensionalen Erfassen wenigstens eines Objekts
WO2017207364A1 (fr) Dispositif de mesure d'objets
DE102024112958A1 (de) Optische Messeinrichtung und Verfahren zur optischen Vermessung von Messobjekten
DE102017107593B4 (de) Verfahren zum Bestimmen unbekannter Transformationen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24829518

Country of ref document: EP

Kind code of ref document: A1