
CN112603542A - Hand-eye calibration method and device, electronic equipment and storage medium - Google Patents

Hand-eye calibration method and device, electronic equipment and storage medium

Info

Publication number: CN112603542A
Application number: CN202011432088.6A
Authority: CN (China)
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112603542B
Inventors: 王利峰, 刘洪澎, 沈晨, 孙贝
Current Assignee: Yake Wisdom Beijing Technology Co ltd
Original Assignee: Yake Wisdom Beijing Technology Co ltd
Application filed by Yake Wisdom Beijing Technology Co ltd
Priority to CN202011432088.6A
Publication of CN112603542A; application granted; publication of CN112603542B

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/77: Manipulators with motion or force scaling

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a hand-eye calibration method and device, an electronic device and a storage medium. The method comprises the following steps: collecting a rotation posture set for each end tool coordinate axis, wherein the rotation posture set comprises the position and posture information of a visual marker at each rotation angle while the robot rotates around the corresponding end tool coordinate axis, the visual marker being mounted at the end of the robot; determining, based on the rotation posture set of each end tool coordinate axis, the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system; and determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system. With the method, device, electronic device and storage medium provided by the invention, the calibration procedure is simple to operate, requires no complex computation, and facilitates reliable tracking of the end marker.

Description

Hand-eye calibration method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of robotics, and in particular to a hand-eye calibration method and apparatus, an electronic device, and a storage medium.
Background
A surgical navigation system accurately registers a patient's preoperative or intraoperative image data to the patient's anatomy on the operating table, tracks the surgical instrument at the end of the robot in real time during the operation, and continuously updates and displays the instrument's position on the patient image as a virtual instrument, so that the surgeon can clearly see where the surgical instrument is relative to the patient's anatomy, making the operation faster, more accurate and safer.
To track a surgical instrument mounted at the end of a robot, a visual marker is usually fixed on the instrument, and a visual navigation instrument detects the position and posture of the marker, from which the position of the instrument relative to the patient is estimated. To perform robot-guided navigation accurately, a series of calibration steps must be carried out on the surgical instrument and the robot before the operation to determine the transformation relations between the reference coordinate systems. The key step is determining the position and posture of the visual marker coordinate system relative to the robot end tool coordinate system, a problem known in robotics as "hand-eye calibration".
Existing hand-eye calibration methods usually require solving a complex matrix equation or iteratively performing a nonlinear optimization, which entails high computational complexity and a cumbersome calibration procedure.
Disclosure of Invention
The invention provides a hand-eye calibration method and device, an electronic device and a storage medium, in order to overcome the drawbacks of the prior art, namely the heavy computation and cumbersome procedure of existing hand-eye calibration.
The invention provides a hand-eye calibration method, comprising the following steps:
collecting a rotation posture set for each end tool coordinate axis, wherein the rotation posture set comprises the position and posture information of a visual marker at each rotation angle while the robot rotates around the corresponding end tool coordinate axis, the visual marker being mounted at the end of the robot;
determining, based on the rotation posture set of each end tool coordinate axis, the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system; and
determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system.
According to the hand-eye calibration method provided by the invention, determining the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system based on the rotation posture set of each end tool coordinate axis comprises:
determining multiple groups of rotation posture pairs based on the rotation posture set of any end tool coordinate axis, each group of rotation posture pairs comprising the position and posture information at two different rotation angles;
determining the rotation axis direction of each group of rotation posture pairs and the rotation axis projection point position of each group of rotation posture pairs based on the relative position relation of each group of rotation posture pairs; and
determining the coordinate axis straight line of the end tool coordinate axis in the visual marker coordinate system based on the rotation axis direction and the rotation axis projection point position of each group of rotation posture pairs.
According to the hand-eye calibration method provided by the invention, determining the rotation axis direction of each group of rotation posture pairs based on the relative position relation of each group of rotation posture pairs comprises:
determining the rotation axis direction of each group of rotation posture pairs based on the following formula:

v = (q_x, q_y, q_z) / sin(θ/2)

where v is the rotation axis direction vector of the rotation posture pair, θ is the rotation angle of the rotation posture pair, θ = 2·arccos(q_w), and (q_w, q_x, q_y, q_z) is the quaternion representing the relative position relation of the rotation posture pair.
According to the hand-eye calibration method provided by the invention, determining the rotation axis projection point position of each group of rotation posture pairs comprises:
determining the origin distance between the two pieces of position and posture information in each group of rotation posture pairs based on the translation part of the relative position relation of that group;
determining the distance from each group of rotation posture pairs to the rotation axis based on that origin distance and the rotation angle of the group;
determining the direction from each group of rotation posture pairs to the rotation axis based on the rotation direction and the rotation axis direction of the group; and
determining the rotation axis projection point position of each group of rotation posture pairs based on the distance and direction from the group to the rotation axis.
According to the hand-eye calibration method provided by the invention, determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system comprises:
orthogonalizing the matrix formed by the direction vectors of the coordinate axis straight lines to obtain a rotation matrix;
determining the origin coordinates based on the midpoints of the common perpendiculars between the coordinate axis straight lines; and
determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the rotation matrix and the origin coordinates.
According to the hand-eye calibration method provided by the invention, after determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system, the method further comprises:
determining the current position and posture information of the visual marker; and
determining the relative position relation between the visual navigation instrument and the robot base coordinate system based on the transformation relation between the end tool coordinate system and the visual marker coordinate system, the transformation relation between the end tool coordinate system and the robot base coordinate system, and the current position and posture information of the visual marker.
The invention also provides a hand-eye calibration device, comprising:
a posture collection unit for collecting a rotation posture set for each end tool coordinate axis, wherein the rotation posture set comprises the position and posture information of a visual marker at each rotation angle while the robot rotates around the corresponding end tool coordinate axis, the visual marker being mounted at the end of the robot;
a coordinate axis determination unit for determining, based on the rotation posture set of each end tool coordinate axis, the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system; and
a calibration unit for determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system.
The invention further provides an electronic device, which includes a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of any of the above-mentioned hand-eye calibration methods when executing the computer program.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the hand-eye calibration method as described in any one of the above.
With the hand-eye calibration method and device, the electronic device and the storage medium provided by the invention, the transformation relation between the end tool coordinate system and the visual marker coordinate system can be calibrated using only the position and posture information of the visual marker at each rotation angle while the robot rotates around each end tool coordinate axis; the calibration procedure is simple to operate, requires no complex computation, and facilitates reliable tracking of the end marker.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic view of a surgical navigation system provided by the present invention;
FIG. 2 is a schematic flow chart of a hand-eye calibration method provided by the present invention;
FIG. 3 is a schematic diagram of a set of rotational poses provided by the present invention;
FIG. 4 is a schematic diagram of the position calculation of the projection point of the rotating shaft according to the present invention;
FIG. 5 is a schematic diagram of the origin calibration provided by the present invention;
FIG. 6 is a schematic structural diagram of a hand-eye calibration device provided in the present invention;
FIG. 7 is a schematic structural diagram of an electronic device according to the present invention;
reference numerals:
1-a visual navigation instrument; 2-visual marking; 3-robot.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a schematic view of the surgical navigation system provided by the present invention. As shown in Fig. 1, the visual marker 2 can be firmly and stably fixed to the end of the robot 3, for example by a screw connection, and the visual navigation instrument 1 acquires and tracks the position and posture information of the visual marker in real time, so that the position of the surgical instrument mounted on the end of the robot 3 can be displayed on the virtual image through coordinate transformation. The hand-eye calibration performed by the present invention determines the transformation relation between the coordinate system corresponding to the visual marker 2, i.e. the visual marker coordinate system, and the coordinate system corresponding to the end tool mounting position of the robot 3, i.e. the end tool coordinate system.
Fig. 2 is a schematic flow chart of a hand-eye calibration method provided by the present invention, and as shown in fig. 2, the method includes:
Step 210: collect a rotation posture set for each end tool coordinate axis, wherein the rotation posture set comprises the position and posture information of a visual marker at each rotation angle while the robot rotates around the corresponding end tool coordinate axis, the visual marker being mounted at the end of the robot.
Specifically, the end tool coordinate axes are the coordinate axes of the end tool coordinate system; in three-dimensional space they are the X, Y and Z axes of that system. During hand-eye calibration, the robot is controlled to rotate by a certain angle, for example 30°, around the X, Y and Z axes of the end tool coordinate system in turn. Throughout this process the visual navigation instrument stably and accurately captures the visual marker at the robot end, thereby obtaining the position and posture information of the visual marker at each rotation angle when the X, Y and Z axes of the end tool coordinate system are used as rotation axes respectively.
Taking the rotation of the robot around the X axis of the end tool coordinate system as an example, Fig. 3 is a schematic diagram of the rotation posture set provided by the present invention. As shown in Fig. 3, the robot is first controlled to rotate by -15° around the X axis from a suitable initial position to reach a set starting position, then rotated by 30° around the X axis from that position to reach an end position, and finally returned to the initial position. While the robot moves from the starting position to the end position, the visual navigation instrument records the position and posture information of the robot end visual marker n times (n is an even number; if the number of recordings is odd, the last sample is discarded so that n is even) at a set time interval (for example 20 ms). The series of positions and postures acquired by the visual navigation instrument, i.e. the rotation posture set of the X axis, can be expressed as homogeneous matrices {T_i}, i = 1 … n, where i denotes the acquisition order. It should be noted that the starting and end positions are chosen such that the visual navigation instrument can capture the whole rotation process without occlusion.
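For illustration only (not part of the original disclosure), the rotation posture set can be held as a list of 4x4 homogeneous matrices; the callback name, sample count and even-n handling below are hypothetical:

```python
import numpy as np

def collect_rotation_posture_set(read_marker_pose, n_samples=20):
    """Collect marker poses (4x4 homogeneous matrices) while the robot rotates about one axis.

    read_marker_pose is a hypothetical callback returning the current 4x4 marker pose
    reported by the visual navigation instrument.
    """
    poses = [np.asarray(read_marker_pose(), dtype=float) for _ in range(n_samples)]
    if len(poses) % 2 == 1:       # the method requires an even number of samples
        poses = poses[:-1]
    return poses                  # {T_i}, i = 1..n
```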
Step 220: determine, based on the rotation posture set of each end tool coordinate axis, the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system.
Step 230: determine the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system.
Specifically, for any end tool coordinate axis, the coordinate axis straight line of that axis in the visual marker coordinate system can be calculated from its rotation posture set, i.e. from the position and posture information at each rotation angle during rotation about that axis, the direction of the rotation axis, and the distance between the visual marker and the rotation axis. Once the coordinate axis straight lines of all end tool coordinate axes in the visual marker coordinate system are obtained, the transformation relation between the end tool coordinate system and the visual marker coordinate system can be determined.
With the method provided by this embodiment of the invention, the transformation relation between the end tool coordinate system and the visual marker coordinate system can be calibrated using only the position and posture information of the visual marker at each rotation angle while the robot rotates around each end tool coordinate axis; the calibration procedure is simple to operate, requires no complex computation, and facilitates reliable tracking of the end marker.
Based on the above embodiment, step 120 includes:
determining multiple groups of rotation posture pairs based on the rotation posture set of any end tool coordinate axis, each group of rotation posture pairs comprising the position and posture information at two different rotation angles;
determining the rotation axis direction of each group of rotation posture pairs and the rotation axis projection point position of each group of rotation posture pairs based on the relative position relation of each group of rotation posture pairs; and
determining the coordinate axis straight line of the end tool coordinate axis in the visual marker coordinate system based on the rotation axis direction and the rotation axis projection point position of each group of rotation posture pairs.
Specifically, the rotation posture set of any end tool coordinate axis contains the position and posture information of the visual marker at each rotation angle. The position and posture information at two rotation angles can be combined into a rotation posture pair, which represents the relative position relation of the visual marker between those two rotation angles during the dynamic rotation. The relative position relation here is the rotation and translation that transform one piece of position and posture information into the other.
After the position and posture information at the rotation angles has been grouped, multiple groups of rotation posture pairs are obtained. The corresponding rotation axis direction and rotation axis projection point position can be determined from the relative position relation of each group, and the straight line on which the rotation axis lies is obtained by combining that direction and point. Because the robot rotates around the end tool coordinate axis, the rotation axis here is the end tool coordinate axis itself, so the straight line on which the end tool coordinate axis lies is obtained directly.
Based on any of the above embodiments, in step 120, determining multiple groups of rotation posture pairs based on the rotation posture set of any end tool coordinate axis comprises:
Considering that when two pieces of position and posture information are too close together, factors such as robot motion may introduce a large error, the pairing interval can be determined from the number of pieces of position and posture information in the rotation posture set, and the position and posture information is then selected at that interval to form rotation posture pairs. If the rotation posture set contains n pieces of position and posture information, n/2 can be used as the pairing interval, i.e. the homogeneous matrices T_k and T_(k+n/2) are selected from the rotation posture set as a group of rotation posture pairs, where k = 1, 2, …, n/2. For example, if data is collected 10 times during the rotation, 5 groups of rotation posture pairs can be determined, namely T_1 and T_6, T_2 and T_7, …, T_5 and T_10.
Based on any of the above embodiments, in step 120, determining the rotation axis direction of each group of rotation posture pairs based on the relative position relation of each group of rotation posture pairs comprises:
determining the rotation axis direction of each group of rotation posture pairs based on the following formula:

v = (q_x, q_y, q_z) / sin(θ/2)

where v is the rotation axis direction vector of the rotation posture pair, θ is the rotation angle of the rotation posture pair, θ = 2·arccos(q_w), and (q_w, q_x, q_y, q_z) is the quaternion representing the relative position relation of the rotation posture pair.
Further, for a group of rotation posture pairs T_k and T_(k+n/2), the relative position relation of the pair can be obtained by the following formula:

M = T_k^(-1) · T_(k+n/2)

where M represents the relative position relation between the two acquired visual marker coordinate systems, i.e. the pose of the marker frame at acquisition k+n/2 expressed in the marker frame at acquisition k.
On this basis, standard computations from robotics can be used to convert the rotation part of the homogeneous matrix M into the quaternion (q_w, q_x, q_y, q_z), from which the rotation angle θ and the rotation axis direction v of the rotation posture pair are calculated.
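A sketch of this computation (illustrative, not the patent's code; the function names are hypothetical and q_w > 0 is assumed, which holds for the 30° rotations used here):

```python
import numpy as np

def relative_pose(T_k, T_k2):
    """Pose of the marker frame at sample k+n/2 expressed in the marker frame at sample k."""
    return np.linalg.inv(T_k) @ T_k2

def rotation_axis_and_angle(M):
    """Quaternion-based axis and angle of the rotation part of M, as in the formula above."""
    R = M[:3, :3]
    q_w = 0.5 * np.sqrt(max(1.0 + np.trace(R), 0.0))        # valid for rotation angles below 180 deg
    q_xyz = np.array([R[2, 1] - R[1, 2],
                      R[0, 2] - R[2, 0],
                      R[1, 0] - R[0, 1]]) / (4.0 * q_w)
    theta = 2.0 * np.arccos(np.clip(q_w, -1.0, 1.0))
    v = q_xyz / np.sin(theta / 2.0)                          # rotation axis direction vector
    return v / np.linalg.norm(v), theta
```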
Based on any of the above embodiments, in step 120, determining the rotation axis projection point position of each group of rotation posture pairs comprises:
determining the origin distance between the two pieces of position and posture information in each group of rotation posture pairs based on the translation part of the relative position relation of that group;
determining the distance from each group of rotation posture pairs to the rotation axis based on that origin distance and the rotation angle of the group;
determining the direction from each group of rotation posture pairs to the rotation axis based on the rotation direction and the rotation axis direction of the group; and
determining the rotation axis projection point position of each group of rotation posture pairs based on the distance and direction from the group to the rotation axis.
Specifically, for any group of rotation posture pairs, the relative position relation can be split into a translation part and a rotation part: the fourth column of the matrix M contains the translation components Δx, Δy and Δz, from which the origin distance between the two pieces of position and posture information in the pair can be calculated. The origin distance here is the distance between the origins of the visual marker coordinate systems corresponding to the two pieces of position and posture information in the rotation posture pair.
Fig. 4 is a schematic diagram of the rotation axis projection point calculation provided by the present invention. As shown in Fig. 4, for a group of rotation posture pairs T_k and T_(k+n/2), the origin distance, i.e. the length of the straight line segment connecting the origins of T_k and T_(k+n/2) in the figure, can be expressed as

s = sqrt(Δx² + Δy² + Δz²)

On this basis, combining the previously obtained rotation angle θ of the rotation posture pair T_k and T_(k+n/2) with trigonometry, the distance between the rotation posture pair and the rotation axis, i.e. the distance d in the figure, can be calculated as

d = s / (2·sin(θ/2))

The direction from the rotation posture pair to the rotation axis can be understood as the direction of the line along which the distance d is measured. It is determined by the rotation direction of the pair, i.e. the unit vector along the chord (Δx, Δy, Δz)/s, cross-multiplied with the rotation axis direction v.
Once the direction from the rotation posture pair to the rotation axis and the distance d are obtained, the rotation axis projection point position of the pair can be calculated; it corresponds to the projection onto the rotation axis of the midpoint of the line segment connecting the coordinate origins of T_k and T_(k+n/2).
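A possible implementation of this step (a sketch under the assumptions stated in the comments, not the patent's code):

```python
import numpy as np

def axis_point_from_pair(M, v, theta):
    """Project the chord midpoint onto the rotation axis, in the marker frame of T_k.

    Assumes v and theta come from rotation_axis_and_angle() above (right-hand rule,
    0 < theta < pi) and that the relative motion is approximately a pure rotation
    about the tool coordinate axis.
    """
    t = M[:3, 3]                                  # translation (dx, dy, dz)
    s = np.linalg.norm(t)                         # origin distance (chord length)
    d = s / (2.0 * np.sin(theta / 2.0))           # distance from the pair to the rotation axis
    u = t / s                                     # chord (rotation) direction
    n_hat = np.cross(v, u)                        # points from the chord towards the axis
    n_hat /= np.linalg.norm(n_hat)
    midpoint = t / 2.0                            # the origin of T_k is at 0 in its own frame
    return midpoint + d * np.cos(theta / 2.0) * n_hat
```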
Based on any of the above embodiments, in step 120, determining the coordinate axis straight line of the end tool coordinate axis in the visual marker coordinate system based on the rotation axis direction and the rotation axis projection point position of each group of rotation posture pairs comprises:
for the rotation axis direction and rotation axis projection point position of any group of rotation posture pairs, a rotation axis straight line determined by that pair is obtained. To reduce random calibration errors, the rotation axis straight lines determined by all groups of rotation posture pairs can be averaged to give the coordinate axis straight line of the end tool coordinate axis in the visual marker coordinate system. Taking the X axis as an example, this line can be expressed as a direction vector v_x and a point p_x on it.
In the same way, the direction vector v_y of the Y axis and a point p_y on it, and the direction vector v_z of the Z axis and a point p_z on it, can be obtained.
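One way this averaging could look (a sketch; the sign alignment of the direction vectors is an added assumption, not stated in the patent):

```python
import numpy as np

def average_axis_line(directions, points):
    """Average per-pair axis estimates into one line (direction v, point p)."""
    directions = np.asarray(directions, dtype=float)
    points = np.asarray(points, dtype=float)
    # Flip any direction that points opposite to the first one before averaging,
    # so nearly antiparallel estimates do not cancel out.
    signs = np.sign(directions @ directions[0])
    v = (directions * signs[:, None]).mean(axis=0)
    v /= np.linalg.norm(v)
    p = points.mean(axis=0)
    return v, p
```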
Based on any of the above embodiments, step 130 includes:
performing orthogonalization on a matrix formed by direction vectors of straight lines of all coordinate axes to obtain a rotation matrix;
determining an origin coordinate based on the midpoints of the common perpendicular lines between the straight lines of the coordinate axes;
based on the rotation matrix and the origin coordinates, a transformation relationship between the end tool coordinate system and the visual marker coordinate system is determined.
Specifically, the direction vectors v_x, v_y and v_z of the coordinate axis straight lines can be assembled into a matrix R', which, because of errors, may not be orthogonal. To account for this, the matrix R' must be orthogonalized; the orthogonalization can be realized by QR decomposition or singular value decomposition of the matrix.
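A sketch of the singular-value-decomposition variant (illustrative only):

```python
import numpy as np

def orthogonalize(v_x, v_y, v_z):
    """Nearest rotation matrix to R' = [v_x v_y v_z] via singular value decomposition."""
    R_prime = np.column_stack([v_x, v_y, v_z])
    U, _, Vt = np.linalg.svd(R_prime)
    R = U @ Vt
    if np.linalg.det(R) < 0:          # keep a proper rotation (det = +1)
        U[:, -1] *= -1
        R = U @ Vt
    return R
```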
In addition, although the origin position of the end tool coordinate system in the visual marker coordinate system could in theory be determined by intersecting the coordinate axis straight lines, the three lines may fail to intersect because of the various errors, i.e. they may be skew lines. In that case, the origin coordinates can be determined from the midpoints of the common perpendiculars between the coordinate axis straight lines.
Fig. 5 is a schematic diagram of the origin calibration provided by the present invention. As shown in Fig. 5, taking the X-axis and Y-axis straight lines l_x and l_y as an example, the midpoint o_xy of the common perpendicular of these two skew lines can be calculated; similarly, the midpoint o_xz of the common perpendicular of the X-axis and Z-axis lines and the midpoint o_yz of the common perpendicular of the Y-axis and Z-axis lines are calculated, and the average o = (o_xy + o_xz + o_yz)/3 is taken as the origin of the end tool coordinate system referenced to the visual marker coordinate system.
After the rotation matrix and the origin coordinates are obtained, the transformation relation T between the end tool coordinate system and the visual marker coordinate system is

T = [ R  o ]
    [ 0  1 ]

where R is the rotation matrix obtained by orthogonalizing R', and o is the origin coordinate.
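The common-perpendicular midpoints and origin averaging just described might look like this (a sketch; the closest-point formulas are the standard skew-line result, not taken from the patent text):

```python
import numpy as np

def common_perpendicular_midpoint(p1, d1, p2, d2):
    """Midpoint of the common perpendicular between lines p1 + t*d1 and p2 + s*d2."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                 # near zero only if the lines are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0

def tool_origin(lines):
    """lines = {'x': (v_x, p_x), 'y': (v_y, p_y), 'z': (v_z, p_z)} in the marker frame."""
    o_xy = common_perpendicular_midpoint(lines['x'][1], lines['x'][0], lines['y'][1], lines['y'][0])
    o_xz = common_perpendicular_midpoint(lines['x'][1], lines['x'][0], lines['z'][1], lines['z'][0])
    o_yz = common_perpendicular_midpoint(lines['y'][1], lines['y'][0], lines['z'][1], lines['z'][0])
    return (o_xy + o_xz + o_yz) / 3.0     # origin o of the end tool frame in the marker frame
```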
Based on any of the above embodiments, after step 130 the method further comprises:
determining the current position and posture information of the visual marker; and
determining the relative position relation between the visual navigation instrument and the robot base coordinate system based on the transformation relation between the end tool coordinate system and the visual marker coordinate system, the transformation relation between the end tool coordinate system and the robot base coordinate system, and the current position and posture information of the visual marker.
Specifically, steps 110 to 130 yield the transformation relation between the end tool coordinate system and the visual marker coordinate system, denoted here T_ME (the pose of the end tool frame expressed in the visual marker frame). The transformation relation between the end tool coordinate system and the robot base coordinate system, denoted T_BE, is obtained from the forward kinematics of the robot. During surgical navigation, the position and posture information of the visual marker under the visual navigation instrument at the current moment, i.e. the current position and posture information T_CM, can be acquired in real time, and the relative position relation between the visual navigation instrument and the robot base coordinate system can then be calculated in real time:

T_BC = T_BE · (T_ME)^(-1) · (T_CM)^(-1)

In the above formula, T_BE is a matrix determined from the forward kinematics of the robot. The visual marker is usually fixed firmly and stably to the robot end, for example by a screw connection, so the matrix T_ME representing the relative position of the visual marker and the end tool coordinate system is also fixed. The visual navigation instrument, however, is usually mounted on a movable base, is placed at a suitable position in the operating area during use, and may be re-positioned when the visual marker moves out of its field of view, so its position and posture relative to the robot base coordinate system can change. Calculating the matrix T_BC in real time therefore gives an accurate mapping between the visual coordinate system and the robot coordinate system and avoids the influence of adjusting the visual navigation instrument.
In addition, another issue in navigated surgery is that, with the visual navigation instrument fixed, the visual marker at the robot end may temporarily not be tracked by the navigation instrument because of its posture or because it is occluded. In that case, the position and posture of the visual marker relative to the robot base coordinate system can be calculated from the calibrated matrix T_ME and the matrix T_BE obtained from the forward kinematics of the robot, as T_BM = T_BE · (T_ME)^(-1), and the transformation matrix T_BC can then be used to express the visual marker consistently in both the robot base coordinate system and the visual coordinate system.
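Putting the two relations above together (an illustrative sketch using the notation introduced above, not the patent's code):

```python
import numpy as np

def navigator_in_base(T_BE, T_ME, T_CM):
    """T_BC: pose of the visual navigation instrument in the robot base frame."""
    return T_BE @ np.linalg.inv(T_ME) @ np.linalg.inv(T_CM)

def marker_in_base(T_BE, T_ME):
    """T_BM: marker pose from forward kinematics alone, usable when the marker is occluded."""
    return T_BE @ np.linalg.inv(T_ME)
```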
Based on any of the above embodiments, fig. 6 is a schematic structural diagram of the hand-eye calibration device provided by the present invention, as shown in fig. 6, the device includes:
the attitude acquisition unit 610 is used for acquiring a rotation attitude set of each end tool coordinate axis, wherein the rotation attitude set comprises position attitude information of a visual mark at each rotation angle in the process that the robot rotates around the corresponding end tool coordinate axis, and the visual mark is arranged at the tail end of the robot;
a coordinate axis determining unit 620, configured to determine, based on the rotation posture set of each end tool coordinate axis, a coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system;
and a calibration unit 630, configured to determine a transformation relationship between the end tool coordinate system and the visual marker coordinate system based on coordinate axis straight lines of each end tool coordinate axis under the visual marker coordinate system.
According to the device provided by the embodiment of the invention, the transformation relation between the terminal tool coordinate system and the visual mark coordinate system can be calibrated only according to the position and posture information of the visual mark at each rotation angle in the process that the robot rotates around each terminal tool coordinate axis, the calibration process is simple to operate, complex calculation is not needed, and reliable terminal mark tracking is facilitated.
Based on any one of the above embodiments, the coordinate axis determination unit includes:
the grouping subunit is used for determining a plurality of groups of rotation posture pairs based on the rotation posture set of any terminal tool coordinate axis, and each group of rotation posture pairs comprises position posture information of two different rotation angles;
the calculating subunit is used for determining the rotating shaft direction of each group of rotating posture pairs and determining the rotating shaft projection point position of each group of rotating posture pairs based on the relative position relation of each group of rotating posture pairs;
and the axis determining subunit is used for determining a coordinate axis straight line of any terminal tool coordinate axis under the visual marking coordinate system based on the rotating axis direction and the rotating axis projection point position of each group of rotating posture pairs.
Based on any of the above embodiments, the calculation subunit is configured to:
determining the rotation axis direction of each group of rotation posture pairs based on the following formula:

v = (q_x, q_y, q_z) / sin(θ/2)

where v is the rotation axis direction vector of the rotation posture pair, θ is the rotation angle of the rotation posture pair, θ = 2·arccos(q_w), and (q_w, q_x, q_y, q_z) is the quaternion representing the relative position relation of the rotation posture pair.
Based on any of the above embodiments, the calculation subunit is further configured to:
determining the original point distance of two position posture information in each group of rotation posture pair based on the translation relation in the relative position relation of each group of rotation posture pair;
determining the distance between each group of rotation attitude pairs and the rotation axis based on the original point distance of two pieces of position attitude information in each group of rotation attitude pairs and the rotation angle of each group of rotation attitude pairs;
determining the direction from each group of rotation attitude pairs to the rotation axis based on the rotation direction of each group of rotation attitude pairs and the rotation axis direction;
and determining the position of the rotating shaft projection point of each group of rotating posture pairs based on the distance and the direction from each group of rotating posture pairs to the rotating shaft.
Based on any of the above embodiments, the calibration unit is configured to:
performing orthogonalization on a matrix formed by direction vectors of straight lines of all coordinate axes to obtain a rotation matrix;
determining an origin coordinate based on the midpoints of the common perpendicular lines between the straight lines of the coordinate axes;
based on the rotation matrix and the origin coordinates, a transformation relationship between the end tool coordinate system and the visual marker coordinate system is determined.
Based on any of the above embodiments, the apparatus further comprises a base coordinate calibration unit, configured to:
determining current pose information of the visual marker;
and determining the relative position relation between the visual navigation instrument and the robot base coordinate system based on the transformation relation between the end tool coordinate system and the visual mark coordinate system, the transformation relation between the end tool coordinate system and the robot base coordinate system and the current posture information of the visual mark.
Fig. 7 is a schematic structural diagram of an electronic device provided by the present invention. As shown in Fig. 7, the electronic device may include a processor 710, a communications interface 720, a memory 730 and a communication bus 740, where the processor 710, the communications interface 720 and the memory 730 communicate with one another via the communication bus 740. The processor 710 may call logic instructions in the memory 730 to perform the following method: collecting a rotation posture set for each end tool coordinate axis, wherein the rotation posture set comprises the position and posture information of a visual marker at each rotation angle while the robot rotates around the corresponding end tool coordinate axis, the visual marker being mounted at the end of the robot; determining, based on the rotation posture set of each end tool coordinate axis, the coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system; and determining the transformation relation between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the end tool coordinate axes in the visual marker coordinate system.
In addition, the logic commands in the memory 730 can be implemented in the form of software functional units and stored in a computer readable storage medium when the logic commands are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes a plurality of commands for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Embodiments of the present invention further provide a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program is implemented to perform the method provided in the foregoing embodiments when executed by a processor, and the method includes: acquiring a rotation posture set of each terminal tool coordinate axis, wherein the rotation posture set comprises position posture information of a visual mark at each rotation angle in the process that the robot rotates around the corresponding terminal tool coordinate axis, and the visual mark is arranged at the tail end of the robot; determining coordinate axis straight lines of the coordinate axes of the end tools under a visual marking coordinate system based on the rotation posture set of the coordinate axes of the end tools; and determining a transformation relation between the coordinate system of the end tool and the coordinate system of the visual mark based on the coordinate axis straight lines of the coordinate axes of the end tools under the coordinate system of the visual mark.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes commands for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A hand-eye calibration method is characterized by comprising the following steps:
acquiring a rotation posture set of each terminal tool coordinate axis, wherein the rotation posture set comprises position posture information of a visual mark at each rotation angle in the process that the robot rotates around the corresponding terminal tool coordinate axis, and the visual mark is arranged at the tail end of the robot;
determining coordinate axis straight lines of the coordinate axes of the end tools under a visual marking coordinate system based on the rotation posture set of the coordinate axes of the end tools;
and determining a transformation relation between the coordinate system of the end tool and the coordinate system of the visual mark based on the coordinate axis straight lines of the coordinate axes of the end tools under the coordinate system of the visual mark.
2. The hand-eye calibration method according to claim 1, wherein determining a coordinate axis straight line of each end tool coordinate axis in the visual marker coordinate system based on the rotation gesture set of each end tool coordinate axis comprises:
determining a plurality of groups of rotation posture pairs based on a rotation posture set of any terminal tool coordinate axis, wherein each group of rotation posture pairs comprises position posture information of two different rotation angles;
determining the rotating shaft direction of each group of rotating posture pairs and determining the rotating shaft projection point position of each group of rotating posture pairs based on the relative position relation of each group of rotating posture pairs;
and determining a coordinate axis straight line of any terminal tool coordinate axis under the visual marking coordinate system based on the rotating axis direction and the rotating axis projection point position of each group of rotating posture pairs.
3. The hand-eye calibration method according to claim 2, wherein the determining the rotation axis direction of each group of rotation attitude pairs based on the relative position relationship of each group of rotation attitude pairs comprises:
determining the rotation axis direction of each group of rotation posture pairs based on the following formula:

v = (q_x, q_y, q_z) / sin(θ/2)

where v is the rotation axis direction vector of the rotation posture pair, θ is the rotation angle of the rotation posture pair, θ = 2·arccos(q_w), and (q_w, q_x, q_y, q_z) is the quaternion representing the relative position relation of the rotation posture pair.
4. A hand-eye calibration method according to claim 3, wherein the determining the positions of the projection points of the rotation axes of the sets of rotation posture pairs comprises:
determining the original point distance of two position posture information in each group of rotation posture pair based on the translation relation in the relative position relation of each group of rotation posture pair;
determining the distance between each group of rotation attitude pairs and the rotation axis based on the original point distance of two pieces of position attitude information in each group of rotation attitude pairs and the rotation angle of each group of rotation attitude pairs;
determining the direction from each group of rotation attitude pairs to the rotation axis based on the rotation direction of each group of rotation attitude pairs and the rotation axis direction;
and determining the position of the rotating shaft projection point of each group of rotating posture pairs based on the distance and the direction from each group of rotating posture pairs to the rotating shaft.
5. The hand-eye calibration method according to any one of claims 1 to 4, wherein the determining a transformation relationship between the end tool coordinate system and the visual marker coordinate system based on the coordinate axis straight lines of the coordinate axes of the end tools under the visual marker coordinate system comprises:
performing orthogonalization on a matrix formed by direction vectors of straight lines of all coordinate axes to obtain a rotation matrix;
determining an origin coordinate based on the midpoints of the common perpendicular lines between the straight lines of the coordinate axes;
based on the rotation matrix and the origin coordinates, a transformation relationship between the end tool coordinate system and the visual marker coordinate system is determined.
6. The hand-eye calibration method according to any one of claims 1 to 4, wherein the determining of the transformation relationship between the coordinate system of the end tool and the coordinate system of the visual marker based on the coordinate axis straight line of each coordinate axis of the end tool under the coordinate system of the visual marker further comprises:
determining current pose information of the visual marker;
and determining the relative position relation between the visual navigation instrument and the robot base coordinate system based on the transformation relation between the end tool coordinate system and the visual mark coordinate system, the transformation relation between the end tool coordinate system and the robot base coordinate system and the current posture information of the visual mark.
7. A hand-eye calibration device, comprising:
the system comprises an attitude acquisition unit, a display unit and a control unit, wherein the attitude acquisition unit is used for acquiring a rotation attitude set of each terminal tool coordinate axis, the rotation attitude set comprises position attitude information of a visual mark at each rotation angle in the process that the robot rotates around the corresponding terminal tool coordinate axis, and the visual mark is arranged at the tail end of the robot;
the coordinate axis determining unit is used for determining coordinate axis straight lines of the coordinate axes of the terminal tools under the visual marking coordinate system based on the rotating posture set of the coordinate axes of the terminal tools;
and the calibration unit is used for determining the transformation relation between the terminal tool coordinate system and the visual mark coordinate system based on the coordinate axis straight line of each terminal tool coordinate axis under the visual mark coordinate system.
8. The hand-eye calibration apparatus according to claim 7, wherein the coordinate axis determination unit comprises:
the grouping subunit is used for determining a plurality of groups of rotation posture pairs based on the rotation posture set of any terminal tool coordinate axis, and each group of rotation posture pairs comprises position posture information of two different rotation angles;
the calculating subunit is used for determining the rotating shaft direction of each group of rotating posture pairs and determining the rotating shaft projection point position of each group of rotating posture pairs based on the relative position relation of each group of rotating posture pairs;
and the axis determining subunit is used for determining a coordinate axis straight line of any terminal tool coordinate axis under the visual marking coordinate system based on the rotating axis direction and the rotating axis projection point position of each group of rotating posture pairs.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the hand-eye calibration method according to any one of claims 1 to 6 when executing the program.
10. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the hand-eye calibration method according to any one of claims 1 to 6.
CN202011432088.6A, priority and filing date 2020-12-07: Hand-eye calibration method and device, electronic equipment and storage medium (Active; granted as CN112603542B)

Priority Applications (1)

Application number: CN202011432088.6A
Priority/Filing date: 2020-12-07
Title: Hand-eye calibration method and device, electronic equipment and storage medium

Publications (2)

CN112603542A, published 2021-04-06
CN112603542B, published 2022-03-29

Family

ID: 75229513
Family Applications (1): CN202011432088.6A, granted, filed 2020-12-07: Hand-eye calibration method and device, electronic equipment and storage medium
Country Status (1): CN, CN112603542B



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106113035A (en) * 2016-06-16 2016-11-16 华中科技大学 A kind of Six-DOF industrial robot end-of-arm tooling coordinate system caliberating device and method
CN106272412A (en) * 2016-08-25 2017-01-04 芜湖瑞思机器人有限公司 A kind of Zero calibration method of pinion and-rack four-freedom-degree parallel-connection robot
US20190015988A1 (en) * 2017-07-11 2019-01-17 Seiko Epson Corporation Robot control device, robot, robot system, and calibration method of camera for robot
US20190133790A1 (en) * 2017-11-07 2019-05-09 Howmedica Osteonics Corp. Robotic System For Shoulder Arthroplasty Using Stemless Implant Components
CN108210024A (en) * 2017-12-29 2018-06-29 威朋(苏州)医疗器械有限公司 Operation piloting method and system
CN108210070A (en) * 2017-12-29 2018-06-29 微创(上海)医疗机器人有限公司 Mechanical arm and its method of work and operating robot
CN108420529A (en) * 2018-03-26 2018-08-21 上海交通大学 The surgical navigational emulation mode guided based on image in magnetic tracking and art
CN108652740A (en) * 2018-04-26 2018-10-16 上海交通大学 A kind of scaling method of floating bone block position real-time tracking

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114343847A (en) * 2022-01-06 2022-04-15 广东工业大学 Hand-eye calibration method of surgical robot based on optical positioning system
CN114343847B (en) * 2022-01-06 2023-05-30 广东工业大学 Hand-eye calibration method of surgical robot based on optical positioning system
CN115721417A (en) * 2022-09-09 2023-03-03 苏州铸正机器人有限公司 Full-view measuring device and method for end pose of surgical robot
CN115721417B (en) * 2022-09-09 2024-01-30 苏州铸正机器人有限公司 A device and method for measuring the full field of view of the end position of a surgical robot
CN116277035A (en) * 2023-05-15 2023-06-23 北京壹点灵动科技有限公司 Robot control method and device, processor and electronic equipment
CN116277035B (en) * 2023-05-15 2023-09-12 北京壹点灵动科技有限公司 Robot control method and device, processor and electronic equipment

Also Published As

Publication number Publication date
CN112603542B (en) 2022-03-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant