
WO2021128787A1 - Positioning method and apparatus - Google Patents

Positioning method and apparatus

Info

Publication number
WO2021128787A1
WO2021128787A1 · PCT/CN2020/099476 · CN2020099476W
Authority
WO
WIPO (PCT)
Prior art keywords
movable end
information
pose information
pose
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2020/099476
Other languages
French (fr)
Chinese (zh)
Inventor
罗舟
何东杰
杨洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Unionpay Co Ltd
Original Assignee
China Unionpay Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Unionpay Co Ltd filed Critical China Unionpay Co Ltd
Publication of WO2021128787A1 publication Critical patent/WO2021128787A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Definitions

  • the present invention relates to the field of computer technology, in particular to a positioning method and device.
  • Positioning is achieved by designing a specific mechanical structure or installing a sensing device at a specific location.
  • this solution requires the design of a specific device (such as a three-dimensional laser scanner installed on a robotic arm) for positioning, which leads to an increase in the cost of the product.
  • a specific device such as a three-dimensional laser scanner installed on a robotic arm
  • although its positioning accuracy at a fixed position is high, the accuracy at other positions is reduced.
  • the above solution requires an additional, dedicated positioning structure, which not only reduces the range of motion of the robotic arm and lacks versatility, but also increases the hardware cost, which is not conducive to wide adoption.
  • the present invention provides a positioning method and device to solve the problem of how to accurately position the movable end of a mechanical arm.
  • the present invention provides a positioning method, which is suitable for a robot provided with a robotic arm, the robotic arm being arranged on a base of the robot; including:
  • the first pose information is obtained by collecting image information
  • the second pose information is obtained by collecting motion information
  • the first pose information and the second pose information are fused to determine the pose information of the movable end.
  • the vision sensor information is combined with the kinematics information of the robotic arm to accurately locate the movable end pose of the robotic arm. This method does not require the design of additional mechanical structures, and can achieve precise positioning even in different working environments.
  • the determining the second pose information of the movable end through the movement information acquired by the collector provided on the robotic arm includes:
  • the second pose information of the movable end is determined.
  • the position information of the movable end is obtained through the first collector, the posture information of the movable end is obtained through the second collector, and then the two are combined to obtain the pose of the movable end.
  • this method takes the kinematics information of the robotic arm into account while positioning, which can reduce the impact of external noise on positioning and helps improve the robustness and accuracy of the measurement results.
  • the determining the position information of the movable end according to the movement information of each joint and the position information of the base includes:
  • determining the second pose information of the movable end includes:
  • the position information of the movable end is obtained by using the position information of the base and the transformation matrix between the joints, which can quickly and accurately obtain the position information of the movable end of the manipulator without adding additional mechanical structure.
  • the second coordinate of the movable end in the world coordinate system is determined by formula (1):
  • [x a , y a , z a , 1] are the coordinates of the base in the world coordinate system
  • [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robotic arm
  • n-1 T n is the first transformation matrix between the joints.
  • the second transformation matrix of the active end is determined by formula (2):
  • R h = R(θ roll )R(θ pitch )R(θ yaw ) is the rotation matrix
  • θ roll is the roll angle of the movable end of the robot arm
  • θ pitch is the pitch angle of the movable end of the robot arm
  • θ yaw is the yaw angle of the movable end of the robotic arm.
  • the second pose information of the movable end is determined by formula (3):
  • w T r is the second transformation matrix of the movable end
  • P r is the position of the base in the world coordinate system.
  • the determining the second pose information of the movable end through the movement information acquired by the collector provided on the robotic arm includes:
  • the above solution predicts the current pose by using the second pose information of the movable end at the previous moment together with the arm's internal inertia and the control information at the previous moment (such as the influence of the torque input by the motor on the current motion), which can improve the accuracy of positioning.
  • the determining the pose information of the movable end according to the first pose information and the second pose information includes:
  • by using the first error, the second error, the first pose information and the second pose information to determine the pose information of the movable end, the above scheme can not only predict the current pose of the movable end but also calculate the credibility of the predicted value, further improving the accuracy of positioning.
  • the pose information of the movable end is determined by formula (5):
  • P′ w(i) is the second posture information
  • P′′ w(i) is the first posture information
  • K i is the gain coefficient
  • H is the observation matrix of the camera device
  • R is the uncertainty of the camera measurement.
  • the second error of the second pose information is determined by formula (6):
  • P i-1 is the pose of the movable end at the previous moment
  • V is the environmental noise
  • the present invention provides a positioning device, including:
  • An acquiring unit configured to acquire image information through the camera device of the robot, and acquire movement information through a collector provided on the robotic arm;
  • the processing unit is used to determine the first pose information of the movable end of the robot arm from the image information acquired by the camera device of the robot, determine the second pose information of the movable end from the motion information acquired by the collector provided on the robot arm, and determine the pose information of the movable end according to the first pose information and the second pose information.
  • processing unit is specifically configured to:
  • the second pose information of the movable end is determined.
  • the processing unit is specifically configured to: determine a first transformation matrix between the joints from the base to the movable end according to the angle information of the joints;
  • determining the second pose information of the movable end includes:
  • the processing unit is specifically configured to determine the second coordinate of the movable end in the world coordinate system through formula (1):
  • [x a , y a , z a , 1] are the coordinates of the base in the world coordinate system
  • [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system
  • n-1 T n is the first transformation matrix between the joints
  • the second transformation matrix of the movable end is determined by formula (2):
  • R h = R(θ roll )R(θ pitch )R(θ yaw ) is the rotation matrix
  • θ roll is the roll angle of the movable end of the robot arm
  • θ pitch is the pitch angle of the movable end of the robot arm
  • θ yaw is the yaw angle of the movable end of the robotic arm
  • the second pose information of the movable end is determined by formula (3):
  • w T r is the second transformation matrix of the movable end
  • P r is the position of the base in the world coordinate system.
  • the processing unit is specifically configured to: determine the second pose information of the movable end through the movement information acquired by the collector provided on the robotic arm, including:
  • the processing unit is specifically configured to determine the current second pose information of the active end through formula (4):
  • A is the state transition matrix of the robotic arm
  • B is the input transition matrix of the robotic arm
  • u i-1 is the joint input matrix of the robotic arm at time i-1.
  • the processing unit is specifically configured to: the determining the pose information of the movable end according to the first pose information and the second pose information includes:
  • the processing unit is specifically configured to determine the pose information of the movable end through formula (5):
  • P′ w(i) is the second posture information
  • P′′ w(i) is the first posture information
  • K i is the gain coefficient
  • H is the observation matrix of the camera device
  • R is the uncertainty of the camera measurement.
  • the processing unit is specifically configured to determine the second error of the second pose information through formula (6):
  • P i-1 is the pose of the movable end at the previous moment
  • V is the environmental noise
  • the present invention provides a computer controlled device, including:
  • Memory used to store program instructions
  • the processor is configured to call the program instructions stored in the memory, and execute the method described in the first aspect above according to the obtained program.
  • the present invention provides a computer-readable non-volatile storage medium, including computer-readable instructions.
  • when the computer reads and executes the computer-readable instructions, the computer is caused to execute the method described in the first aspect.
  • FIG. 1 is a schematic diagram of a system architecture provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a positioning method provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a positioning method provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a positioning method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of a positioning method provided by an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of a positioning method provided by an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a positioning device provided by an embodiment of the present invention.
  • FIG. 1 exemplarily shows a system architecture to which an embodiment of the present invention is applicable.
  • the system architecture may include a camera device 100, a collector 200, and a robotic arm 300.
  • the camera device 100 is used to obtain image information of the movable end of the robotic arm 300.
  • the collector 200 is arranged on the robotic arm 300 and is used to obtain movement information of the movable end of the robotic arm 300.
  • the imaging device 100 may be a vision sensor such as a monocular camera, a depth camera, etc.
  • a vision sensor such as a monocular camera, a depth camera, etc.
  • the structure shown in FIG. 1 is only an example, and the embodiment of the present invention does not limit this.
  • FIG. 2 exemplarily shows a flow of a positioning method, which may be executed by a positioning device.
  • the process specifically includes:
  • Step 201 Determine the first pose information of the movable end of the robot arm through the image information obtained by the camera of the robot.
  • the camera device may be a vision sensor such as a monocular camera and a depth camera.
  • Step 202 Determine the second pose information of the movable end through the movement information acquired by the collector set on the robotic arm.
  • Step 203 Determine the pose information of the movable end according to the first pose information and the second pose information.
  • the first pose information is obtained by collecting image information
  • the second pose information is obtained by collecting motion information
  • the first pose information and the second pose information are merged to determine the pose information of the movable end.
  • a specific implementation process of the above step 201 may be as shown in Fig. 3:
  • Step 301 Obtain the positions of multiple points at the movable end of the robotic arm.
  • Step 302 Determine the first pose information of the movable end of the robotic arm according to the positions of the multiple points.
  • the installation of the depth camera can be completed first; the depth camera can be installed above the robotic arm. After the depth camera is installed, the positions of multiple points on the movable end of the robotic arm can then be obtained to generate a position matrix.
  • the position coordinates of P0 can be obtained by the following formula:
  • the coordinates P0 of the fixed point P can be obtained by averaging here.
  • the coordinates of the remaining points cannot simply be averaged; instead, the coordinates of each remaining point are first weighted and then averaged according to its positional relationship to P.
  • if one of the four points is occluded, as shown in Figure 4, first take the average of the coordinates of points a and b, which gives the coordinates of e, then take the average of the coordinates of points b and c, which gives the coordinates of f, and then obtain the position coordinate of P from the ordinate of e and the abscissa of f.
  • the simple example here takes two-dimensional coordinates as an example. In the actual process, each point takes the coordinates in space, that is, three-dimensional coordinates.
  • f is the posture conversion function.
  • w T c is the homogeneous transformation matrix that changes from the camera coordinate system to the world coordinate system.
  • a specific implementation process of step 202 may be as shown in FIG. 5:
  • Step 501 Obtain motion information of each joint of the robotic arm through the first collector.
  • Step 502 Determine the position information of the movable end according to the movement information of each joint and the position information of the base.
  • Step 503 Obtain posture information of the movable end through the second collector.
  • Step 504 Determine the second pose information of the movable end according to the position information of the movable end and the posture information of the movable end.
  • the first collector may be a code disc installed in each movement joint, or other sensors that can obtain movement information, which is not specifically limited in this application.
  • the acquired motion information may be information that can indicate the motion parameters of each joint, such as joint angles and joint angular speeds, which are not specifically limited in this application.
  • the first transformation matrix between the joints from the base to the movable end may be determined first according to the angle information of each joint, and then the second coordinate of the movable end in the world coordinate system is determined according to the first transformation matrix of each joint and the first coordinate of the base in the world coordinate system.
  • the second coordinate of the movable end in the world coordinate system can be determined by formula (1):
  • [x a ,y a ,z a ,1] are the coordinates of the base in the world coordinate system
  • [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robot arm in the world coordinate system.
  • the robot has 6 joints from its base to the movable end of the manipulator.
  • the motion information of each joint is collected through the code disk of each motion joint, which mainly includes joint angle, joint angular velocity, etc.
  • the homogeneous transformation matrix 0 T 1 , 1 T 2 , 2 T 3 , 3 T 4 , 4 T 5 , 5 T 6 of joint 1 to joint 6 can be obtained.
  • the second collector may be a sensor device such as a gyroscope that can collect attitude information of the movable end.
  • the attitude information includes, but is not limited to, the pitch angle θ pitch , the yaw angle θ yaw and the roll angle θ roll of the movable end of the robotic arm, etc.; this application does not specifically limit the second collector or the posture information collected by the second collector.
  • a specific implementation process of step 504 is as follows:
  • a second transformation matrix for transforming from the base to the movable end is determined.
  • the second pose information of the movable end is determined.
  • R h = R(θ roll )R(θ pitch )R(θ yaw )
  • homogeneous transformation matrix from the base of the robot arm to the movable end of the robot arm can be obtained as:
  • P r is the position of the robot arm base in the world coordinate system.
  • adding the coordinates of the movable end of the robotic arm in the world coordinate system to the transformation matrix follows the principle of a pure translation transformation, that is, movement in space with a constant posture: the direction unit vectors remain unchanged, and the only change is the displacement of the origin of the coordinate system relative to the reference coordinate system.
  • the position of the new coordinate system can therefore be represented by the original position vector of the original coordinate system plus the vector representing the displacement; in matrix form, the representation of the new coordinate system is obtained by multiplying the coordinate system by the transformation matrix, that is, the transformation matrix is as follows:
  • x h ′, y h ′, z h ′ are the three components of the translation relative to the reference coordinate axes.
  • the first 3 columns of the matrix indicate no rotational movement, and the last column indicates translational movement.
  • the rotation matrix R h is added to the first 3 columns of the matrix to obtain a homogeneous transformation matrix.
  • the position information of the movable end is obtained through the first collector, the posture information of the movable end is obtained through the second collector, and the two are combined to obtain the pose of the movable end; the kinematics information of the robotic arm can thus be taken into account during positioning, which helps reduce the influence of external noise on the positioning and improves the robustness and accuracy of the measurement results.
  • the first error of the first pose information and the second error of the second pose information may be determined first, and then the pose information of the movable end is determined according to the first error, the second error, the first pose information and the second pose information.
  • Step 601 Determine the state transition matrix of the robotic arm through the motion information acquired by the first collector.
  • Step 602 Determine the input transfer matrix and the joint input matrix of the robot arm according to the control information corresponding to the motion information.
  • Step 603 Determine the current second pose information of the movable end according to the second pose information, the state transition matrix, the input transition matrix, and the joint input matrix of the movable end at the previous moment.
  • by using the first error, the second error, the first pose information and the second pose information, not only can the current pose of the movable end be predicted, but the credibility of the predicted value can also be calculated, which helps to further improve the accuracy of positioning.
  • A is the state transition matrix of the manipulator, which represents the function of calculating the position of the movable end from the joint angle, angular velocity and the parameters of the manipulator.
  • B is the input transfer matrix of the mechanical arm, which represents the function of calculating the position of the movable end from the torque input by the motor.
  • u i-1 is the joint input matrix of the robotic arm at time i-1, which represents the input torque of each motor.
  • P i-1 is the actual error covariance matrix of the end pose of the manipulator at time i-1
  • v is the measurement uncertainty caused by environmental noise or manipulator deformation.
  • H is the observation matrix of the camera, representing the function that converts the actual object (here, the robotic arm) into image information
  • R is the uncertainty of the camera measurement, representing the error calculated by the algorithm when evaluating the active end of the robot arm.
  • the above scheme predicts the current pose from the second pose information of the movable end at the previous moment together with the influence of the arm's inertia and of the control information at the previous moment (such as the torque input by the motors) on the current motion, which helps improve the accuracy of positioning.
  • FIG. 7 exemplarily shows the structure of the positioning device provided by the embodiment of the present invention, and the device can execute the flow of the positioning method.
  • the device may include:
  • the acquiring unit 701 is configured to acquire image information through the camera device of the robot, and acquire movement information through a collector provided on the robotic arm;
  • the processing unit 702 is configured to determine the first pose information of the movable end of the robot arm from the image information acquired by the camera device of the robot, determine the second pose information of the movable end from the motion information acquired by the collector provided on the robot arm, and determine the pose information of the movable end according to the first pose information and the second pose information.
  • processing unit 702 is specifically configured to:
  • the second pose information of the movable end is determined.
  • the processing unit 702 is specifically configured to: determine a first transformation matrix between the joints from the base to the movable end according to the angle information of the joints;
  • determining the second pose information of the movable end includes:
  • processing unit 702 is specifically configured to determine the second coordinate of the movable end in the world coordinate system through formula (1):
  • [x a , y a , z a , 1] are the coordinates of the base in the world coordinate system
  • [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system
  • n-1 T n is the first transformation matrix between the joints
  • the second transformation matrix of the movable end is determined by formula (2):
  • R h = R(θ roll )R(θ pitch )R(θ yaw ) is the rotation matrix
  • θ roll is the roll angle of the movable end of the robot arm
  • θ pitch is the pitch angle of the movable end of the robot arm
  • θ yaw is the yaw angle of the movable end of the robotic arm
  • the second pose information of the movable end is determined by formula (3):
  • w T r is the second transformation matrix of the movable end
  • P r is the position of the base in the world coordinate system.
  • the processing unit 702 is specifically configured to: determine the second pose information of the movable end through the movement information acquired by the collector provided on the robotic arm, including:
  • processing unit 702 is specifically configured to: determine the current second pose information of the active end through formula (4):
  • A is the state transition matrix of the robotic arm
  • B is the input transition matrix of the robotic arm
  • u i-1 is the joint input matrix of the robotic arm at time i-1.
  • the processing unit 702 is specifically configured to: the determining the pose information of the movable end according to the first pose information and the second pose information includes:
  • the processing unit 702 is specifically configured to determine the pose information of the movable end through formula (5):
  • P′ w(i) is the second posture information
  • P′′ w(i) is the first posture information
  • K i is the gain coefficient
  • H is the observation matrix of the camera device
  • R is the uncertainty of the camera measurement.
  • processing unit 702 is specifically configured to determine the second error of the second pose information through formula (6):
  • P i-1 is the pose of the movable end at the previous moment
  • V is the environmental noise
  • an embodiment of the present invention also provides a computing controlled device, including:
  • Memory used to store program instructions
  • the processor is configured to call the program instructions stored in the memory, and execute the above positioning method according to the obtained program.
  • embodiments of the present invention also provide a computer-readable non-volatile storage medium, including computer-readable instructions.
  • when the computer reads and executes the computer-readable instructions, the computer is caused to perform the above positioning method.
  • the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, etc.) containing computer-usable program codes.
  • a computer-usable storage media including but not limited to disk storage, optical storage, etc.
  • These computer program instructions can also be stored in a computer-readable memory that can guide a computer or other programmable data processing controlled equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce a manufactured product including the instruction device,
  • the instruction device implements the functions specified in one process or multiple processes in the flowchart and/or one block or multiple blocks in the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

A positioning method and apparatus, wherein same are used for improving the positioning precision. The method comprises: determining first posture information of a movable end by means of image information acquired by a photographic apparatus (100); determining second posture information of the movable end by means of motion information acquired by a collector (200); and determining posture information of the movable end according to the first posture information and the second posture information. Posture information of a movable end is determined by means of fusing first posture information and second posture information, such that visual sensing information is acquired, and accurate positioning of a posture of the movable end of a mechanical arm (300) is realized in combination with kinematics information of the mechanical arm (300). In this way, accurate positioning can be realized, even under different working environments, without designing an additional mechanical structure.

Description

Positioning method and apparatus

Cross-reference to related applications

This application claims priority to the Chinese patent application No. 201911340837.X, entitled "Positioning method and apparatus" and filed with the Chinese Patent Office on December 23, 2019, the entire contents of which are incorporated herein by reference.

Technical field

The present invention relates to the field of computer technology, and in particular to a positioning method and apparatus.

Background

Traditional server-room inspection, operation and maintenance rely on manual checks. This is not only a heavy task that consumes a great deal of manpower and material resources, but abnormalities are also often missed because of human negligence, delaying the repair of equipment in the machine room. Using robots to perform operation and maintenance tasks allows 24-hour uninterrupted inspection in unattended or lightly attended machine rooms, so that abnormal conditions of the equipment can be detected in time.

When a robot performs operation and maintenance tasks in the machine room, such as restarting a server or plugging and unplugging a server hard disk, its robotic arm needs to be driven to a fixed position. However, the buttons on a server and the thickness of a hard disk are both at the millimeter scale. In this case, to touch a button or pull out a hard disk accurately, the movable end of the robotic arm must be positioned accurately in real time to ensure that it can reach the target position. To achieve accurate positioning, the technical solution currently in use is as follows:

Positioning is achieved by designing a specific mechanical structure or installing a sensing device at a specific location. On the one hand, this solution requires a dedicated device (such as a three-dimensional laser scanner mounted on the robotic arm), which increases the cost of the product; on the other hand, although its positioning accuracy at a fixed position is high, the accuracy at other positions is reduced.

In other words, the above solution requires an additional, dedicated positioning structure, which not only reduces the range of motion of the robotic arm and lacks versatility, but also increases the hardware cost, which is not conducive to wide adoption.

Summary of the invention

The present invention provides a positioning method and apparatus to solve the problem of how to accurately position the movable end of a robotic arm.

In a first aspect, the present invention provides a positioning method applicable to a robot provided with a robotic arm, the robotic arm being arranged on a base of the robot. The method includes:

determining first pose information of the movable end of the robotic arm from image information acquired by a camera device of the robot;

determining second pose information of the movable end from motion information acquired by a collector provided on the robotic arm;

determining pose information of the movable end according to the first pose information and the second pose information.

In the above solution, the first pose information is obtained by collecting image information, the second pose information is obtained by collecting motion information, and the two are fused to determine the pose information of the movable end, so that visual sensing information is combined with the kinematics information of the robotic arm to accurately locate the pose of the movable end. This approach requires no additional mechanical structure and achieves precise positioning even in different working environments.

Optionally, determining the second pose information of the movable end from the motion information acquired by the collector provided on the robotic arm includes:

acquiring motion information of each joint of the robotic arm through a first collector, and determining position information of the movable end according to the motion information of each joint and position information of the base;

acquiring posture information of the movable end through a second collector;

determining the second pose information of the movable end according to the position information of the movable end and the posture information of the movable end.

In the above solution, the position information of the movable end is obtained through the first collector, the posture information of the movable end is obtained through the second collector, and the two are combined to obtain the pose of the movable end. The kinematics information of the robotic arm is thus taken into account during positioning, which can reduce the impact of external noise on positioning and helps improve the robustness and accuracy of the measurement results.

Optionally, determining the position information of the movable end according to the motion information of each joint and the position information of the base includes:

determining, according to the angle information of each joint, a first transformation matrix between the joints from the base to the movable end;

determining a second coordinate of the movable end in the world coordinate system according to the first transformation matrix of each joint and a first coordinate of the base in the world coordinate system;

and determining the second pose information of the movable end according to the position information of the movable end and the posture information of the movable end includes:

determining a rotation matrix of the movable end according to the posture information of the movable end;

determining, according to the rotation matrix and the second coordinate, a second transformation matrix for transforming from the base to the movable end;

determining the second pose information of the movable end according to the second transformation matrix and the second coordinate.

In the above solution, the position information of the movable end is obtained using the position information of the base and the transformation matrices between the joints, so that the position information of the movable end of the robotic arm can be obtained quickly and accurately without adding any extra mechanical structure.

Optionally, the second coordinate of the movable end in the world coordinate system is determined by formula (1):

Formula (1): [x h ′, y h ′, z h ′, 1] T = 0 T 1 · 1 T 2 · … · n-1 T n · [x a , y a , z a , 1] T

where [x a , y a , z a , 1] are the coordinates of the base in the world coordinate system, [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system, and n-1 T n is the first transformation matrix between the joints.
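
As an illustrative, non-limiting sketch of formula (1), the chained multiplication of the first transformation matrices can be written as follows in Python; the function and variable names are placeholders introduced here for illustration, and the per-joint matrices are assumed to have already been built from the joint angles read by the first collector.

```python
import numpy as np

def movable_end_coordinates(base_xyz, joint_transforms):
    """Formula (1): chain the first transformation matrices 0_T_1 ... (n-1)_T_n
    and apply them to the base coordinates [x_a, y_a, z_a, 1] to obtain the
    coordinates of the movable end in the world coordinate system."""
    coord = np.array([*base_xyz, 1.0])
    chain = np.eye(4)
    for T in joint_transforms:      # e.g. 0_T_1, 1_T_2, ..., 5_T_6 for a six-joint arm
        chain = chain @ T
    return chain @ coord            # [x_h', y_h', z_h', 1]
```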

Optionally, the second transformation matrix of the movable end is determined by formula (2):

Formula (2): w T r = [[R h , t], [0 0 0, 1]], with t = [x h ′, y h ′, z h ′] T, i.e. the upper-left 3×3 block of the homogeneous matrix is the rotation matrix R h and the last column is the translation

where R h = R(θ roll )R(θ pitch )R(θ yaw ) is the rotation matrix, θ roll is the roll angle of the movable end of the robotic arm, θ pitch is the pitch angle of the movable end of the robotic arm, and θ yaw is the yaw angle of the movable end of the robotic arm.

Optionally, the second pose information of the movable end is determined by formula (3):

Formula (3): P w ′ = w T r · P r

where P w ′ is the second pose information of the movable end, w T r is the second transformation matrix of the movable end, and P r is the position of the base in the world coordinate system.
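
A minimal sketch of formulas (2) and (3) follows; the assignment of roll, pitch and yaw to the x-, y- and z-axes and the multiplication order of the elementary rotations are assumptions, since only the product R(θ roll )R(θ pitch )R(θ yaw ) is named in this text.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """R_h = R(roll) R(pitch) R(yaw); the axis assignment is an assumption."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])    # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])    # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])    # yaw about z
    return Rx @ Ry @ Rz

def second_transform(R_h, end_xyz):
    """Formula (2): rotation R_h in the upper-left 3x3 block, translation
    [x_h', y_h', z_h'] in the last column of the homogeneous matrix wT_r."""
    wTr = np.eye(4)
    wTr[:3, :3] = R_h
    wTr[:3, 3] = end_xyz
    return wTr

def second_pose(wTr, base_position):
    """Formula (3): P_w' = wT_r . P_r, with P_r taken in homogeneous form."""
    return wTr @ np.array([*base_position, 1.0])
```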

Optionally, determining the second pose information of the movable end from the motion information acquired by the collector provided on the robotic arm includes:

determining a state transition matrix of the robotic arm from the motion information acquired by the first collector;

determining an input transition matrix and a joint input matrix of the robotic arm according to the control information corresponding to the motion information;

determining the current second pose information of the movable end according to the second pose information of the movable end at the previous moment, the state transition matrix, the input transition matrix and the joint input matrix.

In the above solution, the current pose is predicted using the second pose information of the movable end at the previous moment together with the effect on the current motion of the arm's internal inertia and of the control information at the previous moment (such as the torque input by the motors), which can improve the accuracy of positioning.
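
Read together with formula (4) given later in this text (P′ w(i) = A · P w(i-1) + B · u i-1), the prediction step can be sketched as below; the matrix shapes are left to the caller and the function name is introduced only for illustration.

```python
import numpy as np

def predict_second_pose(P_prev, A, B, u_prev):
    """Formula (4): propagate the fused pose of the previous moment through the
    arm's dynamics.
    P_prev - pose of the movable end at the previous moment
    A      - state transition matrix of the robotic arm
    B      - input transition matrix (effect of the motor torques on the motion)
    u_prev - joint input matrix (input torque of each motor) at the previous moment"""
    return A @ P_prev + B @ u_prev
```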

Optionally, determining the pose information of the movable end according to the first pose information and the second pose information includes:

determining a first error of the first pose information and a second error of the second pose information;

determining the pose information of the movable end according to the first error, the second error, the first pose information and the second pose information.

In the above solution, by using the first error, the second error, the first pose information and the second pose information to determine the pose information of the movable end, not only can the current pose of the movable end be predicted, but the credibility of the predicted value can also be calculated, further improving the accuracy of positioning.

Optionally, the pose information of the movable end is determined by formula (5):

Formula (5): P w(i) = P′ w(i) + K i · (P″ w(i) − H · P′ w(i) )

where K i = P i ⁻ · H T · (H · P i ⁻ · H T + R) −1 is the gain coefficient, P′ w(i) is the second pose information, P″ w(i) is the first pose information, H is the observation matrix of the camera device, P i ⁻ is the second error of the second pose information, and R is the uncertainty of the camera measurement.

Optionally, the second error of the second pose information is determined by formula (6):

Formula (6): P i ⁻ = A · P i-1 · A T + V

where P i-1 is the pose of the movable end at the previous moment, and V is the environmental noise.
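
The fusion of formulas (5) and (6) parallels a standard Kalman-style update. The sketch below assumes the gain form K i = P i ⁻ · H T · (H · P i ⁻ · H T + R) −1 and a conventional error update, which are consistent with the quantities named here but are not written out explicitly in this excerpt.

```python
import numpy as np

def fuse_poses(P2_pred, P1_meas, P_cov_prev, A, V, H, R):
    """One fusion step for the pose of the movable end.
    P2_pred    - second pose information P'_w(i) predicted from the kinematics
    P1_meas    - first pose information P''_w(i) measured from the camera image
    P_cov_prev - error of the pose estimate at the previous moment
    A          - state transition matrix;  V - environmental noise
    H          - observation matrix of the camera;  R - camera measurement uncertainty"""
    # Formula (6): second error of the second pose information
    P_cov_pred = A @ P_cov_prev @ A.T + V
    # Gain coefficient (assumed conventional form)
    K = P_cov_pred @ H.T @ np.linalg.inv(H @ P_cov_pred @ H.T + R)
    # Formula (5): fused pose information of the movable end
    P_fused = P2_pred + K @ (P1_meas - H @ P2_pred)
    # Error update for the next step (an assumption; not stated in this excerpt)
    P_cov_new = (np.eye(P_cov_pred.shape[0]) - K @ H) @ P_cov_pred
    return P_fused, P_cov_new
```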

In a second aspect, the present invention provides a positioning apparatus, including:

an acquiring unit, configured to acquire image information through the camera device of the robot and to acquire motion information through a collector provided on the robotic arm;

a processing unit, configured to determine the first pose information of the movable end of the robotic arm from the image information acquired by the camera device of the robot, determine the second pose information of the movable end from the motion information acquired by the collector provided on the robotic arm, and determine the pose information of the movable end according to the first pose information and the second pose information.

Optionally, the processing unit is specifically configured to:

acquire motion information of each joint of the robotic arm through a first collector, and determine position information of the movable end according to the motion information of each joint and the position information of the base;

acquire posture information of the movable end through a second collector;

determine the second pose information of the movable end according to the position information of the movable end and the posture information of the movable end.

Optionally, the processing unit is specifically configured to: determine, according to the angle information of each joint, a first transformation matrix between the joints from the base to the movable end;

determine a second coordinate of the movable end in the world coordinate system according to the first transformation matrix of each joint and a first coordinate of the base in the world coordinate system;

and determining the second pose information of the movable end according to the position information of the movable end and the posture information of the movable end includes:

determining a rotation matrix of the movable end according to the posture information of the movable end;

determining, according to the rotation matrix and the second coordinate, a second transformation matrix for transforming from the base to the movable end;

determining the second pose information of the movable end according to the second transformation matrix and the second coordinate.

Optionally, the processing unit is specifically configured to determine the second coordinate of the movable end in the world coordinate system through formula (1):

Formula (1): [x h ′, y h ′, z h ′, 1] T = 0 T 1 · 1 T 2 · … · n-1 T n · [x a , y a , z a , 1] T

where [x a , y a , z a , 1] are the coordinates of the base in the world coordinate system, [x h ′, y h ′, z h ′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system, and n-1 T n is the first transformation matrix between the joints;

the second transformation matrix of the movable end is determined by formula (2):

Formula (2): w T r = [[R h , t], [0 0 0, 1]], with t = [x h ′, y h ′, z h ′] T, i.e. the upper-left 3×3 block of the homogeneous matrix is the rotation matrix R h and the last column is the translation

where R h = R(θ roll )R(θ pitch )R(θ yaw ) is the rotation matrix, θ roll is the roll angle of the movable end of the robotic arm, θ pitch is the pitch angle of the movable end of the robotic arm, and θ yaw is the yaw angle of the movable end of the robotic arm;

the second pose information of the movable end is determined by formula (3):

Formula (3): P w ′ = w T r · P r

where P w ′ is the second pose information of the movable end, w T r is the second transformation matrix of the movable end, and P r is the position of the base in the world coordinate system.

Optionally, the processing unit is specifically configured to determine the second pose information of the movable end from the motion information acquired by the collector provided on the robotic arm by:

determining a state transition matrix of the robotic arm from the motion information acquired by the first collector;

determining an input transition matrix and a joint input matrix of the robotic arm according to the control information corresponding to the motion information;

determining the current second pose information of the movable end according to the second pose information of the movable end at the previous moment, the state transition matrix, the input transition matrix and the joint input matrix.

Optionally, the processing unit is specifically configured to determine the current second pose information of the movable end through formula (4):

Formula (4): P′ w(i) = A · P w(i-1) + B · u i-1

where A is the state transition matrix of the robotic arm, B is the input transition matrix of the robotic arm, and u i-1 is the joint input matrix of the robotic arm at time i-1.

Optionally, the processing unit is specifically configured to determine the pose information of the movable end according to the first pose information and the second pose information by:

determining a first error of the first pose information and a second error of the second pose information;

determining the pose information of the movable end according to the first error, the second error, the first pose information and the second pose information.

Optionally, the processing unit is specifically configured to determine the pose information of the movable end through formula (5):

Formula (5): P w(i) = P′ w(i) + K i · (P″ w(i) − H · P′ w(i) )

where K i = P i ⁻ · H T · (H · P i ⁻ · H T + R) −1 is the gain coefficient, P′ w(i) is the second pose information, P″ w(i) is the first pose information, H is the observation matrix of the camera device, P i ⁻ is the second error of the second pose information, and R is the uncertainty of the camera measurement.

Optionally, the processing unit is specifically configured to determine the second error of the second pose information through formula (6):

Formula (6): P i ⁻ = A · P i-1 · A T + V

where P i-1 is the pose of the movable end at the previous moment, and V is the environmental noise.

In a third aspect, the present invention provides a computer-controlled device, including:

a memory, configured to store program instructions;

a processor, configured to call the program instructions stored in the memory and execute the method described in the first aspect above according to the obtained program.

In a fourth aspect, the present invention provides a computer-readable non-volatile storage medium including computer-readable instructions which, when read and executed by a computer, cause the computer to execute the method described in the first aspect above.

Description of the drawings

In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.

FIG. 1 is a schematic diagram of a system architecture provided by an embodiment of the present invention;

FIG. 2 is a schematic flowchart of a positioning method provided by an embodiment of the present invention;

FIG. 3 is a schematic flowchart of a positioning method provided by an embodiment of the present invention;

FIG. 4 is a schematic diagram of a positioning method provided by an embodiment of the present invention;

FIG. 5 is a schematic flowchart of a positioning method provided by an embodiment of the present invention;

FIG. 6 is a schematic flowchart of a positioning method provided by an embodiment of the present invention;

FIG. 7 is a schematic structural diagram of a positioning apparatus provided by an embodiment of the present invention.

Detailed description of the embodiments

In order to better understand the above technical solutions, they are described in detail below with reference to the accompanying drawings and specific implementations. It should be understood that the embodiments of the present invention and the specific features therein are a detailed explanation of the technical solutions of the present invention rather than a limitation on them, and, provided there is no conflict, the embodiments of the present invention and the technical features therein can be combined with each other.

The positioning method provided by the embodiments of the present application is applicable to a robot provided with a robotic arm, where the robotic arm may be arranged on the base of the robot. FIG. 1 exemplarily shows a system architecture to which an embodiment of the present invention is applicable; the system architecture may include a camera device 100, a collector 200 and a robotic arm 300.

The camera device 100 is used to obtain image information of the movable end of the robotic arm 300.

The collector 200 is arranged on the robotic arm 300 and is used to obtain motion information of the movable end of the robotic arm 300.

It should be noted that the camera device 100 may be a vision sensor such as a monocular camera or a depth camera; the structure shown in FIG. 1 is only an example, and the embodiments of the present invention are not limited thereto.

为了更好的解释上述实施例,图2示例性的示出了一种定位的方法的流程,该流程可以由定位的装置执行。In order to better explain the foregoing embodiment, FIG. 2 exemplarily shows a flow of a positioning method, which may be executed by a positioning device.

如图2所示,该流程具体包括:As shown in Figure 2, the process specifically includes:

步骤201,通过机器人的摄像装置获取的图像信息,确定机械臂的活动端的第一位姿信息。Step 201: Determine the first pose information of the movable end of the robot arm through the image information obtained by the camera of the robot.

需要说明的是,摄像装置可以为单目相机、深度相机等视觉传感器。It should be noted that the camera device may be a vision sensor such as a monocular camera and a depth camera.

步骤202,通过设置在机械臂的采集器获取的运动信息,确定活动端的第二位姿信息。Step 202: Determine the second pose information of the movable end through the movement information acquired by the collector set on the robotic arm.

步骤203,根据第一位姿信息和第二位姿信息,确定活动端的位姿信息。Step 203: Determine the pose information of the movable end according to the first pose information and the second pose information.

上述方案,通过采集图像信息得到第一位姿信息,通过采集运动信息获得第二位姿信息,并将第一位姿信息和第二位姿信息进行融合以确定活动端的位姿信息,能够在获取视觉传感信息的同时结合机械臂的运动学信息,实 现对机械臂的活动端位姿的精准定位。这种方式不需要设计额外的机械结构,且在不同的工作环境下均能实现精确定位。In the above solution, the first pose information is obtained by collecting image information, the second pose information is obtained by collecting motion information, and the first pose information and the second pose information are merged to determine the pose information of the movable end. Acquire the visual sensor information and combine the kinematics information of the robotic arm to realize the precise positioning of the active end pose of the robotic arm. In this way, no additional mechanical structure is required, and precise positioning can be achieved in different working environments.

A specific implementation of the above step 201 may be as shown in FIG. 3:

Step 301: acquire the positions of multiple points on the movable end of the robotic arm.

Step 302: determine the first pose information of the movable end of the robotic arm according to the positions of the multiple points.

In a possible implementation, the depth camera is first installed, for example above the robotic arm. After the depth camera has been installed, the positions of multiple points on the movable end of the robotic arm can be acquired to generate a position matrix.

For example, the position matrices M_i (i = 1…4) of four points on the movable end of the robotic arm in the camera coordinate system are acquired, and from these four coordinates the coordinate P0, in the camera coordinate system, of a fixed point P on the movable end of the robotic arm (preferably the center of the four points) is obtained.

Specifically, it is first determined whether any of the four points is occluded; if occlusion occurs, the measurement is completed with the remaining points. When none of the four points is occluded, the position coordinate P0 can be obtained by the following formula:

P0 = (M_1 + M_2 + M_3 + M_4) / 4

It should be noted that, since P is located at the center of the four points, the coordinate P0 of the fixed point P is obtained here simply by averaging. When some of the four points are occluded, the coordinates of the remaining points cannot simply be averaged; instead, according to the positional relationship between each remaining point and P, the coordinates of the points are first weighted and then averaged. As a simple example, if one of the four points is occluded, as shown in FIG. 4, first take the average of the position coordinates of points a and b to obtain the coordinate of e, then take the average of the position coordinates of points b and c to obtain the coordinate of f, and then obtain the position coordinate of P from the ordinate of e and the abscissa of f. Of course, this simple example uses two-dimensional coordinates; in practice each point is taken in space, i.e. in three-dimensional coordinates.
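By way of illustration only, the following Python sketch shows one possible way to estimate the center point from the visible markers. The marker layout, the uniform fallback weighting (a simplification of the e/f construction above) and all names are assumptions made for illustration, not part of the claimed method.

    import numpy as np

    def estimate_center(points, visible):
        # points: 4x3 array of marker coordinates in the camera frame
        # visible: four booleans, False where a marker is occluded
        pts = np.asarray(points, dtype=float)
        vis = np.asarray(visible, dtype=bool)
        if vis.all():
            # no occlusion: P0 is simply the mean of the four marker positions
            return pts.mean(axis=0)
        # occlusion: average only the visible markers; a real implementation
        # would weight them according to their geometric relation to P
        w = vis.astype(float)
        w /= w.sum()
        return (w[:, None] * pts).sum(axis=0)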

Further, the attitude of the movable end of the robotic arm is obtained by the following formula:

R_O = f(M_i)

where f is the attitude conversion function.

Based on the foregoing, the pose coordinate P_o of the movable end of the robotic arm in the camera coordinate system is transformed into the world coordinate system and denoted P_w″. P_w″ can be obtained by the following formula:

P_w″ = wT_c · P_o

where wT_c is the homogeneous transformation matrix from the camera coordinate system to the world coordinate system.

It should be noted that the above description uses four points as an example; any number of points, such as five or six, may be used in the embodiments of the present application, and this is not specifically limited. Naturally, the more points are taken, the more accurately the position of the point P is obtained.
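Purely as an illustrative sketch (the calibration matrix and coordinate values below are placeholders, not values from this application), mapping the camera-frame result into the world frame is a single homogeneous-matrix multiplication:

    import numpy as np

    def camera_to_world(T_wc, p_camera_h):
        # T_wc: 4x4 homogeneous transform from the camera frame to the world frame
        # p_camera_h: homogeneous point [x, y, z, 1] in the camera frame
        return T_wc @ p_camera_h

    T_wc = np.eye(4)                         # placeholder for the calibrated wT_c
    P_o = np.array([0.10, 0.25, 0.60, 1.0])  # example P0 in the camera frame
    P_w_cam = camera_to_world(T_wc, P_o)     # camera-based pose in the world frame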

A specific implementation of step 202 may be as shown in FIG. 5:

Step 501: acquire motion information of each joint of the robotic arm through a first collector.

Step 502: determine position information of the movable end according to the motion information of each joint and the position information of the base.

Step 503: acquire attitude information of the movable end through a second collector.

Step 504: determine the second pose information of the movable end according to the position information of the movable end and the attitude information of the movable end.

In the above step 501, the first collector may be an encoder disc mounted on each motion joint, or another sensor capable of acquiring motion information, which is not specifically limited in this application. The acquired motion information may be information that characterizes the motion parameters of each joint, such as joint angles and joint angular velocities, which is likewise not specifically limited in this application.

Specifically, in step 502, the first transformation matrices between the joints from the base to the movable end may first be determined according to the angle information of each joint, and then the second coordinate of the movable end in the world coordinate system is determined according to the first transformation matrices of the joints and the first coordinate of the base in the world coordinate system.

Further, the second coordinate of the movable end in the world coordinate system can be determined by formula (1):

Formula (1): [x_h′, y_h′, z_h′, 1]^T = 0T_1 · 1T_2 · … · n-1T_n · [x_a, y_a, z_a, 1]^T

where [x_a, y_a, z_a, 1] are the coordinates of the base in the world coordinate system, and [x_h′, y_h′, z_h′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system.

Based on the foregoing, a specific example: suppose the robot has six joints from its base to the movable end of the robotic arm. The motion information of each joint, mainly including joint angles and joint angular velocities, is collected through the encoder disc of each motion joint, and the homogeneous transformation matrices 0T_1, 1T_2, 2T_3, 3T_4, 4T_5, 5T_6 from joint 1 to joint 6 can be obtained. Let the coordinates of the robot base in the world coordinate system be [x_a, y_a, z_a, 1], and let the coordinates of the fixed point P of the movable end of the robotic arm in the world coordinate system be [x_h′, y_h′, z_h′, 1]; then, according to the principles of kinematics:

[x_h′, y_h′, z_h′, 1]^T = 0T_1 · 1T_2 · 2T_3 · 3T_4 · 4T_5 · 5T_6 · [x_a, y_a, z_a, 1]^T

The position information of the movable end of the robotic arm in the world coordinate system is thus obtained.
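The chaining of joint transforms described above can be sketched as follows. The identity matrices stand in for the real 0T_1 … 5T_6, which in practice would be built from the joint angles read by the encoders; this is an illustrative sketch under that assumption, not the patented implementation.

    import numpy as np

    def movable_end_coordinate(joint_transforms, base_h):
        # joint_transforms: list of 4x4 homogeneous matrices 0T1, 1T2, ..., 5T6
        # base_h: homogeneous base coordinate [xa, ya, za, 1]
        T = np.eye(4)
        for T_step in joint_transforms:
            T = T @ T_step           # accumulate the chain 0T1 * 1T2 * ... * 5T6
        return T @ base_h            # -> [xh', yh', zh', 1]

    transforms = [np.eye(4) for _ in range(6)]   # placeholder joint transforms
    base = np.array([0.0, 0.0, 0.0, 1.0])
    end_world = movable_end_coordinate(transforms, base)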

Further, in step 503, the second collector may be a sensing device, such as a gyroscope, capable of collecting the attitude information of the movable end. The attitude information includes, but is not limited to, the pitch angle θ_pitch, the yaw angle θ_yaw and the roll angle θ_roll of the movable end of the robotic arm; this application does not specifically limit the second collector or the attitude information it collects.

In this embodiment of the application, a specific implementation of step 504 is as follows:

First, the rotation matrix of the movable end is determined according to the attitude information of the movable end.

Then, a second transformation matrix for transforming from the base to the movable end is determined according to the rotation matrix and the second coordinate.

Finally, the second pose information of the movable end is determined according to the second transformation matrix and the second coordinate.

Specifically, for example, the attitude information of the movable end of the robotic arm measured by the inertial measurement unit on the movable end is collected first, yielding the pitch angle θ_pitch, the yaw angle θ_yaw and the roll angle θ_roll of the movable end, from which the rotation matrix R_h of the movable end of the robotic arm is obtained as:

R_h = R(θ_roll) · R(θ_pitch) · R(θ_yaw)

Further, the homogeneous transformation matrix for transforming from the base of the robotic arm to the movable end of the robotic arm can be obtained as:

wT_r = [ R_h  t ; 0 0 0  1 ], where t = [x_h′, y_h′, z_h′]^T, i.e. a 4×4 matrix whose upper-left 3×3 block is R_h and whose last column is [x_h′, y_h′, z_h′, 1]^T

Finally, the pose information of the movable end of the robotic arm in the world coordinate system can be obtained by the following formula:

P_w′ = wT_r · P_r

where P_r is the position of the base of the robotic arm in the world coordinate system.
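A minimal sketch of building R_h and wT_r is given below. The elementary-rotation convention (roll about x, pitch about y, yaw about z) and the numeric values are assumptions made only for illustration; the text above does not fix a particular convention.

    import numpy as np

    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def rot_z(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    def base_to_end(roll, pitch, yaw, end_xyz):
        R_h = rot_x(roll) @ rot_y(pitch) @ rot_z(yaw)   # R(roll) R(pitch) R(yaw)
        T = np.eye(4)
        T[:3, :3] = R_h
        T[:3, 3] = end_xyz                              # [xh', yh', zh'] from the chain
        return T

    T_wr = base_to_end(0.10, 0.05, 0.20, np.array([0.4, 0.0, 0.3]))
    P_r = np.array([0.0, 0.0, 0.0, 1.0])                # base position in the world frame
    P_w_prime = T_wr @ P_r                              # second pose estimate of the movable end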

Based on the above content, the principle of the homogeneous transformation matrix wT_r is briefly introduced below.

First, adding the coordinates of the movable end of the robotic arm in the world coordinate system to the transformation matrix is based on pure translation, that is, motion in space with an unchanged attitude. In this case the direction unit vectors of the coordinate system keep the same direction; the only change is the displacement of the origin of the coordinate system relative to the reference coordinate system.

Relative to a fixed reference coordinate system, the position of the new coordinate system can be represented by the origin position vector of the original coordinate system plus a vector representing the displacement. In matrix form, the representation of the new coordinate system can be obtained by left-multiplying by a transformation matrix, namely:

T = [ 1 0 0 x_h′ ; 0 1 0 y_h′ ; 0 0 1 z_h′ ; 0 0 0 1 ]

where x_h′, y_h′, z_h′ are the three components of the pure translation vector along the axes of the reference coordinate system. The first three columns of the matrix represent no rotational motion, and the last column represents the translational motion.

Based on the above, substituting the rotation matrix R_h into the first three columns of the matrix yields the homogeneous transformation matrix.

In the above solution, the position information of the movable end is obtained through the first collector, the attitude information of the movable end is obtained through the second collector, and the two are combined to obtain the pose of the movable end. The kinematic information of the robotic arm can thus be incorporated during positioning, which helps to reduce the influence of external noise on positioning and to improve the robustness and accuracy of the measurement results.

In this embodiment of the application, in step 203, a first error of the first pose information and a second error of the second pose information may first be determined, and then the pose information of the movable end is determined according to the first error, the second error, the first pose information and the second pose information.

The specific flow is shown in FIG. 6:

Step 601: determine the state transition matrix of the robotic arm from the motion information acquired by the first collector.

Step 602: determine the input transition matrix and the joint input matrix of the robotic arm according to the control information corresponding to the motion information.

Step 603: determine the current second pose information of the movable end according to the second pose information of the movable end at the previous moment, the state transition matrix, the input transition matrix and the joint input matrix.

In the above solution, by using the first error, the second error, the first pose information and the second pose information to determine the pose information of the movable end, not only can the current pose of the movable end be predicted, but the degree of confidence of the predicted value can also be calculated, which helps to further improve the positioning accuracy.

A specific example of the solution of fusing the first pose information and the second pose information to obtain the final pose information P_w of the movable end of the robotic arm is described below.

Let the pose of the movable end of the robotic arm at the (i-1)-th second be P_w(i-1); then the predicted value of the pose of the movable end of the robotic arm at the i-th second is:

P′_w(i) = A · P_w(i-1) + B · u_(i-1)

where A is the state transition matrix of the robotic arm, representing the function that calculates the position of the movable end from the joint angles, angular velocities and robotic arm parameters; B is the input transition matrix of the robotic arm, representing the function that calculates the position of the movable end from the torques input by the motors; and u_(i-1) is the joint input matrix of the robotic arm at the (i-1)-th second, representing the input torque of each motor.

Then the prediction error covariance matrix of the end pose of the robotic arm at the i-th second is:

P′_i = A · P_(i-1) · A^T + v

where P_(i-1) is the actual error covariance matrix of the end pose of the robotic arm at the (i-1)-th second, and v is the measurement uncertainty caused by environmental noise or deformation of the robotic arm.

Further, the system gain coefficient K_i at the i-th second can be obtained by the following formula:

K_i = P′_i · H^T · (H · P′_i · H^T + R)^(-1)

where H is the observation matrix of the camera, representing the function that converts the actual object (here, the robotic arm) into image information, and R is the uncertainty of the camera measurement, representing the error calculated by the algorithm when evaluating the movable end of the robotic arm.

The end pose of the robotic arm at the i-th second is then:

P_w(i) = P′_w(i) + K_i · (P″_w(i) - H · P′_w(i))

where P″_w(i) is the pose of the end of the robotic arm at the i-th second measured by the camera.

The error covariance matrix for the next second is then updated as:

P_i = (I - K_i · H) · P′_i
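The prediction, gain and update steps above follow a Kalman-filter-style recursion. The sketch below assumes a small pose vector and placeholder model matrices; it illustrates the structure of the computation under those assumptions rather than the patented implementation.

    import numpy as np

    def fuse_step(pose_prev, cov_prev, u_prev, z_camera, A, B, H, v, R):
        # prediction from the arm's state-transition and input models
        pose_pred = A @ pose_prev + B @ u_prev          # P'_w(i)
        cov_pred = A @ cov_prev @ A.T + v               # prediction error covariance
        # gain weighting the camera measurement against the prediction
        K = cov_pred @ H.T @ np.linalg.inv(H @ cov_pred @ H.T + R)
        # fused pose P_w(i) and covariance carried to the next step
        pose = pose_pred + K @ (z_camera - H @ pose_pred)
        cov = (np.eye(cov_pred.shape[0]) - K @ H) @ cov_pred
        return pose, cov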

In the above solution, the current pose is predicted from the second pose information of the movable end at the previous moment together with the influence on the current motion of the inertia inside the robotic arm and of the control information at the previous moment (for example the torques input by the motors), which helps to improve the positioning accuracy.

Based on the same technical concept, FIG. 7 exemplarily shows the structure of the positioning apparatus provided by an embodiment of the present invention; the apparatus can execute the flow of the positioning method.

As shown in FIG. 7, the apparatus may include:

an acquiring unit 701, configured to acquire image information through the camera device of the robot and to acquire motion information through the collector arranged on the robotic arm;

a processing unit 702, configured to determine the first pose information of the movable end of the robotic arm from the image information acquired by the camera device of the robot, to determine the second pose information of the movable end from the motion information acquired by the collector arranged on the robotic arm, and to determine the pose information of the movable end according to the first pose information and the second pose information.

Optionally, the processing unit 702 is specifically configured to:

acquire the motion information of each joint of the robotic arm through a first collector, and determine the position information of the movable end according to the motion information of each joint and the position information of the base;

acquire the attitude information of the movable end through a second collector; and

determine the second pose information of the movable end according to the position information of the movable end and the attitude information of the movable end.

Optionally, the processing unit 702 is specifically configured to: determine, according to the angle information of each joint, the first transformation matrices between the joints from the base to the movable end; and

determine the second coordinate of the movable end in the world coordinate system according to the first transformation matrices of the joints and the first coordinate of the base in the world coordinate system;

where determining the second pose information of the movable end according to the position information of the movable end and the attitude information of the movable end includes:

determining the rotation matrix of the movable end according to the attitude information of the movable end;

determining, according to the rotation matrix and the second coordinate, a second transformation matrix for transforming from the base to the movable end; and

determining the second pose information of the movable end according to the second transformation matrix and the second coordinate.

Optionally, the processing unit 702 is specifically configured to determine the second coordinate of the movable end in the world coordinate system by formula (1):

Formula (1): [x_h′, y_h′, z_h′, 1]^T = 0T_1 · 1T_2 · … · n-1T_n · [x_a, y_a, z_a, 1]^T

where [x_a, y_a, z_a, 1] are the coordinates of the base in the world coordinate system, [x_h′, y_h′, z_h′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system, and n-1T_n is the first transformation matrix between adjacent joints;

and to determine the second transformation matrix of the movable end by formula (2):

Formula (2): wT_r = [ R_h  t ; 0 0 0  1 ], where t = [x_h′, y_h′, z_h′]^T

where R_h = R(θ_roll) · R(θ_pitch) · R(θ_yaw) is the rotation matrix, θ_roll is the roll angle of the movable end of the robotic arm, θ_pitch is the pitch angle of the movable end of the robotic arm, and θ_yaw is the yaw angle of the movable end of the robotic arm;

and to determine the second pose information of the movable end by formula (3):

Formula (3): P_w′ = wT_r · P_r

where P_w′ is the second pose information of the movable end, wT_r is the second transformation matrix of the movable end, and P_r is the position of the base in the world coordinate system.

Optionally, the processing unit 702 is specifically configured to determine the second pose information of the movable end from the motion information acquired by the collector arranged on the robotic arm by:

determining the state transition matrix of the robotic arm from the motion information acquired by the first collector;

determining the input transition matrix and the joint input matrix of the robotic arm according to the control information corresponding to the motion information; and

determining the current second pose information of the movable end according to the second pose information of the movable end at the previous moment, the state transition matrix, the input transition matrix and the joint input matrix.

Optionally, the processing unit 702 is specifically configured to determine the current second pose information of the movable end by formula (4):

Formula (4): P′_w(i) = A · P_w(i-1) + B · u_(i-1)

where A is the state transition matrix of the robotic arm, B is the input transition matrix of the robotic arm, and u_(i-1) is the joint input matrix of the robotic arm at the (i-1)-th second.

Optionally, the processing unit 702 is specifically configured to determine the pose information of the movable end according to the first pose information and the second pose information by:

determining the first error of the first pose information and the second error of the second pose information; and

determining the pose information of the movable end according to the first error, the second error, the first pose information and the second pose information.

Optionally, the processing unit 702 is specifically configured to determine the pose information of the movable end by formula (5):

Formula (5): P_w(i) = P′_w(i) + K_i · (P″_w(i) - H · P′_w(i))

where K_i = P′_i · H^T · (H · P′_i · H^T + R)^(-1), P′_w(i) is the second pose information, P″_w(i) is the first pose information, K_i is the gain coefficient, H is the observation matrix of the camera device, P′_i is the second error of the second pose information, and R is the uncertainty of the camera measurement.

Optionally, the processing unit 702 is specifically configured to determine the second error of the second pose information by formula (6):

Formula (6): P′_i = A · P_(i-1) · A^T + V

where P_(i-1) is the pose of the movable end at the previous moment, and V is the environmental noise.

Based on the same technical concept, an embodiment of the present invention further provides a computing device, including:

a memory, configured to store program instructions; and

a processor, configured to call the program instructions stored in the memory and to execute the above positioning method according to the obtained program.

Based on the same technical concept, an embodiment of the present invention further provides a computer-readable non-volatile storage medium, including computer-readable instructions which, when read and executed by a computer, cause the computer to execute the above positioning method.

Finally, it should be noted that those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage and the like) containing computer-usable program code.

The present invention is described with reference to flowcharts and/or block diagrams of the method, the controlled device (system) and the computer program product according to the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data-processing controlled device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data-processing controlled device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data-processing controlled device to operate in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the scope of the present invention. Thus, provided that such modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims (12)

1. A positioning method, characterized in that it is applicable to a robot provided with a robotic arm, the robotic arm being arranged on a base of the robot; the method comprising:
determining first pose information of a movable end of the robotic arm from image information acquired by a camera device of the robot;
determining second pose information of the movable end from motion information acquired by a collector arranged on the robotic arm; and
determining pose information of the movable end according to the first pose information and the second pose information.

2. The method according to claim 1, characterized in that determining the second pose information of the movable end from the motion information acquired by the collector arranged on the robotic arm comprises:
acquiring motion information of each joint of the robotic arm through a first collector, and determining position information of the movable end according to the motion information of each joint and position information of the base;
acquiring attitude information of the movable end through a second collector; and
determining the second pose information of the movable end according to the position information of the movable end and the attitude information of the movable end.

3. The method according to claim 2, characterized in that determining the position information of the movable end according to the motion information of each joint and the position information of the base comprises:
determining, according to angle information of each joint, first transformation matrices between the joints from the base to the movable end; and
determining a second coordinate of the movable end in a world coordinate system according to the first transformation matrices of the joints and a first coordinate of the base in the world coordinate system;
and determining the second pose information of the movable end according to the position information of the movable end and the attitude information of the movable end comprises:
determining a rotation matrix of the movable end according to the attitude information of the movable end;
determining, according to the rotation matrix and the second coordinate, a second transformation matrix for transforming from the base to the movable end; and
determining the second pose information of the movable end according to the second transformation matrix and the second coordinate.

4. The method according to claim 3, characterized in that the second coordinate of the movable end in the world coordinate system is determined by formula (1):
Formula (1): [x_h′, y_h′, z_h′, 1]^T = 0T_1 · 1T_2 · … · n-1T_n · [x_a, y_a, z_a, 1]^T
where [x_a, y_a, z_a, 1] are the coordinates of the base in the world coordinate system, [x_h′, y_h′, z_h′, 1] are the coordinates of the movable end of the robotic arm in the world coordinate system, and n-1T_n is the first transformation matrix between adjacent joints;
the second transformation matrix of the movable end is determined by formula (2):
Formula (2): wT_r = [ R_h  t ; 0 0 0  1 ], where t = [x_h′, y_h′, z_h′]^T
where R_h = R(θ_roll) · R(θ_pitch) · R(θ_yaw) is the rotation matrix, θ_roll is the roll angle of the movable end of the robotic arm, θ_pitch is the pitch angle of the movable end of the robotic arm, and θ_yaw is the yaw angle of the movable end of the robotic arm;
and the second pose information of the movable end is determined by formula (3):
Formula (3): P_w′ = wT_r · P_r
where P_w′ is the second pose information of the movable end, wT_r is the second transformation matrix of the movable end, and P_r is the position of the base in the world coordinate system.

5. The method according to claim 1, characterized in that determining the second pose information of the movable end from the motion information acquired by the collector arranged on the robotic arm comprises:
determining a state transition matrix of the robotic arm from the motion information acquired by a first collector;
determining an input transition matrix and a joint input matrix of the robotic arm according to control information corresponding to the motion information; and
determining the current second pose information of the movable end according to the second pose information of the movable end at a previous moment, the state transition matrix, the input transition matrix and the joint input matrix.

6. The method according to claim 5, characterized in that the current second pose information of the movable end is determined by formula (4):
Formula (4): P′_w(i) = A · P_w(i-1) + B · u_(i-1)
where A is the state transition matrix of the robotic arm, B is the input transition matrix of the robotic arm, and u_(i-1) is the joint input matrix of the robotic arm at the (i-1)-th second.

7. The method according to any one of claims 1 to 6, characterized in that determining the pose information of the movable end according to the first pose information and the second pose information comprises:
determining a first error of the first pose information and a second error of the second pose information; and
determining the pose information of the movable end according to the first error, the second error, the first pose information and the second pose information.

8. The method according to claim 7, characterized in that the pose information of the movable end is determined by formula (5):
Formula (5): P_w(i) = P′_w(i) + K_i · (P″_w(i) - H · P′_w(i))
where K_i = P′_i · H^T · (H · P′_i · H^T + R)^(-1), P′_w(i) is the second pose information, P″_w(i) is the first pose information, K_i is the gain coefficient, H is the observation matrix of the camera device, P′_i is the second error of the second pose information, and R is the uncertainty of the camera measurement.

9. The method according to claim 7, characterized in that the second error of the second pose information is determined by formula (6):
Formula (6): P′_i = A · P_(i-1) · A^T + V
where P_(i-1) is the pose of the movable end at the previous moment, and V is the environmental noise.

10. A positioning apparatus, characterized in that it is applicable to a robot provided with a robotic arm, the robotic arm being arranged on a base of the robot; the apparatus comprising:
an acquiring unit, configured to acquire image information through a camera device of the robot and to acquire motion information through a collector arranged on the robotic arm; and
a processing unit, configured to determine first pose information of a movable end of the robotic arm from the image information acquired by the camera device of the robot, determine second pose information of the movable end from the motion information acquired by the collector arranged on the robotic arm, and determine pose information of the movable end according to the first pose information and the second pose information.

11. A computing device, characterized by comprising:
a memory, configured to store program instructions; and
a processor, configured to call the program instructions stored in the memory and to execute the method according to any one of claims 1 to 9 according to the obtained program.

12. A computer-readable non-volatile storage medium, characterized by comprising computer-readable instructions which, when read and executed by a computer, cause the computer to execute the method according to any one of claims 1 to 9.
PCT/CN2020/099476 2019-12-23 2020-06-30 Positioning method and apparatus Ceased WO2021128787A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911340837.X 2019-12-23
CN201911340837.XA CN110977985B (en) 2019-12-23 2019-12-23 Method and device for positioning

Publications (1)

Publication Number Publication Date
WO2021128787A1 true WO2021128787A1 (en) 2021-07-01

Family

ID=70075805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/099476 Ceased WO2021128787A1 (en) 2019-12-23 2020-06-30 Positioning method and apparatus

Country Status (2)

Country Link
CN (1) CN110977985B (en)
WO (1) WO2021128787A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114147717A (en) * 2021-12-09 2022-03-08 乐聚(深圳)机器人技术有限公司 Robot motion trajectory estimation method and device, controller and storage medium
CN114305686A (en) * 2021-12-20 2022-04-12 杭州堃博生物科技有限公司 Positioning processing method, device, equipment and medium based on magnetic sensor
CN114536399A (en) * 2022-01-07 2022-05-27 中国人民解放军海军军医大学第一附属医院 Error detection method based on multiple pose identifications and robot system
CN114800499A (en) * 2022-04-20 2022-07-29 北京三快在线科技有限公司 Pose adjusting method and device, computer readable storage medium and electronic equipment
CN115005984A (en) * 2022-05-07 2022-09-06 推想医疗科技股份有限公司 Calibration method and system of surgical instrument and positioning method and device of surgical instrument
CN115805593A (en) * 2022-12-22 2023-03-17 苏州艾利特机器人有限公司 Method, device, equipment and medium for determining installation information of force sensor
CN115816448A (en) * 2022-11-25 2023-03-21 广州艾目易科技有限公司 Mechanical arm calibration method, device, equipment and medium based on optical position indicator
CN115837670A (en) * 2022-11-11 2023-03-24 思看科技(杭州)股份有限公司 Calibration path planning method and calibration method of three-dimensional scanning system
CN116175544A (en) * 2022-08-25 2023-05-30 北京空间飞行器总体设计部 Space manipulator on-orbit parameter identification system based on binocular vision
CN116277023A (en) * 2023-04-13 2023-06-23 安徽省配天机器人集团有限公司 Robot trajectory planning method, device and computer-readable storage medium
CN117032027A (en) * 2023-08-15 2023-11-10 广东美的智能科技有限公司 Visual control system and control method thereof, visual motion controller and storage medium

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110977985B (en) * 2019-12-23 2021-10-01 中国银联股份有限公司 Method and device for positioning
CN111881411B (en) * 2020-07-10 2024-02-06 广联达科技股份有限公司 Determination method and determination device for mechanical node position
CN111923043A (en) * 2020-07-30 2020-11-13 苏州富鑫林光电科技有限公司 Multi-manipulator positioning method based on multi-sensor fusion
CN112405526A (en) * 2020-10-26 2021-02-26 北京市商汤科技开发有限公司 Robot positioning method and device, equipment and storage medium
CN113220017A (en) * 2021-04-16 2021-08-06 同济大学 Underground unmanned aerial vehicle flight method and system
CN114441807B (en) * 2021-07-22 2023-07-07 荣耀终端有限公司 A wiring method and system
CN114274140B (en) * 2021-09-30 2024-04-16 武汉大学 Mechanical arm action planning method and system based on outdoor antenna calibration site position
CN114283447B (en) * 2021-12-13 2024-03-26 北京元客方舟科技有限公司 Motion capturing system and method
CN114427652B (en) * 2021-12-20 2023-10-10 哈尔滨理工大学 Indoor three-dimensional reconstruction information collection device and device camera position acquisition method
CN114993945A (en) * 2022-04-19 2022-09-02 燕山大学 Hub surface defect detection device and method adopting parallel mechanism
CN115194770A (en) * 2022-07-25 2022-10-18 Oppo广东移动通信有限公司 Conversion parameter acquisition method and device, mechanical arm and storage medium
CN117817667B (en) * 2024-01-26 2024-06-25 合肥工业大学 A method for adjusting the posture of the end of a robotic arm based on SVD decomposition method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
CN106052683A (en) * 2016-05-25 2016-10-26 速感科技(北京)有限公司 Robot motion attitude estimating method
CN107314778A (en) * 2017-08-04 2017-11-03 广东工业大学 A kind of scaling method of relative attitude, apparatus and system
CN109262610A (en) * 2018-08-30 2019-01-25 珠海格力电器股份有限公司 Method and system for solving tail end pose of serial multi-degree-of-freedom robot and robot
CN110375738A (en) * 2019-06-21 2019-10-25 西安电子科技大学 A kind of monocular merging Inertial Measurement Unit is synchronous to be positioned and builds figure pose calculation method
CN110977985A (en) * 2019-12-23 2020-04-10 中国银联股份有限公司 Method and device for positioning

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4453085A (en) * 1981-05-11 1984-06-05 Diffracto Ltd. Electro-optical systems for control of robots, manipulator arms and co-ordinate measuring machines
CN106247932B (en) * 2016-07-25 2019-03-12 天津大学 A robot online error compensation device and method based on a camera system
JP6707485B2 (en) * 2017-03-22 2020-06-10 株式会社東芝 Object handling device and calibration method thereof
JP6564428B2 (en) * 2017-08-03 2019-08-21 ファナック株式会社 Calibration system and calibration method
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN109129466B (en) * 2018-07-26 2021-07-20 清华大学 Active vision device for stereotaxic robot and control method thereof
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
CN106052683A (en) * 2016-05-25 2016-10-26 速感科技(北京)有限公司 Robot motion attitude estimating method
CN107314778A (en) * 2017-08-04 2017-11-03 广东工业大学 A kind of scaling method of relative attitude, apparatus and system
CN109262610A (en) * 2018-08-30 2019-01-25 珠海格力电器股份有限公司 Method and system for solving tail end pose of serial multi-degree-of-freedom robot and robot
CN110375738A (en) * 2019-06-21 2019-10-25 西安电子科技大学 A kind of monocular merging Inertial Measurement Unit is synchronous to be positioned and builds figure pose calculation method
CN110977985A (en) * 2019-12-23 2020-04-10 中国银联股份有限公司 Method and device for positioning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG TIANLIN: "Robot Arm End Pose Measurement Based on Multi-sensor Information Fusion", MASTER THESIS, 9 December 2019 (2019-12-09), pages 1 - 75, XP009528660, DOI: 10.27251/d.cnki.gnjdc.2019.000716 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114147717B (en) * 2021-12-09 2024-05-24 乐聚(深圳)机器人技术有限公司 Robot motion trajectory estimation method, device, controller and storage medium
CN114147717A (en) * 2021-12-09 2022-03-08 乐聚(深圳)机器人技术有限公司 Robot motion trajectory estimation method and device, controller and storage medium
CN114305686A (en) * 2021-12-20 2022-04-12 杭州堃博生物科技有限公司 Positioning processing method, device, equipment and medium based on magnetic sensor
CN114536399B (en) * 2022-01-07 2023-04-25 中国人民解放军海军军医大学第一附属医院 Error detection method based on multiple pose identifications and robot system
CN114536399A (en) * 2022-01-07 2022-05-27 中国人民解放军海军军医大学第一附属医院 Error detection method based on multiple pose identifications and robot system
CN114800499A (en) * 2022-04-20 2022-07-29 北京三快在线科技有限公司 Pose adjusting method and device, computer readable storage medium and electronic equipment
CN114800499B (en) * 2022-04-20 2023-08-25 北京三快在线科技有限公司 Pose adjustment method and device, computer readable storage medium and electronic equipment
CN115005984A (en) * 2022-05-07 2022-09-06 推想医疗科技股份有限公司 Calibration method and system of surgical instrument and positioning method and device of surgical instrument
CN116175544A (en) * 2022-08-25 2023-05-30 北京空间飞行器总体设计部 Space manipulator on-orbit parameter identification system based on binocular vision
CN115837670A (en) * 2022-11-11 2023-03-24 思看科技(杭州)股份有限公司 Calibration path planning method and calibration method of three-dimensional scanning system
CN115816448A (en) * 2022-11-25 2023-03-21 广州艾目易科技有限公司 Mechanical arm calibration method, device, equipment and medium based on optical position indicator
CN115805593B (en) * 2022-12-22 2023-11-28 苏州艾利特机器人有限公司 Force sensor installation information determining method, device, equipment and medium
CN115805593A (en) * 2022-12-22 2023-03-17 苏州艾利特机器人有限公司 Method, device, equipment and medium for determining installation information of force sensor
CN116277023A (en) * 2023-04-13 2023-06-23 安徽省配天机器人集团有限公司 Robot trajectory planning method, device and computer-readable storage medium
CN117032027A (en) * 2023-08-15 2023-11-10 广东美的智能科技有限公司 Visual control system and control method thereof, visual motion controller and storage medium

Also Published As

Publication number Publication date
CN110977985B (en) 2021-10-01
CN110977985A (en) 2020-04-10

Similar Documents

Publication Publication Date Title
WO2021128787A1 (en) Positioning method and apparatus
CN110640747B (en) Hand-eye calibration method and system for robot, electronic equipment and storage medium
CN110197461B (en) Coordinate conversion relation determining method, device, equipment and storage medium
CN110238849A (en) Robot hand-eye calibration method and device
WO2020024178A1 (en) Hand-eye calibration method and system, and computer storage medium
WO2023134237A1 (en) Coordinate system calibration method, apparatus and system for robot, and medium
CN111123280A (en) Laser radar positioning method, device and system, electronic equipment and storage medium
CN116038701B (en) A hand-eye calibration method and device for a four-axis robotic arm
WO2023168849A1 (en) Mechanical arm motion capture method, medium, electronic device, and system
CN117140517A (en) A high-precision automatic hand-eye calibration method and system for robotic arms
CN118111474A (en) Assessment method and system for positioning accuracy of 6-dimensional pose of mobile robot
CN118882444B (en) Coordinate system calibration method, apparatus, storage medium, and computer program product
CN116136388A (en) Calibration method, device, equipment and storage medium of robot tool coordinate system
CN113768535A (en) Method, system and device for self-calibration of ultrasonic profiling probe attitude for teleoperation
US20240246237A1 (en) Robot control device, robot control system, and robot control method
JP2020203368A (en) Robot system
TWI788253B (en) Adaptive mobile manipulation apparatus and method
CN117232514A (en) Indoor path planning method based on two-dimensional laser radar and binocular camera
CN112060083B (en) Binocular stereo vision system for robotic arm and its measurement method
JP2005186193A (en) Calibration method and three-dimensional position measuring method for robot
CN110675445B (en) Visual positioning method, device and storage medium
CN113459084A (en) Robot parameter calibration method, device, equipment and storage medium
CN114332236A (en) Visual system holder calibration method and device, electronic equipment and storage medium
CN114227693A (en) Multi-mechanical-arm base coordinate system calibration method based on visual marks
CN113075647A (en) Robot positioning method, device, equipment and medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906813

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906813

Country of ref document: EP

Kind code of ref document: A1