WO2025008652A1 - Positioning method and positioning system - Google Patents
Positioning method and positioning system
- Publication number
- WO2025008652A1 (application PCT/HU2024/050055)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- positioning
- imaging unit
- image
- plane
- target feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R1/00—Details of instruments or arrangements of the types included in groups G01R5/00 - G01R13/00 and G01R31/00
- G01R1/02—General constructional details
- G01R1/06—Measuring leads; Measuring probes
- G01R1/067—Measuring probes
- G01R1/06705—Apparatus for holding or moving single probes
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37555—Camera detects orientation, position workpiece, points of workpiece
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39026—Calibration of manipulator while tool is mounted
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39045—Camera on end effector detects reference pattern
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39057—Hand eye calibration, eye, camera on hand, end effector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40003—Move end effector so that image center is shifted to desired position
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40082—Docking, align object on end effector with target
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40087—Align hand on workpiece to pick up workpiece, peg and hole
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45026—Circuit board, pcb
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
Definitions
- the subject matter disclosed herein relates to a positioning method, a positioning system, a computer program product and a computer readable medium for positioning a reference feature with respect to a target feature. More particularly, the disclosed subject matter relates to systems and methods for using visual servoing techniques in a unique way to achieve accurate positioning in cases where exact calibrations, exact mutual positions/orientations and exact camera scale factors are not necessarily available.
- the subject matter disclosed herein is especially suitable for positioning a tool with respect to a workpiece to enable carrying out various operations in an accurately positioned way.
- PCB repair shops usually employ human operators to find and identify the root cause of the malfunctions of the broken PCBs.
- these shops face a large variety of products in small batch sizes but with many, frequently recurring product types.
- although measuring operations often contain repetitive steps, seemingly good candidates for automation, automated solutions cannot yet provide the flexibility that would make them worth investing in for repair shops.
- a conventional approach is to move the robot TCP (tool center point) to previously taught points.
- This method can be performed with the precision determined by the repeatability of the robot, which is usually in the order of 0.1-0.01 mm for industrial robot arms.
- taught points can only be used successfully if the robot repeatability, fixturing precision of the workpieces, and the manufacturing and assembly precision of the workpieces are such that the accumulating errors still allow the precision required for the desired subsequent action. Otherwise, the accumulating errors also need to be taken into account during the measurement.
- Visual servoing, also known as vision-based robot control, is a known technique which uses feedback information extracted from a vision sensor, i.e. visual feedback, to control the motion of a robot.
- US 6,208,375 B1 discloses a test probe positioning method and system.
- a target feature is selected either by manual means such as a computer mouse, or by automatic means based on previously-stored information.
- the pixel address of the selected target feature within a displayed image is determined.
- a test probe equipped with probe-identifying markers is introduced into the field of view of the camera, and its pixel address is determined on the camera image. With target and probe addresses known, the probe is moved to bring it into contact with the target feature.
- a closed loop system can be employed to target a feature to be probed and then to servo the probe to that feature.
- This prior art solution is based on point-to-point servoing, which has drawbacks as detailed later.
- US 2015/0120047 A1 discloses a control device, which includes a reception unit that receives first operation information and second operation information different from the first operation information; and a process unit that instructs a robot to execute operations based on the first operation information and the second operation information using a plurality of captured images of an imaged target object, the images being captured multiple times while the robot moves from a first posture to a second posture different from the first posture.
- This prior art solution is also based on point-to-point servoing.
- Point-to-point servoing with a two-camera arrangement also aims to bring a “tip” of the tool fixed to the manipulator into overlap with the target feature, and to bring the tip into contact with the target feature for the desired state before the action.
- the marker can be the tip of the tool, which is a point on the camera images, representing the straight-line projection of the lines determined by the focal point of the two cameras and the vertex corresponding to the tip (which is the same point for both camera images). These two lines intersect at the tip, and if the fine positioning is successful, the intersection will coincide with the target feature, establishing contact between the tip and the target feature. Fine positioning is not feasible for deviating lines.
- one marker point per camera can be set up further away on the tool instead of using the tip of the tool as a marker.
- calibration is required to ensure that the lines determined by these marker points intersect.
- marker points may only exist on the images as virtual marker points by drawing (marking) them on the camera images. In this case, their detection and localization are not necessary.
- whenever the arrangement changes, the virtual marker point setup and calibration must be repeated. For this reason, the flexibility of the method is highly limited.
- moreover, the offset distance is not accurately achieved by moving the robot TCP to a preliminary, initial position.
- if the intersection of the two virtual lines cannot be moved into the target feature by a planar (e.g. X-Y) movement, it is necessary to move in the offset direction (e.g. Z) in the fine positioning iteration, otherwise the overlapping of the markers and the target feature cannot be realized in both camera images at the same time, i.e. the system becomes overdetermined.
- the TCP frame directions for the offset movement can be calculated from a model or determined by measurement. The movement in this direction must be performed iteratively with the planar movement until the target feature point and the intersection point of the two virtual axes coincide.
- Point-to-point servoing with a two-camera arrangement is inflexible also because the marker placement is constrained (the lines must intersect), and the fine positioning must either take place in a specific plane (corresponding to a fixed geometric distance), so that the intersection of the spatial lines coincides with the target feature, or along more than two axes, in which case it is not possible to separate the compensation vectors on the camera images so that each corresponds to a single motion axis.
- images for point-to-point overlapping can be more challenging to process: light conditions may vary, and it may not be possible to determine the reference feature, e.g. the tip of the measuring pin, in the image due to light and visibility conditions, e.g. due to glint of metallic parts, occlusions, or shadows.
- point-to-point servoing is sensitive to external light sources, and to visual or even geometric mismatches between subsequently processed workpieces.
- the known positioning solutions do not provide sufficiently flexible positioning.
- There is therefore a need for a solution allowing an improvement over existing methods and systems.
- Model-free definition provides great flexibility, as visual and approximate geometric information can be used to store the target feature, e.g. capturing an image of the target feature and storing the robot TCP position when the robot is positioned close to the expected position of the action when defining the target feature.
- the invention is not limited to the measurement and testing of printed circuit boards using a robot or robotic arm equipped with a measurement tool.
- Other applications of the invention are also possible, therefore generic tasks/actions performed on generic workpieces, products or objects using generic manipulator(s) equipped with generic tool(s) or object(s) are within the scope of the present disclosure.
- a novel, robotic, visual servoing probe test method and corresponding measurement tool is proposed.
- An experimental PCB measurement cell is provided to automatically measure relevant measurement points of a PCB motherboard without precisely locating the motherboard assembly in the robot workspace.
- the preferred embodiment is suitable for repair shops to test PCBs using simple locating pins and basic test specifications, without the need for accurate geometric information about the product.
- the positioning error can be in the range of a couple of millimeters.
- a special visual servo technique is provided. The idea in this preferred embodiment is to observe the test-pin (connected to one of the terminals of the measuring instrument), and the measured feature simultaneously using a 2D camera. By identifying the axis of the test-pin together with the measured feature, a positioning error can be calculated in the image space, which determines the direction and an estimated magnitude of the required compensation in the physical space. To realize the compensation, motion commands are sent to the robot to reduce the error on the camera images. As planar compensation is required in a plane parallel to the top plane of the PCB, bi-directional observation is necessary. Therefore, a second camera is also introduced, which detects the position error perpendicular to the first one.
- the test-pin mounted on the robot converges to the measured feature through the servoing robot motion, wherein the planar motion is performed on a plane above the PCB (safety plane) using a board-specific safety distance to avoid collision with any mounted components.
- the test-pin’s axis thus becomes aligned with the measurement point, and using a downward feeding motion along this axis, the electric measurement can be carried out by pushing the test-pin against the measured feature, which establishes the galvanic contact.
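To make the described bi-directional servo loop concrete, the following minimal Python sketch outlines one possible top-level iteration. All names are illustrative assumptions, not from the source; the per-direction steps are sketched later, after the description of the first and second positioning steps.

```python
# Hedged sketch of the bi-directional visual servo loop described above; all
# names (servo_to_feature, first/second_positioning_step, robot, cam1, cam2)
# are illustrative, not from the source.
def servo_to_feature(robot, cam1, cam2, scale1, scale2, max_iter=50):
    for _ in range(max_iter):
        ok_x = first_positioning_step(cam1, robot, scale1)   # X-direction error
        ok_y = second_positioning_step(cam2, robot, scale2)  # Y-direction error
        if ok_x and ok_y:
            return True   # both image-space errors below threshold: axis on target
    return False          # iteration limit reached without convergence
```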
- Fig. 1 schematically illustrates a part of a preferred embodiment of an inventive positioning system,
- Fig. 2 schematically illustrates an enlarged portion of Fig. 1,
- Fig. 3 schematically illustrates a view field of a first imaging unit of the system of Fig. 1,
- Fig. 4 schematically illustrates a view field of a second imaging unit of the system of Fig. 1,
- Figs. 5A and 5B schematically illustrate respective view fields of the first imaging unit in two phases of positioning in a first direction,
- Figs. 6A and 6B schematically illustrate respective view fields of the second imaging unit in two phases of positioning in a second direction,
- Fig. 7 schematically illustrates a block diagram of visual servoing in a preferred embodiment of the invention,
- Fig. 8 schematically illustrates a constellation-based matching usable for identifying a target feature,
- Fig. 9 schematically illustrates a template matching usable for identifying a target feature,
- Fig. 10 schematically illustrates a practical application of a preferred embodiment with a workpiece which is not aligned with respect to locating pins,
- Fig. 11 schematically illustrates an enlarged part of Fig. 10 showing the workpiece in a fixture-aligned position.
MODES FOR CARRYING OUT THE INVENTION
- the description below mainly refers to an embodiment, where the tool is mounted on the robot’s flange and the workpiece is placed in the robot's workspace, in a robotic measurement cell.
- the aim is to adjust with high precision a reference feature (robot object feature) of a tool/workpiece/object mounted on a manipulator/robot arm to a defined geometric relationship with respect to a target feature (static or even moving object feature) of a workpiece/tool/object in a workspace of a manipulator/robot.
- the aim is to fit a reference feature and a target feature into a known relative geometric position with respect to each other, in order to perform a certain subsequent action.
- Fig. 1 schematically illustrates a part of a preferred embodiment of an inventive positioning system
- Fig. 2 schematically illustrates an enlarged portion of Fig. 1
- the subject matter disclosed herein is a positioning method and a positioning system for positioning a reference feature 10, such as a tip of a test pin as depicted in Fig. 2, with respect to a target feature 11, such as a contact point on a PCB as depicted in Fig. 2, along a positioning plane.
- “target feature” and “reference feature” in the present context denote, each on a respective object, any specific point, location, area or portion, which are to be positioned with respect to each other.
- the action to be performed after positioning can be a galvanic contact formation and measurement task, with or without some kind of previous robotic motion.
- one of said reference and target features 10, 11 is on a first object attached to a robotic manipulator 100 (see Figs. 7, 10 and 11) and the other one of said reference and target features 10, 11 is on a second object being arranged in a workspace accessible by the robotic manipulator 100 and being detached from, i.e. being not attached to, the robotic manipulator 100.
- the first object is a tool 12 comprising the test pin
- the second object is a workpiece 13, which is here a PCB to be tested.
- the positioning plane is preferably above the workpiece 13 and is practically parallel with a surface of the workpiece 13 carrying the target feature 11, but the invention is also applicable in cases where the positioning plane is non-parallel with such a surface.
- a first imaging unit 20 with an estimated first scale factor is used for imaging, from a first viewpoint 24, the target feature 11 and at least one marker 21 defining a virtual first line 22 in the image of the first imaging unit 20.
- preferably there are two markers 21 formed as dots or crosshairs defining the first line 22.
- the first imaging unit 20 and the at least one marker 21 imaged by the first imaging unit 20 are fixed with respect to the reference feature 10.
- a second imaging unit 30 with an estimated second scale factor is also used for imaging, from a second viewpoint 34 different from the first viewpoint 24, the target feature 11 and at least one marker 31 defining a virtual second line 32 in the image of the second imaging unit 30.
- there are preferably two markers 31 formed as dots or crosshairs defining the second line 32.
- the second imaging unit 30 and the at least one marker 31 imaged by the second imaging unit 30 are also fixed with respect to the reference feature 10.
- the viewpoints 24, 34 and the markers 21, 31 are arranged so that a virtual first plane 23 defined by the first viewpoint 24 and the at least one marker 21 defining the first line 22, and a virtual second plane 33 defined by the second viewpoint 34 and the at least one marker 31 defining the second line 32 intersect each other along a virtual axis 40, which intersects the positioning plane.
- the detected marker(s) 21, 31 and the respective viewpoints 24, 34, being preferably camera focal points or camera image center points, shall form the first and second planes 23, 33, respectively.
- the first line 22 and the second line 32 are projections of the spatial virtual first plane 23 and spatial virtual second plane 33 onto an image plane corresponding to the first imaging unit 20 and the second imaging unit 30, respectively.
- the intersection of the virtual planes 23, 33 defined by the two imaging units 20, 30 and the detected markers 21, 31 thus defines the spatial virtual axis 40.
- viewpoint in this context may denote the focal points or center points of the respective imaging units, practically corresponding to a center pixel or a center point of the imaging camera pixel matrix.
- other pixels or points of the imaging camera pixel matrix can also be assigned as a reference point used as a viewpoint for the positioning.
- marker detection/localization has two purposes. Firstly, it needs to determine the image lines 22, 32, which can be done using any kind of line definition, such as using two points or a point and a direction vector. These can be represented by artificial or natural markers 21 , 31 , the markers being identifiable portions, areas or points of objects appearing in the images. Secondly, it needs to determine the virtual planes 23, 33 in order to form the virtual axis 40 corresponding to e.g. a tool. Marker detection can be carried out e.g. by conventional methods like blob or edge detection, or by machine learning-based methods.
- if the virtual axis 40 does not intersect the reference feature 10, the reference feature 10 needs to be calibrated to the virtual axis 40; in this way, after the positioning is successfully finished and the target feature is on the virtual axis, the reference feature gets into a known geometric relation relative to the target feature, which allows performing the desired action by considering the said relative geometric relation. Otherwise, if the virtual axis 40 does intersect the reference feature 10, it is sufficient for the intended operation if the virtual axis 40 is known in the robot TCP frame, relative to the TCP axes.
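As a concrete illustration of how the virtual planes 23, 33 and the virtual axis 40 can be computed, the following Python/NumPy sketch assumes an ideal pinhole model with the viewpoint and marker coordinates already expressed in one common frame; the function names are assumptions for this sketch, not taken from the source.

```python
# Illustrative geometry sketch, assuming an ideal pinhole model and a common frame.
import numpy as np

def virtual_plane(viewpoint, marker_a, marker_b):
    """Plane through a camera viewpoint and the two markers defining the line."""
    n = np.cross(marker_a - viewpoint, marker_b - viewpoint)
    return n / np.linalg.norm(n), viewpoint   # unit normal and a point on the plane

def virtual_axis(plane1, plane2):
    """Intersection line of the two virtual planes (the spatial virtual axis 40)."""
    (n1, p1), (n2, p2) = plane1, plane2
    d = np.cross(n1, n2)                      # direction of the intersection line
    d = d / np.linalg.norm(d)
    # Pick the axis point closest to the origin along d (gauge constraint d.x = 0).
    A = np.vstack([n1, n2, d])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    return np.linalg.solve(A, b), d           # point on the axis and its direction
```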
- Positioning requires at least one, preferably two imaging units 20, 30 capable of providing different viewpoints, positioned so that the corresponding image planes are not parallel, i.e. the planes intersect.
- the imaging units 20, 30 can be realized by means of pinhole cameras or other cameras, preferably with calibration and undistortion that match the perspective mapping. Preferred realizations can use two cameras acting as two imaging units providing two different viewpoints, or a single imaging unit arranged to provide two different viewpoints.
- Fig. 3 schematically illustrates a view field of the first imaging unit 20
- Fig. 4 schematically illustrates a view field of the second imaging unit 30.
- the positioning method comprises first positioning steps in a first direction of the positioning plane, and second positioning steps in a second direction of the positioning plane. Each of the first positioning steps comprises: taking an image with the first imaging unit 20; determining, within the image, a distance between the first line 22 and the first imaging 111 of the target feature 11; calculating a physical distance from the determined image distance by means of the estimated first scale factor; and compensating the calculated physical distance by moving the robotic manipulator 100 in the first direction.
- the first positioning step is repeated until the at least one stop condition is achieved.
- the term “compensating” in the present context means a movement with the robotic manipulator 100 that aims to eliminate the physical distance present between an actual and a desired position. Distance determining within the image is preferably carried out by determining a distance between the first line 22 and a center point or center pixel of the first imaging 111 of the target feature 11.
- each of the second positioning steps comprises: taking an image with the second imaging unit 30; determining, within the image, a distance between the second line 32 and the second imaging 112 of the target feature 11; calculating a physical distance from the determined image distance by means of the estimated second scale factor; and compensating the calculated physical distance by moving the robotic manipulator 100 in the second direction.
- the second positioning step is also repeated until the at least one stop condition is achieved.
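A minimal sketch of one first-direction positioning step is given below, under the assumption that camera capture, target detection and marker detection are available as helpers (capture, detect_target and detect_markers are hypothetical and assumed to return NumPy pixel coordinates), and that scale1_mm_per_px is only an estimate of the first scale factor, which is exactly the tolerance the method exploits.

```python
# Hedged sketch of one first-direction positioning step; helper names and the
# robot API (translate_tcp) are assumptions for illustration only.
import numpy as np

def point_line_distance(p, a, b):
    """Signed perpendicular distance of image point p from the line through a and b."""
    t = (b - a) / np.linalg.norm(b - a)
    n = np.array([-t[1], t[0]])                # in-image normal of the first line 22
    return float(n @ (p - a))

def first_positioning_step(camera1, robot, scale1_mm_per_px, threshold_mm=0.05):
    img = camera1.capture()
    target_px = detect_target(img)             # imaging 111 of the target feature 11
    m1, m2 = detect_markers(img)               # imagings 211 of the markers 21
    d_mm = point_line_distance(target_px, m1, m2) * scale1_mm_per_px
    if abs(d_mm) < threshold_mm:
        return True                            # stop condition met in this direction
    robot.translate_tcp(x=d_mm, y=0.0, z=0.0)  # compensating motion (sign convention assumed)
    return False
# The second positioning step is analogous, with the second imaging unit 30,
# the estimated second scale factor and a Y-directional compensating motion.
```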
- the physical target feature 11 appears from a different viewpoint in each camera image, so the imagings 111, 112 of the target feature vary from camera to camera in the image space. It is important that in each iteration both the target feature 11 to be detected and the virtual axis 40 to be detected remain the same.
- the first and second lines 22, 32 are determined by means of respective imagings 211, 311 of the markers 21, 31 within the respective images.
- after calibration of the imaging units 20, 30 and the markers 21, 31, virtual markers with known pixel addresses can be drawn (marked) onto the image and used instead of physical ones.
- as long as the TCP frame-camera-tool geometry does not change, there is no need to redetect the markers 21, 31.
- the geometric calibration of the tool markers can be carried out at the time of tool manufacture. Geometric calibration of cameras, markers and of the reference feature 10 can also be performed during tool manufacture.
- the at least one stop condition for the positioning method comprises the condition that the calculated respective physical distance is within a threshold limit, which corresponds to successful positioning, i.e. when the deviation between the desired and attained relative geometric relation between the reference feature 10 and the target feature 11 is sufficiently low.
- further conditions can also be applied as a stop condition.
- the at least one stop condition also comprises one or more of the following further conditions:
- a number of termination conditions can be thus defined for the servo method. For example, a maximum initial error radius can be assumed, i.e., if the robot TCP moves out of the circle defined by this radius around a starting point, the servo method can be terminated (displacement error).
- This phenomenon can occur when the detection and/or localization of the target feature is incorrect due to e.g., a significant visual deviation in the area of the target feature. For example, this occurs in case of template matching-based target feature detection, when the correlation between the template pattern and the camera image is weak and a false positive matching is found.
- detecting a collision or close-to-collision state in the robotic cell may also cause the method to stop; this can be detected e.g. by suitable sensors of the robotic cell, in which case the servoing can be terminated. Furthermore, servoing can be terminated if the servo iteration does not converge quickly enough towards the acceptable alignment between the tool 12 and the measured point (control error).
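The termination conditions listed above can be collected into a single check. The sketch below is purely illustrative bookkeeping; the limit values are assumptions for the example, not values from the source.

```python
# Illustrative stop-condition check for the servo loop (assumed limit values).
import numpy as np

def should_stop(error_history_mm, tcp_xy, start_xy, iteration,
                max_iter=50, max_radius_mm=5.0, trend_window=5):
    if iteration >= max_iter:
        return True                              # iteration limit reached
    if np.linalg.norm(np.asarray(tcp_xy) - np.asarray(start_xy)) > max_radius_mm:
        return True                              # displacement error: left the allowed radius
    if (len(error_history_mm) > trend_window
            and error_history_mm[-1] >= error_history_mm[-1 - trend_window]):
        return True                              # control error: no decreasing trend
    return False
```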
- if an error is detected, the intended action, e.g. in case of PCB measurement the forward feeding translational motion and measurement, is omitted in order to avoid any collision, damage or potential shorting of components. Otherwise, if no error is detected after the servo target is reached, the desired action can be performed, e.g. a forward translational motion, measurement and a backward translational motion back to a safety plane above the workpiece after the measured values are read from a measuring instrument.
- the robot is positioned so that the virtual axis 40 and the target feature 11 become overlapped, or in other words become intersecting, or in still other words the target feature 11 gets to be on the virtual axis 40. If this does not happen within a given threshold (tolerance), the robot's position is further iterated using the above detailed fine positioning.
- the inventive method uses perspective projection in such a way that when an image line (image virtual axis) in an image of an imaging unit is overlapped by the imaging of the target feature, in reality, it is achieved that a virtual plane belonging to the imaging unit (whose projection is the image line) will be overlapped by the physical target feature.
- the physical target feature will fall on the plane 23, 33 of both imaging units 20, 30, which means that the target feature will fall on the intersection, i.e. on the spatial virtual axis 40 defined by the planes 23, 33 of the two imaging units 20, 30.
- the spatial calibrated position of the virtual axis 40 is known in the robot's TCP frame, so that due to rigid body constraints, a motion command can be issued to the robot that can bring the physical reference feature 10 to the desired relative geometric position with respect to the physical target feature 11.
- Fine positioning is thus performed by performing compensatory movements identified in the camera images. If the exact value of the ratio between image space and physical space (pixel/mm), called the scale factor of the imaging units, were readily and steadily available, and an exact compensation motion could be carried out while the physical position of the target feature remained the same, a single compensatory motion would achieve an overlap of the virtual axis 40 with the target feature 11 within a predefined tolerance. In such cases servoing would not be necessary, as a single compensatory motion could achieve the fine positioned state. However, in real life processes no such conditions can be ensured, or they could only be ensured with high specific costs and with high computational demand.
- the invention ensures that by iteratively performing the compensation process in a number of steps, fine positioning can be accomplished in a safe and simply implementable way.
- for the inventive accurate positioning it is sufficient to know an assumed or estimated scale factor of the imaging units 20, 30.
- the inventive visual servoing provides a converging and accurate positioning.
- a deviation of up to +/-10% of the estimated scale factors results in a very fast converging visual servoing. In this case, even significantly less robustness is required for the image processing algorithms and for the robotic control. For larger deviations, a higher noise tolerance may be required in the image processing algorithms, or, for example, a higher depth of camera field may be needed, and the parameters of the robot control may need to be adjusted more carefully.
- the two calculated physical distances determined based on the images of the two imaging units 20, 30, i.e. the first and second compensation values, which may or may not correspond directly to the first and second different directions of motion, can be combined and output to the robot controller at the same time iteratively, until the distances between the virtual axis 40 and the target feature 11 in both directions fall below a threshold.
- a position and/or orientation defined by an initial position and/or orientation can be maintained throughout the process.
- there can be different strategies for setting the undefined degrees of freedom during the fine positioning process. For example, by moving away from the workpiece in the Z direction, the field of view of the camera can be increased, making the method more suitable for rough positioning with a larger initial error; then, when the error becomes lower, by moving towards the workpiece in the Z direction, the scale factor can be improved and more precise fine positioning can be achieved.
- the threshold limit for the calculated respective physical distances can be adjusted according to the given application, in the light of the acceptable tolerances and expected accuracy. Individual steps of the first and second positioning steps can be merged, mixed and/or their order can be changed, as best suitable for the given application.
- it is assumed that the specified accuracy is physically achievable by the robotic system.
- a stop condition is required to avoid infinite iteration loops, e.g. by allowing a maximum number of iterations or a maximum positioning time.
- the former condition means that the absolute distance between the virtual axis 40 and the target feature 11 must decrease both in the image and in physical space. It does not necessarily need to show a decrease in every iteration step, for example, it is possible that a false detection occurs in one iteration due to a flicker and therefore this distance temporarily increases, or it is possible that a vibration causes the target feature 11 to move further away from the virtual axis 40 between two iterations. Regardless, if the distance shows a decreasing trend over several iterations, the controller is considered adequate.
- the robot's motion must be such that the distances between the first and second lines 22, 32 and the first and second imagings 111, 112 of the target feature 11 in the image space are expected to decrease in each iteration.
- in each iteration, one of each of the first and second positioning steps can be carried out, or only one of them in an alternating manner.
- a sufficient waiting time e.g. 0.1 s can be introduced before taking the next image, so that the movement of the robotic manipulator 100 can settle.
- One embodiment of the iterative solution of the visual servo method is a step-wise compensation. If the time of the operations performed in an iteration (imaging, image processing, compensation computation, command issuing, etc.) allows, i.e. the operations can be performed in near real time, continuous mode compensation can be performed instead of step-wise mode. In this case, for example, speed control may be used instead of position-based control, where imaging is performed in motion, e.g. by snapshot capture or by cropping a snapshot from a camera stream.
- a velocity can be determined from the calculated compensation distances in the iterations, taking into account the cycle time of the iteration, and compensation can be performed using the distance, computation time and velocity values. Continuous operation has the advantage of faster compensation, as it saves the acceleration and deceleration phases in each cycle of the robot's motion.
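For the continuous mode, the remaining error can be converted into a velocity command. The following sketch shows one possible proportional scheme; the cycle time, gain and velocity limit are assumed tuning values, not values from the source.

```python
# Continuous-mode sketch: proportional velocity from the remaining compensation
# distance (all parameter values are assumptions for illustration).
def compensation_velocity(d_mm, cycle_time_s=0.05, gain=0.8, v_max_mm_s=20.0):
    v = gain * d_mm / cycle_time_s                # proportional velocity from remaining error
    return max(-v_max_mm_s, min(v_max_mm_s, v))   # saturate to the robot's velocity limit
```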
- the target feature 11 can be a point or an area or a portion on an object and the reference feature 10 can be a point or an area or a portion of another object, where one of the two objects is attached to the robotic manipulator 100, while the other of the two objects is attached to the workspace, and not attached to the robotic manipulator 100.
- the imaging units 20, 30 do not necessarily need to be fixed to the robot; for example, fixed external imaging units can also be used, with the markers 21, 31 and the reference feature 10 fixed with respect to the workspace, while the target feature 11 is bound to the robot.
- it is also possible that the reference feature 10 is not, or not only, bound to the tool 12 (e.g. attached to the robot’s flange); e.g. if the tool is a gripper, the reference feature 10 can also be bound to the workpiece 13 attached to the gripper.
- for example, in case of a cylindrical workpiece held by the gripper, the virtual axis 40 can be the axis of the cylindrical workpiece and the reference feature 10 can be on the cylindrical workpiece.
- Target features 11 can be defined on the basis of a model of the workpiece 13, e.g. geometric information can be defined for a target feature 11 using the 3D model of the workpiece, and visual information can be prepared for the target feature 11 based on rendered images (using a virtual camera) of the area of the target feature 11 in the 3D model.
- visual and geometric information is required for the target feature.
- template patterns can be set on images captured of the target feature and the corresponding starting robot TCP pose (position and orientation) can preferably be manually taught for a PCB measurement point.
- a teaching operator can position the robot with the probe above the feature to be measured in a safety plane above the workpiece.
- images can be captured, possibly with additional, built-in light source(s) (e.g. using LED panels) turned on, and the target feature can be selected for example on the zoomed camera images with pixel accuracy.
- This can then be stored together with the robot TCP coordinates by reading the values from the robot controller, and the operator can attach a label, a measurement type and reference electric parameters to the set measurement point.
- the TCP pose can serve as an initial pose for the positioning method, and this pose is preferably maintained during the positioning, except in the positioning directions determined by the positioning plane.
- a measurement profile can be formed, which can be assigned to a specific product type.
- the measurement profile can be automatically loaded and executed.
- the fine positioning is carried out for the first measurement point (target feature); if the positioning is successful, the corresponding motions and measurement (desired action) are executed; next, the same operation is performed for the second measurement point, and so on, until the operation has been performed for every measurement point.
- target features 11 are defined in a way that they can be consistently recognized throughout the fine positioning.
- An exact robot start pose is not required; it is sufficient to define an approximate starting pose. In this case, no prior geometric information (e.g. digital model, shop drawing) is required.
- a movement is carried out with the first object towards the second object, e.g. in form of a translational movement with the tool 12 in parallel with or along the virtual axis 40 until the tool 12 contacts the workpiece 13.
- the viewpoints 24, 34 and the markers 21 , 31 are arranged so that the reference feature 10 is on the virtual axis 40.
- the target feature 11 can always be touched by making an approximation in the direction of the virtual axis 40; the target feature 11 always remains on the virtual axis 40.
- the action with the tool 12 can be started as soon as positioning is complete and there is no need to further position the tool 12 to compensate any distance vector between the virtual axis 40 and the reference feature 10.
- otherwise, an additional distance compensation is necessary.
- the reference feature 10 is not necessarily arranged on the virtual axis 40, and it can be an action point of a respective tool 12, e.g. a gauge tip, a bit head tip for screwing, or a welding tip for spot welding.
- the system will be tolerant to changes in the imaging unit-spatial virtual axis relationship. Namely, in case of any imaging unit displacement, the displaced virtual planes still define the same intersection line, so the same spatial virtual axis 40 will remain, as it is defined by the stationary physical markers 21, 31.
- movements in the positioning plane in an X direction and a Y direction preferably correspond to compensations determined on the images of the first and second imaging units 20, 30, respectively.
- preferably, the imaging units 20, 30 are rigidly fixed to the robot’s flange and the viewpoints 24, 34 and the markers 21, 31 are arranged so that the first plane 23 is perpendicular to the second plane 33; more preferably the first plane 23 is perpendicular to the X axis of the TCP frame and the second plane 33 is perpendicular to the Y axis of the TCP frame.
- This is related to the TCP-imaging unit-marker arrangement, which is considered to be fixed in a robotic system.
- in this case, each image has a preferred movement direction, and if the target feature 11 and the image virtual axis 40 overlap in one image, the fine positioning movement of the other imaging unit will not move the target feature 11 out of this already existing overlapping.
- other TCP-imaging unit-marker arrangements are also possible, where the respective axes and planes are not aligned perpendicularly; in these cases the relative geometric relation between the planes and the axes needs to be determined and applied during the positioning process, to be able to issue motion commands suitable for the positioning directions in the positioning plane. This also means that the images do not have a preferred movement direction, and both images may influence the movement in both positioning directions.
- the controlled movements of the robot in the X and Y direction of the positioning plane are aligned with the X and Y axes of the TCP frame, which are also in parallel with the plane normal of the first and second virtual planes 23, 33, respectively.
- the viewpoints 24, 34 and the markers 21 , 31 are arranged so that the first plane 23 is perpendicular to a first controlled movement direction of the robotic manipulator 100, and the second plane 33 is perpendicular to a second controlled movement direction of the robotic manipulator 100.
- in this arrangement, the compensation vector determined on the image of the first imaging unit 20 directly corresponds to the X directional motion in the positioning plane, and the compensation vector determined on the image of the second imaging unit 30 directly corresponds to the Y directional motion in the positioning plane.
- when the X and Y axes of the TCP frame are parallel with the positioning plane, the virtual axis 40 will be perpendicular to the positioning plane.
- it is also possible that the X and Y axes of the TCP frame are not in parallel with the positioning plane, resulting in the virtual axis 40 also not being perpendicular to the positioning plane.
- in such cases, the projections of the X and Y axes of the robot TCP frame onto the positioning plane are used as X and Y positioning directions.
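For non-perpendicular TCP-imaging unit-marker arrangements, the relation between the two image-derived distances and the in-plane X/Y motion can be captured, for example, by a 2x2 mapping matrix estimated once for the fixed arrangement. This is an illustrative sketch of that idea, not a method prescribed by the source; the matrix J is an assumed quantity.

```python
# Hedged sketch: J maps an in-plane (X, Y) motion to the two image-space
# distances; the compensating motion is recovered by inverting this mapping.
import numpy as np

def compensation_xy(d1_mm, d2_mm, J=np.eye(2)):
    # For the perpendicular arrangement J is the identity: d1 -> X, d2 -> Y.
    return np.linalg.solve(J, np.array([d1_mm, d2_mm]))
```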
- the TCP pose used for one positioning task is determined by the initial TCP pose assigned to the target feature, which can be set either using a model-based approach by the programming operator or using a model-free approach by the teaching operator.
- the distance determining steps preferably comprise image processing for identifying respective positions of the first and second imagings 111, 112 of the target feature 11 within the respective images and/or image processing for identifying respective positions of the imagings 211, 311 of the markers 21, 31 within the respective images.
- a positioning point can be defined on the virtual axis 40 by means of a third plane intersecting the virtual axis 40, wherein the third plane can be defined by a third viewpoint of a third imaging unit and/or further at least one marker defining a third line in an image of the first or second imaging units 20, 30.
- fine positioning in further degrees of freedom, including e.g. rotation around the Z-axis or Z-directional translation, can be implemented by defining multiple virtual planes. For example, the intersection of three virtual planes can result in a virtual point that can be overlapped with the target feature 11.
- if two imaging units are used to position the target point using two virtual planes, then first the target point is positioned on the spatial virtual axis 40. For the actual positioning, the target point is then moved along the virtual axis 40. Without the inclusion of a third plane intersecting the virtual axis 40, reaching the desired position can be identified e.g. by means of metallic contact, force or other type of sensor signals.
- if a second plane is selected on one of the imaging units, i.e. three planes in total on the two imaging units, then the third plane can be used to overlap any point on the virtual axis 40 with the target point.
- Fig. 7 schematically illustrates a block diagram of visual servoing in a preferred embodiment of the invention.
- Visual feedback for the visual servoing is provided by a compound image processing algorithm, e.g. by using standard or machine learning-based tools.
- the images are taken in step 50.
- the image processing is responsible for two main tasks, with the final goal being the calculation of the compensation vector for the servo technique.
- in a feature extraction step 51, firstly, marker points are detected on the images of the imaging units 20, 30 to determine the virtual axis 40, which in the present embodiment ideally coincides with the test-pin’s axis, from both viewpoints 24, 34 for the bi-directional error compensation.
- secondly, the target feature is identified on the images of the imaging units 20, 30.
- a template pattern can be selected during the target feature setup, which serves as a reference image.
- the corresponding pattern can be sought on each camera image in every servo iteration.
- the pixel position resulting in the highest correlation can be selected as the target point of the compensation vector.
- then the distance within the image is computed between the imaging of the target feature 11 and the image virtual axis (in this case, representing the pin axis), perpendicular to the image virtual axis.
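A possible realization of the template-matching step uses normalized cross-correlation with OpenCV. In the sketch below, img and template are assumed grayscale arrays and the score threshold is an assumed value; a weak correlation is treated as the detection error mentioned among the stop conditions, and the returned center pixel can then be fed into the perpendicular point-to-line distance sketched earlier.

```python
# Hedged template-matching sketch (OpenCV normalized cross-correlation).
import cv2
import numpy as np

def locate_target(img, template, min_score=0.7):
    res = cv2.matchTemplate(img, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)    # best correlation and its location
    if score < min_score:
        return None                               # weak correlation: detection error
    h, w = template.shape
    return np.array([top_left[0] + w / 2.0,
                     top_left[1] + h / 2.0])      # center pixel of the best match
```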
- the markers 21, 31 can be identified and localized in the image space by visual imaging and image processing methods on the images of the imaging units 20, 30.
- Image keypoints can be used to determine the target feature 11 on the images of the imaging units 20, 30; this can be invariant to image rotation and image scaling, and less sensitive to variation in the lighting conditions and colors.
- Fig. 8 schematically illustrates a constellation-based matching usable for identifying a target feature, which is a variant of image keypoint-based matching.
- This can involve a machine learning-based or conventional detection algorithm for prominent objects on the images of the imaging units 20, 30. Whether or not an object is prominent is determined by the workpiece type.
- many PCBs have via holes on the outer layer, which have a precise position on the board and are distinct enough to detect and localize.
- a set of via holes can be considered as a 2D point cloud on the image (“constellation”), which maps the image, and features can be localized relative to this constellation using e.g. homography. This can be performed when defining the target feature 11 on the reference image, and the resulting constellation can be compared with the constellation identified on the actual camera image during the visual servo process.
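A sketch of the constellation-based localization via a homography follows, assuming the via-hole centers have already been detected and put into point correspondence between the reference image and the live camera image (at least four point pairs are needed for cv2.findHomography); the function name is an assumption for the sketch.

```python
# Hedged sketch: map the stored target pixel from the reference image into the
# live image using a RANSAC-estimated homography between the two constellations.
import cv2
import numpy as np

def project_target(ref_points, live_points, ref_target_px):
    H, _ = cv2.findHomography(np.float32(ref_points), np.float32(live_points),
                              cv2.RANSAC, 3.0)
    src = np.float32([[ref_target_px]])             # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(src, H)[0, 0]   # target position on the live image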
- Fig. 9 schematically illustrates a template matching usable for identifying a target feature, e.g. by using normalized cross correlation.
- multiple reference images, e.g. in different colors, can be prepared for defining the target feature, and these can be associated with a single target feature 11, while an autonomous selection of the best fitting initial reference image can be provided for the visual servo operation.
- the reference images corresponding to target features 11 can also be masked to further improve robustness. Any kind of image processing/lighting-based preparation can be applied to improve robustness.
- Fig. 10 schematically illustrates a practical application of a preferred embodiment with a workpiece 13, which is not aligned with respect to locating pins 60
- Fig. 11 schematically illustrates an enlarged part of Fig. 10 showing the workpiece 13 in a fixture-aligned position where the workpiece 13 is in abutment with the locating pins 60.
- the invention is used for screwing and unscrewing with a screwing tool 12. If the spatial virtual axis is constrained to the given screw, it may be useful to have two spatial virtual axes 40, one for the tool 12 and one for the screw.
- it is possible to calibrate the spatial virtual axis 40 associated with the current screw against the virtual axis of the tool 12, or even against the TCP frame, each time the screw is gripped; or it is possible to use the image virtual axes associated with the screw for fine positioning.
- in this case, the reference feature 10 is a feature the position of which varies relative to the TCP frame per screw grip.
- the action can be e.g. a twisting task, or some kind of translation and a twisting task.
- the invention does not necessarily require the workpiece 13 to be in an instrumented, exactly located state, (i.e. in known position and orientation).
- the workpiece 13 may be clamped without being precisely located in the positioning plane. In non-contact type action cases, neither clamping nor locating may be necessary. In case the workpiece is not in an exactly located state, the maximum deviation in the position of the workpiece 13 shall be such that the condition that the target feature 11 is in the field of view of the imaging units 20, 30 at the initial position of the fine positioning is still satisfied. This can be achieved, for example, by placing the workpiece 13 in the robot's workspace with a slight orientation error, e.g. by manually placing the workpiece 13 in a visually marked position without using locating pins. For larger position and/or orientation errors, an accuracy enhancement technique, e.g. homography-based measurement using an image of a larger portion of the workpiece, possibly containing identifiable portions of the workspace as well, can be used to determine the approximate in-plane position and/or orientation error of the workpiece 13. This further enhances the flexibility of the invention, as it does not necessarily require instrumentation, which means that actions can be performed on workpieces 13 arranged on a simple flat surface, such as a table.
- the invention overcomes the disadvantages of prior art solutions of conventional positioning, which are due to accumulation of errors in robot, workpiece, fixturing and tool manufacturing, as well as errors in assembly and control (measurement, resolution) and errors in imaging and image processing. This often results in insufficient accuracy during positioning, which makes it impossible to successfully perform the desired action, or necessitates exceptional manufacturing and assembly precision, control and imaging resolution for the positioning to be feasible with the desired precision.
- one of the reference and target features 10, 11 is bound to the robotic cell or even to the robot, and can be set when the robotic cell is installed, but the other one of the reference and target features 10, 11 is bound to the application (i.e. the workpiece or object on which the action is performed).
- the definition of one of the said features can be flexible and simple so that it can be easily set from a model, or manually by a teaching operator.
- a low-cost, compact, robot-mountable measurement tool is provided by the invention, e.g. for the realization of flexible PCB measurement and visual servo function.
- the measuring tool may carry LED light sources together with their control modules to enable robust image processing with the imaging units 20, 30. Their role is to counteract any uncertainties of environmental lighting conditions.
- a test-pin holding probe has an elongated shape so that the measurement tool can reach measurement points even in the close vicinity of higher components on the PCB, such as heatsinks.
- the test-pin is fixed in the probe with a screw, which also provides the electric contact for the crimped positive terminal cable of the measuring instrument.
- a spring-loaded pin (pogo pin probe) is used for the measurement, which makes robot control simpler and more robust.
- the robot TCP is configured so that the origin of the TCP frame coincides with the tip of the pin and the Z axis of the TCP frame is aligned with the axis of the test-pin.
- the testpin can be easily replaced by unscrewing the fastener in case of damage or wear.
- the imaging units 20, 30 may be fixed at such an angle that their axes intersect the axis of the test-pin under the tip of the pin. In this way, the images of the imaging units 20, 30 cover the measured feature as well as the probe.
- the probe preferably carries two marker points per imaging unit as shown in Figs. 1, 3 and 4, which preferably indicate the axis of the test-pin.
- the invention is not only applicable to a single static target feature 11 , but it can also be extended to tracking, which can be conceived e.g. as a sequence of static target features 11 instead of a single static target feature 11 .
- a dynamic target feature 11 may also be used, which target feature 11 is a point of interest in motion.
- Another possibility is to perform trajectory tracking in two steps, first, performing fine positioning for a target position sequence above the workpiece 13, e.g.
- Straight-line tracking can also be implemented by defining two target features 11 instead of one target feature 11 , and detecting and localizing both of them in both images of the imaging units 20, 30.
- the two target features 11 define a vector that makes an angle with the image virtual axes in both images of the imaging units 20, 30.
- the angle between the image virtual axis and the target feature vector can be adjusted to the target angle value.
- a linear trajectory can be traced between the two target features, or along the line defined by them, in the direction corresponding to the spatial virtual axis 40.
- multiple robots can also be applied for example for PCB diagnostic tasks, for cases when different terminals need to be in contact with different target points, simultaneously.
- a manipulator with less degrees of freedom than a robotic arm can be applied.
- the spatial virtual axis 40 is preferably calibrated to the robot TCP frame, and thereafter the reference feature 10 is preferably calibrated to the virtual axis 40 or to the TCP frame.
- the cameras are assumed to be pin-hole cameras, or other cameras that are calibrated and have perspective, i.e. line preserving, undistorted images.
- Automated, semi-automated or manual geometric calibration can be performed using a calibration device and method that is able to calibrate the system in two levels:
- firstly, the spatial virtual axis 40 of e.g. a tool can be calibrated relative to the robot joint it is mounted to, for example to the Z axis of the joint; this involves the geometric relation of the imaging unit and the robot joint, as well as the geometric relation of the marker points relative to the robot joint.
- This can be realized, for example, by applying visual servoing to a target feature 11 using different robot TCP orientations.
- in this way, the spatial virtual axis 40 is going to intersect the same target feature 11 in different orientations.
- in each orientation, the TCP frame can be measured relative to the robot base.
- from these measurements, the orientation of the spatial virtual axis 40 can be determined e.g. using the least squares method.
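One simple way to realize the least-squares determination is a line fit over the TCP positions recorded after servoing to the same target feature in several orientations. The sketch below is a simplified, illustrative fit via SVD, not the exact calibration procedure of the source.

```python
# Hedged sketch: estimate the axis direction as the principal direction of the
# recorded TCP positions (least-squares 3D line fit via SVD).
import numpy as np

def fit_axis_direction(tcp_points):
    P = np.asarray(tcp_points, dtype=float)
    centered = P - P.mean(axis=0)             # center the point cloud
    _, _, vt = np.linalg.svd(centered)        # SVD gives the best-fit direction
    return vt[0]                              # unit vector along the fitted line
```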
- Geometric calibration of the position of the reference feature 10 relative to the identified spatial virtual axis 40 can be performed for example manually: after visual servoing (with or without performing the action afterwards), the reference feature 10 can be moved, e.g. by jogging the robot, to meet the target feature with the desired geometric relation.
- the manually introduced displacement shows the spatial position of the reference feature 10 relative to the spatial virtual axis 40.
- An alternative, automated calibration method can be, for example, performing visual servoing to one or a set of artificial target features 11 with known positions relative to the robot base frame, and analyzing the images of the imaging units 20, 30.
- the reference feature 10 can be determined in the images of the imaging units 20, 30 relative to the target feature(s) 11 in the image space, and hence (e.g. by using homography) its physical position can also be determined in the robot base frame, in the robot workspace or in the robot TCP frame.
- both calibrations can be avoided if the spatial virtual axis 40 coincides with one of the main axes of the TCP frame, and the reference feature 10 is coincident with the spatial virtual axis 40.
- A novel visual servoing method and system are provided e.g. to overcome the challenges of automated PCB measurement in repair shop environments.
- the method can handle inaccurately located workpieces and move the tool, e.g. measurement probe to a measurement point with the required accuracy.
- the invention provides the following advantages:
- the teaching process only requires pixel point selections on the camera images and an approximate positioning of the robot TCP;
- the visual servo method is capable of compensating positioning errors of a couple of millimeters, and the workspace of the robot arm allows large variation in PCB size;
- the method is less sensitive to deviations along/around the axes defined by the undefined degrees of freedom relative to the set initial pose;
- a further aspect of the invention is a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the steps of the inventive positioning method.
- Embodiments of the invention can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment including both hardware and software elements.
- An embodiment may be implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
Abstract
A positioning method is disclosed for positioning a reference feature (10) with respect to a target feature (11) along a positioning plane within a workspace accessible by a robotic manipulator, one of said reference and target features (10, 11) being on a first object attached to the robotic manipulator and the other of said reference and target features (10, 11) being on a second object arranged in the workspace and detached from the robotic manipulator. First and second positioning steps are performed and repeated until at least one stop condition is achieved, the at least one stop condition comprising the condition that the calculated respective physical distance is within a threshold limit. A positioning system, a computer program product and a computer-readable medium for implementing the method are also disclosed.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| HUP2300241 | 2023-07-06 | | |
| HUP2300241 | 2023-07-06 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025008652A1 (fr) | 2025-01-09 |
Family
ID=94171330
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/HU2024/050055 (WO2025008652A1, pending) | Positioning method and positioning system | 2023-07-06 | 2024-07-05 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025008652A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6208375B1 (en) | 1999-05-21 | 2001-03-27 | Elite Engineering Corporation | Test probe positioning method and system for micro-sized devices |
| US20150120047A1 (en) | 2013-10-31 | 2015-04-30 | Seiko Epson Corporation | Control device, robot, robot system, and control method |
| JP2017050376A (ja) * | 2015-09-01 | 2017-03-09 | 富士電機株式会社 | 電子部品実装装置及び電子部品実装方法 |
| CN116323115A (zh) * | 2020-10-08 | 2023-06-23 | 松下知识产权经营株式会社 | 控制装置、机器人臂系统以及机器人臂装置的控制方法 |
- 2024-07-05: WO application PCT/HU2024/050055 filed as WO2025008652A1, pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6208375B1 (en) | 1999-05-21 | 2001-03-27 | Elite Engineering Corporation | Test probe positioning method and system for micro-sized devices |
| US20150120047A1 (en) | 2013-10-31 | 2015-04-30 | Seiko Epson Corporation | Control device, robot, robot system, and control method |
| JP2017050376A (ja) * | 2015-09-01 | 2017-03-09 | 富士電機株式会社 | 電子部品実装装置及び電子部品実装方法 |
| CN116323115A (zh) * | 2020-10-08 | 2023-06-23 | 松下知识产权经营株式会社 | 控制装置、机器人臂系统以及机器人臂装置的控制方法 |
| US20230219231A1 (en) * | 2020-10-08 | 2023-07-13 | Panasonic Intellectual Property Management Co., Ltd. | Control apparatus for controlling robot arm apparatus that holds holdable object |
Non-Patent Citations (1)
| Title |
|---|
| JOSHI R ET AL: "Application of feature-based multi-view servoing for lamp filament alignment", ROBOTICS AND AUTOMATION, 1996. PROCEEDINGS, 1996 IEEE INTERNATIONAL CONFERENCE ON, MINNEAPOLIS, MN, USA, 22-28 APRIL 1996, NEW YORK, NY, USA, IEEE, US, vol. 2, 22 April 1996 (1996-04-22), pages 1306-1313, XP010162925, ISBN: 978-0-7803-2988-1, DOI: 10.1109/ROBOT.1996.506887 * |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24752147; Country of ref document: EP; Kind code of ref document: A1 |