
WO2025077953A1 - A method of generating instructions for a stationary robot and a system for performing this method - Google Patents

A method of generating instructions for a stationary robot and a system for performing this method

Info

Publication number
WO2025077953A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
data
sensor unit
work tool
processing module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CZ2024/050063
Other languages
French (fr)
Inventor
Megi MEJDRECHOVA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robotwin SRO
Original Assignee
Robotwin SRO
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robotwin SRO filed Critical Robotwin SRO
Publication of WO2025077953A1 publication Critical patent/WO2025077953A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B05SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05BSPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B3/00Spraying or sprinkling apparatus with moving outlet elements or moving deflecting elements
    • B05B3/02Spraying or sprinkling apparatus with moving outlet elements or moving deflecting elements with rotating elements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B05SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05BSPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
    • B05B13/00Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
    • B05B13/02Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work
    • B05B13/04Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation
    • B05B13/0431Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to three-dimensional [3D] surfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/408Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by data handling or data format, e.g. reading, buffering or conversion of data
    • G05B19/4086Coordinate conversions; Other special calculations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/414Structure of the control system, e.g. common controller or multiprocessor systems, interface to servo, programmable interface controller
    • G05B19/4148Structure of the control system, e.g. common controller or multiprocessor systems, interface to servo, programmable interface controller characterised by using several processors for different functions, distributed (real-time) systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41805Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by assembly
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36184Record actions of human expert, teach by showing
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36436Arm follows movement of handheld device, camera detects, analyses motion
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36442Automatically teaching, teach by showing
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40391Human to robot skill transfer

Definitions

  • the present invention relates to a method and system for generating instructions for a stationary robot based on a demonstration of a work operation for surface treatment of a piece. More particularly, the invention relates to generating instructions for e.g. a painting robot.
  • the generated program should include not only motion instructions, i.e. information about the trajectory along which the robotic work tool should move, but also additional instructions for controlling the robotic work tool, e.g. when the robotic work tool should perform the work operation and when it should not, i.e. e.g. when the robot’s painting gun should be activated.
  • the desired solution should also allow the demonstrated work operation to be recorded sufficiently accurately and reliably such that the automated work operation qualitatively corresponds to the manually performed work operation. At the same time, the recording of the demonstrated work operation should not affect the course of the work operation or restrict the user performing the work operation in any way.
  • step of monitoring the position and orientation of the hand work tool comprises a step of recording acceleration and angular velocity over time by an inertial sensor and a step of recording visual images of the surrounding environment at different time points by at least one camera,
  • the calibration data provided to the data processing module comprises information about the relative spatial relationship between the reference coordinate systems of the sensor unit, the hand work tool, the stationary robot, and the piece, and
  • step of processing the data comprises a step of generating instructions for the stationary robot in a programming language of at least one stationary robot manufacturer.
  • the above method makes it possible to generate an executable robotic program (instructions) for stationary robots of different manufacturers, such that the program includes all essential characteristics of the work operation for surface treatment of the piece.
  • the work operation is monitored with high accuracy and reliability, wherein the recording of the work operation by means of the sensor unit attached to the hand work tool does not affect in any way the course of the work operation so performed, nor does it restrict the person handling the hand work tool in any way.
  • the work operation for surface treatment of the piece may be, for example, industrial painting (wet or powder), but also sanding, polishing, chamfering, blasting, etc.
  • the work operation for surface treatment of the piece may also be a quality check performed in connection with the surface treatment, i.e. for example a quality check of the painted piece.
  • stationary robot refers to a robot that cannot move from one place to another by its own means.
  • a stationary robot may be an industrial robot, e.g. a painting robot, or a collaborative robot.
  • the stationary robot may be fastened to an external mechanism adding an additional degree of freedom, such as a rail conveyor.
  • the generated instructions may then also include instructions for controlling this additional mechanism.
  • the generated instructions can also be used not only for the real stationary robot, but also for a virtual stationary robot, i.e. the so-called digital twin thereof. This can be used for simulation purposes.
  • the hand work tool is an actual work tool that allows the desired work operation for surface treatment of the piece to be performed.
  • this hand work tool is a hand painting gun.
  • the monitoring of the position and orientation (i.e. rotation) of the hand work tool by the sensor unit is accomplished by the combined use of the inertial sensor and the at least one camera, wherein multiple cameras may also be used to increase accuracy.
  • the camera together with the inertial sensor can be collectively referred to as a so-called tracking camera, wherein it should be noted that its output is data with six degrees of freedom, namely the position of the tracking camera over time in three spatial axes and the orientation of the tracking camera over time in three spatial axes. That is, the position and orientation data in its reference coordinate system. Since the sensor unit is attached to the hand work tool and since these recorded data are then transformed using the calibration data, it can be said that the position and orientation of the hand work tool is monitored by the sensor unit as described above.
  • the tracking camera may comprise its own computational unit.
  • the computational unit of the tracking camera preferably comprises a detection module for detecting visual elements in the camera images.
  • These visual elements are, for example, various visually prominent points in the surrounding environment, e.g. corners or edges visible in the images.
  • the detection of the visual elements is advantageous primarily because there is an order of magnitude reduction in the number of points to be considered, typically from thousands of pixels (depending on the resolution of the camera) to a few hundred visual elements.
  • Each visual element is marked, spatially mapped, and arranged into a vector of visual elements. After a new image is acquired, the detection module compares the newly marked elements arranged in the newly constructed visual element vector with those in the previous vector and computes the change in position based on this comparison.
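  • As an illustration only (the description does not name the detection algorithm), the following sketch shows how such visual elements could be detected and matched between consecutive images using ORB keypoints in OpenCV; all function and parameter choices are assumptions made for the example.

```python
# Illustrative sketch only: the description does not name the detection algorithm.
# ORB keypoints stand in for the "visual elements"; matching two consecutive
# frames yields the 2D displacement of the matched elements.
import cv2

orb = cv2.ORB_create(nfeatures=500)                      # a few hundred elements per image
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def detect_elements(image):
    """Detect visual elements (keypoints) and compute their descriptor vector."""
    return orb.detectAndCompute(image, None)

def element_shifts(prev_image, new_image):
    """Match elements between two images and return their pixel displacements."""
    kp1, des1 = detect_elements(prev_image)
    kp2, des2 = detect_elements(new_image)
    if des1 is None or des2 is None:                     # no elements found in one image
        return []
    matches = matcher.match(des1, des2)
    return [
        (kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0],
         kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1])
        for m in matches
    ]
```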
  • the position and orientation may be computed from the recorded data elsewhere than in the tracking camera, e.g. by a microcomputer in the control unit or by a data processing module that may be part of a remote server.
  • the output of the tracking camera may be only acceleration over time, angular velocity over time, and visual images, wherein the actual computation of the position and orientation may be performed afterwards, e.g. by a microcomputer in the control unit or by the data processing module.
  • the sensor unit may preferably also comprise a magnetometer to make the computation of the position and orientation more accurate.
  • the step of monitoring the position and orientation of the hand work tool is performed in parallel with the step of monitoring the work state of the hand work tool, which is performed by the work state sensor.
  • the work state sensor provides information on whether the hand work tool is activated.
  • the work state sensor may provide a signal which can be analyzed to obtain the desired work state of the hand work tool. This analysis may be a comparison of the measured signal values with predetermined threshold values.
  • the analysis of this signal to obtain the work state may be performed, for example, by the data processing module. Alternatively, this analysis may be performed e.g. by a microcomputer in the control unit or by the computational unit of the work state sensor, or by another computational unit.
  • other information may also be obtained from the work state sensor signal, e.g. what specific activity of the given work operation was performed at a certain time, etc.
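  • As a minimal sketch of the threshold comparison mentioned above, assuming the work state sensor delivers a sampled scalar signal (e.g. a sound level); the threshold and window values are hypothetical and would be tuned per sensor and tool.

```python
# Hypothetical sketch: the sensor, its signal shape, and the threshold value
# are assumptions for illustration only.
import numpy as np

ACTIVATION_THRESHOLD = 0.2          # assumed value, tuned per sensor and tool
SMOOTHING_WINDOW = 50               # assumed number of samples for the envelope

def work_state(signal):
    """Return a boolean activation flag for every sample of the sensor signal.

    The rectified signal is smoothed with a moving-average window and the
    resulting envelope is compared against the predetermined threshold.
    """
    signal = np.abs(np.asarray(signal, dtype=float))
    kernel = np.ones(SMOOTHING_WINDOW) / SMOOTHING_WINDOW
    envelope = np.convolve(signal, kernel, mode="same")
    return envelope > ACTIVATION_THRESHOLD
```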
  • the recorded data from the inertial sensor, the camera, and the work state sensor are sent to the data processing module.
  • the recorded data may also be sent to the data processing module continuously.
  • the calibration data to be provided to the data processing module are also sent to this module.
  • the provided calibration data comprise at least information about the relative spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system C of the sensor unit, information about the relative spatial relationship between the reference coordinate system C of the sensor unit and the reference coordinate system P of the piece, and information about the relative spatial relationship between the reference coordinate system R of the stationary robot and the reference coordinate system P of the piece.
  • the information about the relative spatial relationship (in other words, the transformation relationships) between the respective reference coordinate systems is used for transformation between these coordinate systems and may preferably be written in the form of transformation matrices.
  • the provided calibration data preferably comprise at least transformation matrices TGC, TCP, and TRP. Alternatively, however, this information may also be written in another mathematical form.
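  • For illustration, the transformation relationships can be represented as 4x4 homogeneous matrices and chained by multiplication; the sketch below uses placeholder values and an assumed direction convention (T_XY maps coordinates expressed in system X into system Y).

```python
# Illustration only: numeric values are placeholders, and the direction
# convention (T_XY maps coordinates expressed in system X into system Y)
# is an assumption made for this example.
import numpy as np

def transform(rotation, translation):
    """Build a 4x4 homogeneous transformation matrix from R (3x3) and t (3,)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Placeholder calibration results; in practice these come from the
# calibration measurement described below.
T_GC = transform(np.eye(3), [0.00, 0.05, 0.12])   # hand work tool G  -> sensor unit C
T_CP = transform(np.eye(3), [0.40, 0.00, 0.80])   # sensor unit C     -> piece P
T_RP = transform(np.eye(3), [1.20, 0.30, 0.00])   # stationary robot R -> piece P

# A point expressed in one coordinate system is mapped into another by
# matrix multiplication, e.g. piece coordinates of a point given in the
# sensor unit frame:
point_in_C = np.array([0.1, 0.0, 0.3, 1.0])        # homogeneous coordinates
point_in_P = T_CP @ point_in_C
```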
  • the method preferably comprises the step of performing the calibration measurement.
  • the calibration data are measured, but in general can be obtained by another method and provided to the data processing module.
  • the calibration measurement may be performed at various stages of the method, wherein at least a part of the calibration is performed at the beginning, during, or at the end of the step of monitoring the position and orientation when the recording of the position and orientation data is still active and the recording of the data has not been restarted.
  • Unique visual markers are preferably used for the step of performing the calibration measurement, wherein the information about the relative spatial relationship between the reference coordinate systems G and C, the information about the relative spatial relationship between the reference coordinate systems C and P, the information about the relative spatial relationship between the reference coordinate systems R and P, and additionally the information about the relative spatial relationship between the reference coordinate systems C and R, is obtained by detecting the visual markers by the at least one camera and by determining the relative orientation of the sensor unit with respect to the visual markers.
  • the uniqueness of the visual markers is that each visual marker includes its own identifier to distinguish it from the other visual markers. Thus, when the camera detects a visual marker, it can identify which visual marker it is and associate this information with its position information.
  • This visual marker can be e.g. a QR code.
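  • A hedged sketch of how the relative position and orientation of the sensor unit with respect to such a visual marker could be determined from a single camera image, assuming the marker's corner geometry and the camera intrinsics are known; the concrete detector, marker size, and use of solvePnP are assumptions, since the description only requires that the markers be detectable and uniquely identifiable.

```python
# Hedged sketch: the marker size, camera intrinsics, and use of solvePnP are
# assumptions made for this example.
import cv2
import numpy as np

MARKER_SIZE = 0.04                                  # assumed 40 mm square marker
marker_corners_3d = np.array([                      # corner coordinates in the marker frame
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

def marker_pose(image_corners_2d, camera_matrix, dist_coeffs):
    """Estimate the marker pose in the camera frame from its detected corners.

    image_corners_2d: 4x2 pixel coordinates of the detected marker corners.
    Returns a 4x4 homogeneous transform of the marker in the camera frame.
    """
    ok, rvec, tvec = cv2.solvePnP(
        marker_corners_3d,
        np.asarray(image_corners_2d, dtype=np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("marker pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)               # rotation vector -> rotation matrix
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = tvec.ravel()
    return T
```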
  • the step of the calibration measurement comprises four parts of the calibration that use calibration fixtures in addition to the visual markers.
  • at least one first calibration fixture with at least one calibration opening and a set of unique first visual markers and also a second calibration fixture comprising a unique second visual marker are provided for the calibration measurement, wherein the relative position of the second calibration fixture and the piece is fixed.
  • the second calibration fixture may be located on e.g. a frame on which the piece is suspended. The relative positions of the first visual markers with respect to each calibration opening are known.
  • the step of performing the calibration measurement using the visual markers comprises the following four parts, wherein the first part comprises the steps of:
  • the second part of the calibration comprises the steps of:
  • the third part of the calibration comprises the steps of:
  • the fourth part of the calibration comprises the steps of:
  • the first and third parts of the calibration are performed before, during, or at the end of the step of monitoring the position and orientation when the recording of the position and orientation data is still active and the recording of the data has not been restarted.
  • the second and fourth parts of the calibration i.e. calibration using the robotic work tool
  • the order of the individual parts of the calibration does not have to be fixed and e.g. the last steps of the sub-parts of the calibration (obtaining information about the spatial relationship, i.e. specifically e.g. constructing the respective transformation matrices) may be performed later.
  • an additional sensor unit may be used for the second and fourth parts of the calibration.
  • a single sensor unit may be used for the entire calibration and may be transferred between the hand work tool and the robotic work tool to perform the individual parts of the calibration.
  • the user may give a signal to the control unit using the user interface of the sensor unit, or alternatively using the user interface of the control unit.
  • the calibration measurement may be performed on a purely mechanical principle, but the process of calibration using the visual markers is easier, faster and, thanks to the non-contact (optical) detection of the visual markers, more accurate, as there is no risk of the piece moving from its rest position when pushed by the hand work tool or the robotic work tool into the calibration fixture, which is fastened e.g. to the frame on which the piece is suspended. At the same time, the risk of incorrect insertion of the work tool into the calibration opening caused by human error is eliminated.
  • Calibration of the sensor unit with respect to the hand work tool is implemented using a cube-shaped calibration fixture with defined openings located on the individual cube walls, which, by the principle of mechanical lock and key, allow the user to insert the tip of the hand work tool into them in exactly one defined way, i.e. with a specific position and orientation.
  • the relative positions of these openings are defined and known.
  • the user inserts the tip of the hand work tool with the sensor unit fastened sequentially into the individual openings in a defined sequence and, when the hand work tool is correctly placed, the user gives a signal to the control unit, e.g. via the user interface of the sensor unit, to store the position and orientation information provided by the sensor unit.
  • the user performs calibration of the reference coordinate system C of the sensor unit with the reference coordinate system P of the piece.
  • This part of the calibration is performed using a calibration fixture that includes an opening which, by the principle of mechanical lock and key, allows the user to insert the tip of the hand work tool into it in exactly one defined way, i.e. with a specific position and orientation.
  • the relative position of this calibration fixture and the piece is fixed but does not have to be known. However, it must remain constant during the recording of the work operation and during the third part of the mechanical calibration (stationary robot - piece).
  • the user performs the calibration by inserting the hand work tool into the opening of the calibration element and gives a signal to the control unit to store the position and orientation data provided by the sensor unit at that moment.
  • This data is then used to obtain information about the spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system P of the piece. Specifically, a corresponding transformation matrix can thus be constructed.
  • This part of the calibration is performed at the beginning of each demonstration of the work operation for which the user wants to generate a robotic program.
  • Calibration of the reference coordinate system R of the stationary robot with respect to the reference coordinate system P of the piece is performed using a calibration fixture that includes an opening which, by the principle of mechanical lock and key, allows the tip of the robotic work tool to be inserted into it in exactly one defined way, i.e. with a specific position and orientation.
  • the relative position of this calibration fixture and the piece is fixed but does not have to be known. However, it must remain constant during the recording of the work operation and during the calibration of the stationary robot with respect to the piece.
  • the user performs the calibration by guiding the robotic work tool into the opening of this calibration element using the stationary robot controller and noting the position and orientation of the robotic work tool in the reference coordinate system R of the stationary robot using the robot controller.
  • This data is then used to construct a transformation matrix defining the spatial relationship between the reference coordinate system R of the stationary robot and the reference coordinate system P of the piece, i.e. information about the relative spatial relationship of the reference coordinate systems R and P is obtained.
  • This part of the calibration only needs to be performed once, provided that the piece for which the robotic instructions are generated will subsequently always be located in the same place with respect to the stationary robot that will execute the generated robotic program thereon.
  • the step of generating instructions for the stationary robot is preferably performed by an instruction generator comprising information about syntax, semantics, required structure and instruction format for the stationary robot of at least one stationary robot manufacturer, wherein the instructions are written by the instruction generator in the required structure and format based on the syntax and semantics information and include keywords and argument values that are the result of the step of processing the data by the data processing module.
  • These argument values are the resulting position and orientation coordinates in the reference coordinate system R of the stationary robot, which are the result of the steps of processing the data that precede the step of generating the instructions.
  • the resulting position and orientation coordinates may be expressed in another user coordinate system, which, however, has a fixed and known transformation into the reference coordinate system R of the stationary robot.
  • Alternatively, these argument values may be the resulting position and orientation coordinates after modification, for example such that the generated instructions lead to the automated operation being performed faster or slower than it was demonstrated by the hand work tool during the monitoring of the position and orientation.
  • the step of processing the data by the data processing module further preferably comprises, prior to the step of generating the instructions for the stationary robot, the step of filtering the data, the step of time synchronizing the data, the step of dynamic subsampling, and the step of applying spatial transformation based on the provided calibration data.
  • the step of processing the data does not have to comprise all of these steps; however, the steps of time synchronizing the data and applying spatial transformation based on the provided calibration data are essential for the proper function.
  • applying spatial transformation entails multiplying the calibration matrices to obtain the resulting transformation relationship. This resulting transformation relationship allows the transformation of the measured data into the reference coordinate system of the stationary robot in order to subsequently generate instructions that, according to the convention of the programming language of the stationary robot, are expected to be expressed in the coordinate system of the robot.
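  • A minimal sketch of this chain of matrix multiplications, assuming the calibration matrices are available as 4x4 homogeneous transforms with the same assumed direction convention as in the earlier example (T_XY maps coordinates expressed in system X into system Y):

```python
# Minimal sketch under the assumed convention:
# T_XY maps coordinates expressed in system X into system Y.
import numpy as np

def to_robot_frame(poses_in_C, T_CP, T_RP):
    """Transform recorded sensor-unit poses into the robot reference frame R.

    poses_in_C : list of 4x4 pose matrices expressed in the sensor unit frame C.
    T_CP       : calibration matrix C -> P (sensor unit to piece).
    T_RP       : calibration matrix R -> P (stationary robot to piece).
    Multiplying the calibration matrices yields the resulting relationship
    C -> R, which is then applied to every recorded pose.
    """
    T_CR = np.linalg.inv(T_RP) @ T_CP          # (P -> R) @ (C -> P) = C -> R
    return [T_CR @ pose for pose in poses_in_C]
```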
  • the time synchronization is used to synchronize the measured data, i.e. data from the inertial sensor, the camera, and the work state sensor. This ensures that a particular camera image corresponds to the same moment at which the signal was recorded by the work state sensor and the inertial sensor.
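  • A minimal sketch of such time synchronization, assuming each data stream carries its own timestamps; here the work-state signal is resampled onto the pose timestamps by linear interpolation, which is only one possible choice (the actual synchronization method is not prescribed here).

```python
# Minimal sketch: timestamps and the interpolation choice are assumptions.
import numpy as np

def synchronize(pose_times, poses, state_times, state_values):
    """Resample the work-state signal onto the pose timestamps.

    pose_times, state_times : 1D arrays of timestamps in seconds.
    poses                   : poses recorded at pose_times.
    state_values            : work-state samples (e.g. 0/1) at state_times.
    Returns the poses together with the work state aligned sample by sample,
    so that each pose corresponds to the work state at the same moment.
    """
    state_at_pose_times = np.interp(pose_times, state_times, state_values)
    return poses, state_at_pose_times > 0.5    # re-binarize after interpolation
```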
  • Dynamic subsampling makes it possible to generate a robotic program with a small enough number of robotic instructions to remain easily readable, understandable, and editable by the user, while capturing all the significant elements of the recorded work operation so that it can be replicated with sufficient quality by the stationary robot.
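  • The description does not prescribe a particular subsampling algorithm; one common way to keep only the significant points of a dense trajectory is the Ramer-Douglas-Peucker simplification, sketched below for positions only as an assumed example.

```python
# Assumed example: Ramer-Douglas-Peucker simplification of the recorded
# trajectory, positions only.
import numpy as np

def rdp(points, epsilon):
    """Simplify a polyline (Nx3 positions), keeping deviations above epsilon [m]."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    chord_len = np.linalg.norm(chord)
    if chord_len == 0.0:
        dists = np.linalg.norm(points - start, axis=1)
    else:
        # perpendicular distance of every point from the start-end chord
        dists = np.linalg.norm(np.cross(points - start, chord), axis=1) / chord_len
    i = int(np.argmax(dists))
    if dists[i] > epsilon:
        left = rdp(points[: i + 1], epsilon)    # keep the farthest point, recurse
        right = rdp(points[i:], epsilon)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])              # everything in between is dropped
```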
  • the step of monitoring the work state is preferably performed by a sound sensor recording a sound signal during the step of performing the work operation for surface treatment of the piece.
  • the sound signal may be analyzed e.g. by a work state computational unit but may also be analyzed e.g. by a microcomputer in the control unit or by the data processing module.
  • the analysis of the sound signal can also provide e.g. information about which sub-activity of a given work operation was performed at a given time, if these sub-activities differ in the sound they produce.
  • a system for performing the method of the present invention comprises a sensor unit and a control unit communicatively connected to the sensor unit and adapted for controlling the sensor unit, wherein the sensor unit comprises: an attachment element for attachment to the hand work tool, an inertial sensor for recording acceleration and angular velocity over time, at least one camera for recording visual images of the surrounding environment at different time points, and a work state sensor for monitoring the work state of the hand work tool.
  • the system further comprises a data processing module adapted to receive data from the inertial sensor, the at least one camera, and the work state sensor, as well as calibration data including information about the relative spatial relationship between the reference coordinate systems of the sensor unit, the hand work tool, the stationary robot, and the piece, wherein the data processing module comprises an instruction generator comprising an interface for generating instructions for the stationary robot in a programming language of at least one stationary robot manufacturer.
  • the system of the present invention further preferably comprises at least one first calibration fixture with at least one calibration opening and a set of unique first visual markers, wherein the relative positions of the first visual markers with respect to each calibration opening are known, and wherein the at least one calibration opening is adapted in shape to receive the hand work tool in only one particular position and orientation, and the at least one calibration opening is adapted in shape to receive the robotic work tool in only one particular position and orientation, wherein the system further comprises a second calibration fixture comprising a unique second visual marker, wherein the relative position of the second calibration fixture and the piece is fixed. Thanks to these calibration fixtures and visual markers it is possible to perform the preferable calibration using the visual markers, which is easier, more accurate, and faster than purely mechanical calibration.
  • the work state sensor is preferably a sound sensor, which enables reliable and sufficiently sensitive detection of the work state of the hand work tool.
  • the work state sensor may also be a different sensor operating on a different principle, e.g. a Hall sensor, a flow meter that may, e.g. in the case of a painting process, measure the flow of the paint through a supply hose, or an ammeter that may measure the current flow in the case of powder painting, etc.
  • the control unit preferably comprises a microcomputer for controlling the sensor unit, a memory for storing data from the sensor unit, and a communication module for communicative connection to the data processing module.
  • This basic arrangement enables efficient and reliable control of data collection and provision of the data to the data processing module.
  • the system further preferably comprises a remote server communicatively connected to the control unit, wherein the data processing module is preferably part of the remote server.
  • the data processing module does not need to be a part of the remote server and may be e.g. a part of the microcomputer of the control unit, which would mean that this microcomputer would perform the respective steps of processing the data.
  • the system further preferably comprises a user interface of the data processing module for configuring the instruction generator and also an access device for accessing the user interface of the data processing module, wherein the user interface of the data processing module is communicatively connected to the data processing module.
  • This user interface of the data processing module allows the user to configure the details of how the instructions are to be generated for the stationary robot and also allows the generated instructions to be subsequently downloaded. Using this interface it is possible, for example, to select the data to be processed, configure the data processing parameters (e.g. of the dynamic subsampling, etc.) or select the type (manufacturer) of the stationary robot.
  • This user interface of the data processing module may preferably be a web interface that is part of a remote server.
  • the access device may be e.g. a computer, a mobile phone, or other device with web access.
  • the user interface of the data processing module does not need to be implemented as a web interface but may be any other suitable interface, e.g. a device with respective control elements.
  • the system for performing the method of the present invention in the first exemplary embodiment comprises a sensor unit 1, a control unit 2 communicatively connected to the sensor unit 1 and adapted to control the sensor unit 1, and also a remote server 21 comprising a data processing module 3.
  • the remote server 21 including the data processing module 3 is communicatively connected to the control unit 2, wherein this communication connection is provided by the communication module 15 in the control unit 2, as will be described in even greater detail below when describing the arrangement of the control unit 2 in the first exemplary embodiment of the system.
  • the sensor unit 1 in the first exemplary embodiment comprises an inertial sensor 8, a camera 9, a work state sensor 10, an attachment element 11 for attaching the sensor unit 1 to the hand work tool 6, and an extension 12.
  • the computational unit of the tracking camera comprises a detection module, which means that a visual element detection algorithm is applied to the images from the camera 9.
  • These visual elements are, for example, various visually prominent points in the surrounding environment, e.g. corners or edges visible in the images.
  • the detection of the visual elements is preferable primarily because there is an order of magnitude reduction in the number of points to be considered, typically from thousands of pixels (depending on the resolution of the camera 9) to a few hundred visual elements.
  • Each visual element is marked, spatially mapped, and arranged into a vector of visual elements. After a new image is acquired, the detection module compares the newly marked elements arranged in the newly constructed visual element vector with those in the previous vector and computes the change in position based on this comparison.
  • the hand work tool 6 is a hand painting gun and the work operation for surface treatment of the piece 7 is industrial painting.
  • the work state sensor 10 provides information about whether the hand painting gun is activated, in other words, whether the hand painting gun is currently painting. This information is essential for generating the robotic instructions, since it is important for the stationary robot 5 to reproduce not only the motion (position and orientation) of the hand work tool 6 during the subsequent automated work operation but also when the hand work tool 6 has been activated, and thus when the robotic work tool 25 is to be activated.
  • the sound sensor is attached to the inner side of the housing of the sensor unit 1, namely to the inner wall of the housing closest to the hand work tool 6 to which the sensor unit 1 is being fastened.
  • the sensor unit 1 further comprises a user interface 19 of the sensor unit 1, such as control buttons and signal diodes, which are used to control the sensor unit 1 and/or the control unit 2 and to interact with the control unit 2 during the recording of data by the sensor unit 1.
  • the function of the user interface 19 of the sensor unit 1 will be further explained below in the description of the method.
  • the control unit 2 in the first exemplary embodiment of the system as schematically shown in fig. 1 comprises a microcomputer 13, a memory 14, a communication module 15 for communicative connection to the remote server 21, a battery 16 for powering the microcomputer 13, a charging connector 17 for recharging the battery 16, and also a connector for electrical connection to the sensor unit 1 via a cable.
  • the individual components are housed in the housing of the control unit 2, which is indicated in fig. 1 by the dashed line.
  • the housing of the control unit 2 is further provided with at least one strap 18, which allows the user to carry the control unit 2 on the body, for example over the shoulder or on the back, and to move freely with it.
  • the control unit 2 further also comprises the user interface 20 of the control unit, such as control buttons and signal diodes, which are used to control the control unit 2 and/or the sensor unit 1 and to interact with the sensor unit 1 during the recording of data by the sensor unit 1.
  • the function of the user interface 20 of the control unit 2 will be further explained below in the description of the method.
  • the microcomputer 13 in the control unit 2 is powered by the battery 16 and further powers and controls other components of the control unit 2 (user interface 20 of the control unit 2, memory 14, communication module 15), as well as components of the sensor unit 1.
  • the microcomputer 13 also specifically controls the inertial sensor 8, the camera 9, and the work state sensor 10 and controls the collection (recording, uploading) of data and implements the storage thereof in the memory 14 with which it works.
  • the microcomputer 13 also controls the communication module 15, wherein the communication module 15 in the described first exemplary embodiment is a Wi-Fi module and the communicative connection of the control unit 2 and the remote server 21 is implemented as a wireless connection using Wi-Fi technology. This is indicated in fig. 1.
  • the communicative connection of the control unit 2 and the remote server 21 is used to transmit data recorded by the sensor unit 1 from the control unit 2 to the remote server 21, the so-called cloud.
  • Due to the mutual connection between the control unit 2 and the sensor unit 1, the remote server 21 is indirectly connected to the sensor unit 1 via the control unit 2. In other words, it can also be said that the data recorded by the sensor unit 1 are uploaded to the remote server 21 via the control unit 2.
  • When the program is uploaded into the controller of the stationary robot 5 and activated on the stationary robot 5, it implements automated control of the stationary robot 5 such that the robotic work tool 25 on the stationary robot 5 reproduces the required work operation for surface treatment of the piece 7 as performed by the hand work tool 6 with the sensor unit 1 attached.
  • the data processing module 3 comprises an instruction generator 23 comprising an interface for generating instructions in a programming language of at least one manufacturer of the stationary robots 5.
  • the data processing module 3 is also adapted for further processing of the data, which is performed prior to the generation of the instructions using the respective algorithms.
  • the data processing module 3 in the first exemplary embodiment comprises a data filtering module, a data time synchronization module, a dynamic subsampling module, and a spatial transformation application module. The significance of these individual steps of further processing is explained in the description of the method below.
  • the data processing module 3 comprises sufficient computational capacity to generate instructions and to perform this further processing of the data.
  • the input to the data processing module 3, and therefore also to the instruction generator 23, is data from the inertial sensor 8 and the camera 9, wherein in this particular embodiment it is already combined data, i.e. position and orientation over time in the three spatial axes.
  • Another input is also data from the work state sensor 10 comprising information about whether the hand work tool 6 is activated.
  • Still another input is calibration data, which comprises a transformation matrix defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7, as will be described in even greater detail below.
  • Based on this input data, the instruction generator 23 generates instructions for the stationary robot 5, which will also be described in more detail below when describing the method.
  • the system comprises a user interface of the data processing module and also an access device 4 for accessing this user interface of the data processing module 3.
  • this user interface of the data processing module 3 is a web interface that is a part of the remote server 21 and that allows the user to select the data to be processed, configure the data processing parameters, e.g. the type (manufacturer) of the stationary robot 5 for which the instructions are to be generated, and then download the generated instructions from the remote server 21, specifically from the generated instructions storage.
  • the access device 4 is communicatively connected to the remote server 21, particularly to at least the instruction generator 23, and to the generated instructions storage 24.
  • the access device 4 is e.g. a computer, a mobile phone, or other device with web access.
  • this method in the first exemplary embodiment comprises the steps of:
  • step of monitoring the position and orientation of the hand work tool 6 comprises a step of recording acceleration and angular velocity over time by the inertial sensor 8 and a step of recording visual images of the surrounding environment at different time points by the camera 9,
  • the calibration data provided to the data processing module 3 comprise a transformation matrix defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7, and
  • step of processing the data comprises a step of generating instructions for the stationary robot 5 in a programming language of at least one manufacturer of stationary robots 5.
  • the calibration measurement which, in a first exemplary embodiment, is used to obtain the calibration data, namely to obtain four transformation matrices defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7.
  • a transformation matrix TGC is obtained defining the relative spatial relationship between the reference coordinate system G of the hand work tool 6 (denoted by the letter G according to the English “gun”) and the reference coordinate system C of the sensor unit 1 (denoted by the letter C according to the English “camera”, which is a part of the sensor unit 1).
  • This first part of the calibration is implemented using a first calibration fixture 26, which comprises a shape-defined calibration opening.
  • This calibration opening is adapted for the insertion of the tip of the hand work tool 6 in exactly one defined way, i.e. with a specific position and a specific orientation of the hand work tool 6.
  • the first calibration fixture 26 is schematically shown in fig. 2, wherein the insertion of the hand work tool 6 is indicated by an arrow leading to this calibration opening.
  • the first calibration fixture 26 further comprises a set of first visual markers, which are visible after insertion of the hand work tool 6 into the calibration opening by the camera 9 that is a part of the sensor unit 1 attached on the hand work tool 6.
  • the relative positions of these first visual markers and the calibration opening are defined and known.
  • the visual markers are unique and each includes its own identifier to distinguish it from the other visual markers.
  • After inserting the tip of the hand work tool 6 with the fastened sensor unit 1 into the calibration opening and at the moment when the hand work tool 6 is correctly placed, the user gives a signal using the user interface 19 of the sensor unit 1, i.e. for example by pressing a button, to the control unit 2 to store the position and orientation information provided by the sensor unit 1.
  • the sensor unit 1, namely the camera 9, simultaneously detects the first visual markers and determines its relative position and orientation to these individual first visual markers.
  • a transformation matrix TGC is then constructed defining the relative spatial relationship between the reference coordinate system G of the hand work tool 6 and the reference coordinate system C of the sensor unit 1. This first part of the calibration is performed every time the user fastens the sensor unit 1 to the hand work tool 6 in order to demonstrate the work operation for subsequent automation.
  • a transformation matrix TCR defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5 (denoted by the letter R after the English “robot”), namely the robotic work tool 25, is obtained in an analogous manner.
  • This second part of the calibration is also implemented using the first calibration fixture 26, however, instead of the hand work tool 6, the robotic work tool 25 is inserted into the calibration opening (guided by the controller of the stationary robot 5), to which the sensor unit 1 is attached for the purposes of this part of the calibration.
  • the camera 9 detects the first visual markers and measures its relative position and orientation with respect to these first visual markers.
  • a transformation matrix TCR is then constructed defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5, specifically the robotic work tool 25.
  • a transformation matrix TCP defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system P of the piece 7 (denoted by the letter P after the English “product” or “part” for the piece 7) is obtained.
  • This third part of the calibration is implemented by a second calibration fixture 27, which comprises a second visual marker that is located at a fixed relative distance from the piece 7. This fixed relative distance does not need to be known but must remain constant during the calibration and during the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6, i.e. during the demonstration.
  • the second calibration fixture 27 is schematically shown in fig. 2 for the piece 7, which is suspended from a support frame for a work operation, e.g. for painting.
  • the user performs this part of the calibration by bringing the hand work tool 6 closer to the second calibration fixture 27 and pointing the camera 9 in the sensor unit 1 at the second visual marker.
  • the sensor unit 1 recognizes the second visual marker, since the second visual marker is, like all visual markers used in the calibration, unique, and measures its relative position and orientation with respect thereto.
  • the user gives a signal using the user interface 19 of the sensor unit 1, i.e. for example, by pressing a button, to the control unit 2 to store the position and orientation data with respect to the reference coordinate system C of the sensor unit 1 that is provided at that moment by the sensor unit 1.
  • This data is subsequently used to construct a transformation matrix TCP defining the spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system P of the piece 7.
  • This calibration is performed at the beginning of each demonstration of the work operation for which the user wants to generate a robotic program.
  • a transformation matrix TRP defining the relative spatial relationship between the reference coordinate system R of the stationary robot 5 and the reference coordinate system P of the piece 7 is obtained in an analogous manner.
  • This fourth part of the calibration is also implemented using the second calibration fixture 27, however, instead of the hand work tool 6, the robotic work tool 25 is brought closer to the second calibration fixture 27 with the second visual marker (guided by the controller of the stationary robot 5), to which the sensor unit 1 is attached for the purpose of this part of the calibration.
  • the relative position of the second calibration fixture 27 and the piece 7 is fixed but does not have to be known. However, it must remain constant during the calibration and during the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6, i.e. during the demonstration.
  • the user performs this part of the calibration by aiming the camera 9 at the second visual marker after approaching it; the camera 9 detects this second visual marker and determines its relative position and orientation with respect thereto.
  • the user notes the position and orientation of the robotic work tool 25 in the reference coordinate system R of the stationary robot 5, using the controller of the stationary robot 5.
  • This data together with the data obtained from the second part of the calibration (between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5) is subsequently used to construct a matrix TRP defining the relative spatial relationship between the reference coordinate system R of the stationary robot 5 and the reference coordinate system P of the piece 7.
  • This calibration only needs to be performed once, assuming that the piece 7 for which the robotic instructions are generated will subsequently always be in the same place with respect to the stationary robot 5, which will perform an automated work operation on it according to the generated robotic program.
  • the second part of the calibration is necessary because the camera 9 is used for the fourth part of the calibration (the stationary robot 5-the piece 7). Furthermore, it is also worth mentioning that the sensor unit 1 is attached on the robotic work tool 25 only for the purpose of calibration and is no longer attached on the robotic work tool 25 during the performance of the automated work operation.
  • the step of performing the calibration measurement is followed by the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6 with the sensor unit 1 attached.
  • the monitoring of the position and orientation and also the monitoring of the work state of the hand work tool 6 is performed by the sensor unit 1, as also described above.
  • This monitoring, i.e. the data recording/logging, is started by the user, e.g. by means of the user interface 19 of the sensor unit 1 or the user interface 20 of the control unit 2, signaling to the control unit 2 the start of the recording.
  • After performing the work operation for surface treatment of the piece 7 using the hand work tool 6, the user signals the end of the recording in a similar way.
  • the data from the sensor unit 1 are stored in the memory 14.
  • the stored data are transferred via wireless (Wi-Fi) communication to the remote server 21, where they are processed by the data processing module 3 to generate instructions for the stationary robot 5.
  • the calibration data are also transferred to the input of the data processing module 3.
  • the processing of the data comprises several sub-steps, namely the step of filtering the data, the step of time synchronizing the data, the step of dynamic subsampling, the step of applying spatial transformation based on the provided calibration data, and subsequently the step of generating instructions for the stationary robot 5.
  • the step of generating the instructions is performed by the instruction generator 23, which is in the first described embodiment a part of the remote server 21 and which includes an interface for generating instructions for the stationary robot 5 in a programming language of at least one manufacturer of the stationary robots 5.
  • the instruction generator 23 comprises information about the syntax, semantics, required structure and instruction format for the stationary robot 5, including information about what keywords in the programming language of the given manufacturer of the stationary robot 5 are used to write the individual instructions for the stationary robot 5.
  • An executable robotic program is created by having the instruction generator 23 write the individual instructions according to syntax and semantics rules and using the argument values resulting from previous data processing.
  • the instruction generator 23 creates a .mod file that includes the appropriate MODULE and PROCEDURE sections, and the PROCEDURE section lists the individual instructions on individual lines. These instructions begin with the PaintL keyword and are followed by the argument values of the instructions x, y, z, q1, q2, q3, q4 (and others), in place of which the instruction generator 23 inserts specific numbers that determine where the robotic work tool 25 should move, how fast, etc.
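  • Purely as an illustration of this kind of text generation (not the actual generator), the sketch below writes a .mod-style file from processed poses; the PaintL form mirrors the example above, while the tool on/off instruction and the speed/zone arguments are hypothetical placeholders.

```python
# Illustration only: this is not the actual generator. The PaintL form mirrors
# the example in the text; the SetBrush on/off line, the speed value v500 and
# the zone value z10 are hypothetical placeholders.
def generate_mod_file(poses, states, path="program.mod"):
    """Write a simple .mod-style program from processed poses.

    poses  : iterable of (x, y, z, q1, q2, q3, q4) tuples in the robot frame.
    states : iterable of booleans saying whether the tool is active at each pose.
    """
    lines = ["MODULE GeneratedProgram", "  PROC main()"]
    for (x, y, z, q1, q2, q3, q4), active in zip(poses, states):
        lines.append(f"    SetBrush {1 if active else 0};")   # hypothetical tool on/off
        lines.append(
            f"    PaintL [[{x:.1f},{y:.1f},{z:.1f}],"
            f"[{q1:.4f},{q2:.4f},{q3:.4f},{q4:.4f}]], v500, z10;"
        )
    lines += ["  ENDPROC", "ENDMODULE"]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```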
  • the method and system of the present invention may be further implemented in further exemplary embodiments other than the first exemplary embodiment described in detail above. The individual alternatives by which the further exemplary embodiments may differ are explained in the section with the summary of the invention.
  • the method and system described above can be used to generate instructions useful in various work operations for surface treatment, such as painting, sanding, polishing, chamfering, blasting, etc.
  • the generated instructions can be uploaded not only into the controller of an actual stationary robot but also into the controller of a virtual stationary robot, i.e. a digital twin thereof for simulation purposes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Manipulator (AREA)

Abstract

The object of the invention is a method of generating instructions for a stationary robot (5) comprising the steps of: performing the work operation for surface treatment of the piece (7) using a hand work tool (6) with a sensor unit (1) attached, monitoring the position and orientation of the hand work tool (6) by the sensor unit (1) during the step of performing the work operation, monitoring the work state of the hand work tool (6) by a work state sensor (10) during the step of performing the work operation, providing calibration data to a data processing module (3), wherein the calibration data provided to the data processing module (3) comprise information about the relative spatial relationship between the reference coordinate systems of the sensor unit (1), the hand work tool (6), the stationary robot (5), and the piece (7), and processing the data by the data processing module (3), wherein the step of processing the data comprises the step of generating instructions for the stationary robot (5) in a programming language of at least one manufacturer of the stationary robots (5). The object of the invention is also a system for performing this method.

Description

A method of generating instructions for a stationary robot and a system for performing this method
Technical Field
The present invention relates to a method and system for generating instructions for a stationary robot based on a demonstration of a work operation for surface treatment of a piece. More particularly, the invention relates to generating instructions for e.g. a painting robot.
Background of the Invention
Currently, there are solutions for automating a work operation, e.g. industrial painting, using industrial robots. For example, various demonstrators are used for the automation, by which the user demonstrates the motion performed for the given operation and, based on the recording of this demonstrated motion, the robot is programmed to copy this motion in the subsequent automated operation. However, such a demonstrator is typically not a real work tool and thus does not allow all the essential characteristics of the work operation to be captured. The available solutions also often fail to generate robotic instructions for different industrial robots, or for industrial robots from different manufacturers, as different manufacturers use different programming languages to program the robots. Such solutions can then be used only for one given robot, but not for robots from other manufacturers, which is disadvantageous.
Therefore, it would be desirable to come up with a solution that would allow recording the actual performance of the work operation such that all the essential characteristics of this work operation are captured, and that would allow generating a robotic program executable by a specific industrial robot. The generated program should include not only motion instructions, i.e. information about the trajectory along which the robotic work tool should move, but also additional instructions for controlling the robotic work tool, e.g. when the robotic work tool should perform the work operation and when it should not, i.e. e.g. when the robot’s painting gun should be activated. The desired solution should also allow the demonstrated work operation to be recorded sufficiently accurately and reliably such that the automated work operation qualitatively corresponds to the manually performed work operation. At the same time, the recording of the demonstrated work operation should not affect the course of the work operation or restrict the user performing the work operation in any way.
Summary of the Invention
The above shortcomings are to a certain extent eliminated by a method of generating instructions for a stationary robot based on a demonstration of a work operation for surface treatment of a piece, the essence of which lies in the fact that it comprises the steps of:
- performing a work operation for surface treatment of a piece using a hand work tool with a sensor unit attached,
- monitoring the position and orientation of the hand work tool by the sensor unit during the step of performing the work operation for surface treatment of the piece, wherein the step of monitoring the position and orientation of the hand work tool comprises a step of recording acceleration and angular velocity over time by an inertial sensor and a step of recording visual images of the surrounding environment at different time points by at least one camera,
- monitoring the work state of the hand work tool by a work state sensor during the step of performing the work operation for surface treatment of the piece, wherein the work state sensor provides information about whether the hand work tool is activated,
- providing calibration data to a data processing module, wherein the calibration data provided to the data processing module comprises information about the relative spatial relationship between the reference coordinate systems of the sensor unit, the hand work tool, the stationary robot, and the piece, and
- processing the data from the inertial sensor, the at least one camera, and the work state sensor as well as the calibration data by the data processing module, wherein the step of processing the data comprises a step of generating instructions for the stationary robot in a programming language of at least one stationary robot manufacturer.
The above method makes it possible to generate an executable robotic program (instructions) for stationary robots of different manufacturers, such that the program includes all essential characteristics of the work operation for surface treatment of the piece. The work operation is monitored with high accuracy and reliability, wherein the recording of the work operation by means of the sensor unit attached to the hand work tool does not affect in any way the course of the work operation so performed, nor does it restrict the person handling the hand work tool in any way. The work operation for surface treatment of the piece may be, for example, industrial painting (wet or powder), but also sanding, polishing, chamfering, blasting, etc. The work operation for surface treatment of the piece may also be a quality check performed in connection with the surface treatment, i.e. for example a quality check of the painted piece.
The term stationary robot is well established in the art and refers to a robot that cannot move from one place to another by its own means. In particular, a stationary robot may be an industrial robot, e.g. a painting robot, or a collaborative robot. Alternatively, the stationary robot may be fastened to an external mechanism adding an additional degree of freedom, such as a rail conveyor. The generated instructions may then also include instructions for controlling this additional mechanism. The generated instructions can also be used not only for the real stationary robot, but also for a virtual stationary robot, i.e. the so-called digital twin thereof. This can be used for simulation purposes.
The hand work tool is an actual work tool that allows the desired work operation for surface treatment of the piece to be performed. Thus, for example, if the work operation for surface treatment of the piece is industrial painting, this hand work tool is a hand painting gun.
The monitoring of the position and orientation (i.e. rotation) of the hand work tool by the sensor unit is accomplished by the combined use of the inertial sensor and the at least one camera, wherein multiple cameras may also be used to increase accuracy. The camera together with the inertial sensor can be collectively referred to as a so-called tracking camera, wherein it should be noted that its output is data with six degrees of freedom, namely the position of the tracking camera over time in three spatial axes and the orientation of the tracking camera over time in three spatial axes. That is, the position and orientation data in its reference coordinate system. Since the sensor unit is attached to the hand work tool and since these recorded data are then transformed using the calibration data, it can be said that the position and orientation of the hand work tool is monitored by the sensor unit as described above. In order to compute the position and orientation from the data recorded by the sensors (from the acceleration in the three spatial axes over time and from the angular velocity in the three spatial axes over time), the tracking camera may comprise its own computational unit.
The computational unit of the tracking camera preferably comprises a detection module for detecting visual elements in the camera images. These visual elements are, for example, various visually prominent points in the surrounding environment, e.g. corners or edges visible in the images. The detection of the visual elements is advantageous primarily because there is an order of magnitude reduction in the number of points to be considered, typically from thousands of pixels (depending on the resolution of the camera) to a few hundred visual elements. Each visual element is marked, spatially mapped, and arranged into a vector of visual elements. After a new image is acquired, the detection module compares the newly detected elements, arranged in the newly constructed visual element vector, with the elements of the previously constructed vector and computes the change in position based on this comparison.
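Purely for illustration of the idea of visual elements, the following Python sketch (assuming the OpenCV library and two hypothetical consecutive image files) detects a limited number of prominent keypoints in two frames and matches them; a commercial tracking camera implements a complete V-SLAM pipeline on top of this kind of information, so the sketch illustrates only the reduction of each image to a vector of visual elements and the comparison between consecutive frames.

    import cv2

    # Reduce each image to a few hundred "visual elements" (ORB keypoints).
    orb = cv2.ORB_create(nfeatures=500)
    img_prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # hypothetical files
    img_curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    kp_prev, des_prev = orb.detectAndCompute(img_prev, None)
    kp_curr, des_curr = orb.detectAndCompute(img_curr, None)

    # Compare the newly constructed vector of elements with the previous one.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_prev, des_curr), key=lambda m: m.distance)

    # The displacement of matched elements between frames is the raw material
    # from which a pose change is estimated (together with the inertial data).
    for m in matches[:10]:
        print(kp_prev[m.queryIdx].pt, "->", kp_curr[m.trainIdx].pt)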
Alternatively, the position and orientation may be computed from the recorded data elsewhere than in the tracking camera, e.g. by a microcomputer in the control unit or by a data processing module that may be part of a remote server.
Alternatively, the output of the tracking camera may be only acceleration over time, angular velocity over time, and visual images, wherein the actual computation of the position and orientation may be performed afterwards, e.g. by a microcomputer in the control unit or by the data processing module. In addition to a tri-axial accelerometer for recording acceleration and a tri-axial gyroscope for recording angular velocity, the sensor unit may preferably also comprise a magnetometer to make the computation of the position and orientation more accurate.
The step of monitoring the position and orientation of the hand work tool is performed in parallel with the step of monitoring the work state of the hand work tool, which is performed by the work state sensor. The work state sensor provides information on whether the hand work tool is activated. Thus, for example, the work state sensor may provide a signal which can be analyzed to obtain the desired work state of the hand work tool. This analysis may be a comparison of the measured signal values with predetermined threshold values. The analysis of this signal to obtain the work state may be performed, for example, by the data processing module. Alternatively, this analysis may be performed e.g. by a microcomputer in the control unit or by the computational unit of the work state sensor, or by another computational unit. In addition to the determination of the work state, other information may also be obtained from the work state sensor signal, e.g. what specific activity of the given work operation was performed at a certain time, etc.
After the completion of the step of performing the work operation, the recorded data from the inertial sensor, the camera, and the work state sensor are sent to the data processing module. Alternatively, the recorded data may also be sent to the data processing module continuously. The calibration data to be provided to the data processing module are also sent to this module.
Preferably, the provided calibration data comprise at least information about the relative spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system C of the sensor unit, information about the relative spatial relationship between the reference coordinate system C of the sensor unit and the reference coordinate system P of the piece, and information about the relative spatial relationship between the reference coordinate system R of the stationary robot and the reference coordinate system P of the piece. The information about the relative spatial relationship (in other words, the transformation relationships) between the respective reference coordinate systems is used for transformation between these coordinate systems and may preferably be written in the form of transformation matrices. Thus, the provided calibration data preferably comprise at least transformation matrices TGC, TCP, and TRP. Alternatively, however, this information may also be written in another mathematical form.
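For illustration only, the transformation relationships may be represented as 4x4 homogeneous transformation matrices, as in the following numpy sketch; the naming convention used here (a matrix TAB maps coordinates expressed in frame B into frame A) and all numeric values are assumptions made purely for the example.

    import numpy as np

    def transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
        """Build a 4x4 homogeneous transformation from a rotation and a translation."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    # Hypothetical calibration results (identity rotations, translations in mm).
    T_GC = transform(np.eye(3), np.array([0.0, 35.0, 120.0]))    # frame C -> frame G
    T_CP = transform(np.eye(3), np.array([1500.0, 0.0, 800.0]))  # frame P -> frame C
    T_RP = transform(np.eye(3), np.array([2000.0, -500.0, 0.0])) # frame P -> frame R

    # A point given in piece coordinates P expressed in robot coordinates R.
    p_piece = np.array([100.0, 50.0, 0.0, 1.0])
    p_robot = T_RP @ p_piece

    # Inverting a matrix describes the same relationship in the opposite direction.
    T_PR = np.linalg.inv(T_RP)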
To obtain the calibration data provided to the data processing module, the method preferably comprises the step of performing the calibration measurement. Thus, preferably the calibration data are measured, but in general can be obtained by another method and provided to the data processing module. The calibration measurement may be performed at various stages of the method, wherein at least a part of the calibration is performed at the beginning, during, or at the end of the step of monitoring the position and orientation when the recording of the position and orientation data is still active and the recording of the data has not been restarted.
Unique visual markers are preferably used for the step of performing the calibration measurement, wherein the information about the relative spatial relationship between the reference coordinate systems G and C, the information about the relative spatial relationship between the reference coordinate systems C and P, the information about the relative spatial relationship between the reference coordinate systems R and P, and additionally the information about the relative spatial relationship between the reference coordinate systems C and R, is obtained by detecting the visual markers by the at least one camera and by determining the relative orientation of the sensor unit with respect to the visual markers. The uniqueness of the visual markers is that each visual marker includes its own identifier to distinguish it from the other visual markers. Thus, when the camera detects a visual marker, it can identify which visual marker it is and associate this information with its position information. This visual marker can be e.g. a QR code.
Preferably, the step of the calibration measurement comprises four parts of the calibration that use calibration fixtures in addition to the visual markers. Specifically, at least one first calibration fixture with at least one calibration opening and a set of unique first visual markers and also a second calibration fixture comprising a unique second visual marker are provided for the calibration measurement, wherein the relative position of the second calibration fixture and the piece is fixed. The second calibration fixture may be located on e.g. a frame on which the piece is suspended. The relative positions of the first visual markers with respect to each calibration opening are known.
The at least one calibration opening is adapted in shape to receive the hand work tool in only one particular position and orientation, and the at least one calibration opening is adapted in shape to receive the robotic work tool in only one particular position and orientation. In other words, the calibration opening, together with the hand work tool or the robotic work tool, operates on a lock and key principle. The first calibration fixture may comprise multiple calibration openings, e.g. two calibration openings, one of which is adapted in shape to receive the hand work tool and the other is adapted in shape to receive the robotic work tool. Alternatively, however, the first calibration fixture may comprise only one calibration opening, which is adapted to receive both the hand work tool and the robotic work tool in sequence. Alternatively, two first calibration fixtures may be provided, wherein one of them is adapted to receive the hand work tool and the other is adapted to receive the robotic work tool.
The step of performing the calibration measurement using the visual markers comprises the following four parts, wherein the first part comprises the steps of:
- inserting the hand work tool with the sensor unit attached into the calibration opening of the first calibration fixture,
- recording the position and orientation information using the sensor unit,
- detecting the individual first visual markers using the camera,
- determining the relative position and orientation of the sensor unit with respect to the individual first visual markers, and
- obtaining information about the spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system C of the sensor unit.
The second part of the calibration comprises the steps of:
- guiding the robotic work tool with the sensor unit attached into the calibration opening of the first calibration fixture,
- recording the position and orientation information using the sensor unit,
- detecting the individual first visual markers using the camera,
- determining the relative position and orientation of the sensor unit with respect to the individual first visual markers, and
- obtaining information about the spatial relationship between the reference coordinate system C of the sensor unit and the reference coordinate system R of the stationary robot.
The third part of the calibration comprises the steps of:
- bringing the hand work tool with the sensor unit attached closer to the second calibration fixture with the second visual marker,
- recording the position and orientation information using the sensor unit,
- detecting the second visual marker using the camera,
- determining the relative position and orientation of the sensor unit with respect to the second visual marker, and
- obtaining information about the spatial relationship between the reference coordinate system C of the sensor unit and the reference coordinate system P of the piece using the information about the spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system C of the sensor unit.
The fourth part of the calibration comprises the steps of:
- bringing the robotic work tool with the sensor unit attached closer to the second calibration fixture with the second visual marker,
- recording the position and orientation information using the sensor unit,
- detecting the second visual marker using the camera,
- determining the relative position and orientation of the sensor unit with respect to the second visual marker, and
- obtaining information about the spatial relationship between the reference coordinate system R of the stationary robot and the reference coordinate system P of the piece using the information about the spatial relationship between the reference coordinate system C of the sensor unit and the reference coordinate system R of the stationary robot.
The first and third parts of the calibration (i.e. the calibration using the hand work tool) are performed before, during, or at the end of the step of monitoring the position and orientation when the recording of the position and orientation data is still active and the recording of the data has not been restarted. The second and fourth parts of the calibration (i.e. calibration using the robotic work tool) can be performed at any time. The order of the individual parts of the calibration does not have to be fixed and e.g. the last steps of the sub-parts of the calibration (obtaining information about the spatial relationship, i.e. specifically e.g. constructing the respective transformation matrices) may be performed later. Similarly, the steps of determining the relative position and orientation of the sensor unit with respect to the visual markers may also be performed (computed) later and the physical handling required for the given part of the calibration ends with the detection of the corresponding visual marker and the recording of the position and rotation information by the sensor unit.
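Purely as an illustration of how one such relationship can be obtained from a single marker observation, the following numpy sketch assumes that the geometry of the first calibration fixture gives the pose of a first visual marker in the coordinate system G of the seated hand work tool, and that the camera measurement gives the pose of the same marker in the coordinate system C of the sensor unit; the convention (TAB maps frame B coordinates into frame A) and the numbers are hypothetical, and with several markers the individual estimates would typically be combined to reduce noise.

    import numpy as np

    # Pose of one first visual marker in the gun frame G (known fixture geometry).
    T_GM = np.eye(4)
    T_GM[:3, 3] = [0.0, 80.0, 40.0]     # hypothetical values in mm

    # Pose of the same marker in the sensor unit frame C (camera measurement).
    T_CM = np.eye(4)
    T_CM[:3, 3] = [10.0, 60.0, 200.0]   # hypothetical values in mm

    # Chain G <- marker <- C to obtain the sought relationship between G and C.
    T_GC = T_GM @ np.linalg.inv(T_CM)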
For the second and fourth parts of the calibration, an additional sensor unit may be used. Alternatively, a single sensor unit may be used for the entire calibration and may be transferred between the hand work tool and the robotic work tool to perform the individual parts of the calibration. To record the position and orientation information using the sensor unit, the user may give a signal to the control unit using the user interface of the sensor unit, or alternatively using the user interface of the control unit.
Alternatively, the calibration measurement may be performed on a purely mechanical principle, but the process of calibration using the visual markers is easier, faster and, thanks to the non-contact (optical) detection of the visual markers, more accurate, as there is no risk of the piece moving from its rest position when pushed by the hand work tool or the robotic work tool into the calibration fixture, which is fastened e.g. to the frame on which the piece is suspended. At the same time, the risk of incorrect insertion of the work tool into the calibration opening caused by human error is eliminated.
This alternative method of calibration is shown below, wherein this mechanical calibration comprises only three parts, since it is not necessary to perform a part analogous to the second part of the optical calibration described above.
The first part of the mechanical calibration:
Calibration of the sensor unit with respect to the hand work tool is implemented using a cube-shaped calibration fixture with defined openings located on the individual cube walls, which, by the principle of mechanical lock and key, allow the user to insert the tip of the hand work tool into them in exactly one defined way, i.e. with a specific position and orientation. The relative positions of these openings are defined and known. During this part of the calibration, the user inserts the tip of the hand work tool with the sensor unit fastened sequentially into the individual openings in a defined sequence and, when the hand work tool is correctly placed, the user gives a signal to the control unit, e.g. via the user interface of the sensor unit, to store the position and orientation information provided by the sensor unit. From the known definition of the geometry of this calibration fixture and the position and orientation information from the sensor unit stored during this part of the calibration, information about the spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system C of the sensor unit is subsequently obtained. Specifically, a corresponding transformation matrix can thus be constructed.
The second part of the mechanical calibration:
Next, the user performs calibration of the reference coordinate system C of the sensor unit with the reference coordinate system P of the piece. This part of the calibration is performed using a calibration fixture that includes an opening which, by the principle of mechanical lock and key, allows the user to insert the tip of the hand work tool into it in exactly one defined way, i.e. with a specific position and orientation. The relative position of this calibration fixture and the piece is fixed but does not have to be known. However, it must remain constant during the recording of the work operation and during the third part of the mechanical calibration (stationary robot - piece). The user performs the calibration by inserting the hand work tool into the opening of the calibration element and gives a signal to the control unit to store the position and orientation data provided by the sensor unit at that moment. This data, together with the data obtained from the calibration of the sensor unit with respect to the hand work tool (first part of the mechanical calibration), is then used to obtain information about the spatial relationship between the reference coordinate system G of the hand work tool and the reference coordinate system P of the piece. Specifically, a corresponding transformation matrix can thus be constructed. This part of the calibration is performed at the beginning of each demonstration of the work operation for which the user wants to generate a robotic program.
The third part of the mechanical calibration:
Calibration of the reference coordinate system R of the stationary robot with respect to the reference coordinate system P of the piece is performed using a calibration fixture that includes an opening which, by the principle of mechanical lock and key, allows the tip of the robotic work tool to be inserted into it in exactly one defined way, i.e. with a specific position and orientation. The relative position of this calibration fixture and the piece is fixed but does not have to be known. However, it must remain constant during the recording of the work operation and during the calibration of the stationary robot with respect to the piece. The user performs the calibration by guiding the robotic work tool into the opening of this calibration element using the stationary robot controller and noting the position and orientation of the robotic work tool in the reference coordinate system R of the stationary robot using the robot controller. This data is then used to construct a transformation matrix defining the spatial relationship between the reference coordinate system R of the stationary robot and the reference coordinate system P of the piece, i.e. information about the relative spatial relationship of the reference coordinate systems R and P is obtained. This part of the calibration only needs to be performed once, provided that the piece for which the robotic instructions are generated will subsequently always be located in the same place with respect to the stationary robot that will execute the generated robotic program thereon.
The step of generating instructions for the stationary robot is preferably performed by an instruction generator comprising information about syntax, semantics, required structure and instruction format for the stationary robot of at least one stationary robot manufacturer, wherein the instructions are written by the instruction generator in the required structure and format based on the syntax and semantics information and include keywords and argument values that are the result of the step of processing the data by the data processing module. These argument values are the resulting position and orientation coordinates in the reference coordinate system R of the stationary robot, which are the result of the steps of processing the data that precede the step of generating the instructions. Alternatively, the resulting position and orientation coordinates may be expressed in another user coordinate system, which, however, has a fixed and known transformation into the reference coordinate system R of the stationary robot. In that case, the user must provide this transformation as an additional input to the data processing module, and hence to the instruction generator. Alternatively, these argument values may be modified versions of the resulting position and orientation coordinates, for example modified in such a way that the generated instructions lead to the automated operation being performed, e.g., faster or slower than was demonstrated by the hand work tool when monitoring the position and orientation.
The step of processing the data by the data processing module further preferably comprises, prior to the step of generating the instructions for the stationary robot, the step of filtering the data, the step of time synchronizing the data, the step of dynamic subsampling, and the step of applying spatial transformation based on the provided calibration data. Alternatively, the step of processing the data does not have to comprise all of these steps; however, the steps of time synchronizing and of applying spatial transformation based on the provided calibration data are essential for the proper function. Specifically, applying spatial transformation entails multiplying the calibration matrices to obtain the resulting transformation relationship. This resulting transformation relationship allows the transformation of the measured data into the reference coordinate system of the stationary robot in order to subsequently generate instructions that, according to the convention of the programming language of the stationary robot, are expected to be expressed in the coordinate system of the robot.
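As a minimal sketch of applying such a resulting transformation relationship, the following numpy fragment assumes, purely for illustration, that the measured positions have already been expressed in the coordinate system P of the piece, so that a single matrix (here a hypothetical TRP) maps every sample into the coordinate system R of the stationary robot.

    import numpy as np

    # Hypothetical resulting transformation (piece frame P -> robot frame R).
    T_RP = np.eye(4)
    T_RP[:3, 3] = [2000.0, -500.0, 0.0]

    # Measured tool positions, one row per time sample, assumed to be in frame P.
    positions_P = np.array([
        [100.0,  50.0, 0.0],
        [100.0, 150.0, 0.0],
        [100.0, 250.0, 0.0],
    ])

    # Apply the spatial transformation to all samples in one step.
    homogeneous = np.hstack([positions_P, np.ones((len(positions_P), 1))])
    positions_R = (T_RP @ homogeneous.T).T[:, :3]   # coordinates in frame R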
The time synchronization is used to synchronize the measured data, i.e. data from the inertial sensor, the camera, and the work state sensor. This ensures that a particular camera image corresponds to the same moment at which the signal was recorded by the work state sensor and the inertial sensor.
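A minimal sketch of such a synchronization, assuming for illustration that each data stream carries its own timestamps, is to resample one stream onto the timestamps of another, e.g. with linear interpolation:

    import numpy as np

    # Hypothetical timestamps (seconds) and values of two recorded streams.
    t_pose  = np.array([0.00, 0.05, 0.10])                    # tracking camera samples
    t_sound = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10])  # work state sensor samples
    sound   = np.array([0.1, 0.1, 0.8, 0.9, 0.8, 0.1])

    # Resample the work state signal onto the pose timestamps so that every pose
    # sample is paired with the signal value recorded at the same moment.
    sound_at_pose_times = np.interp(t_pose, t_sound, sound)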
Dynamic subsampling makes it possible to generate a robotic program with a sufficiently small number of robotic instructions to be easily readable, understandable, and editable by the user, while capturing all the significant elements of the recorded work operation such that it can be replicated with sufficient quality by the stationary robot.
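One simple way to illustrate such subsampling (the actual dynamic subsampling algorithm may differ) is to keep a recorded sample only when the tool has moved by more than a chosen distance or when the work state changes, as in the following sketch with hypothetical thresholds and data:

    import numpy as np

    def subsample(positions: np.ndarray, active: np.ndarray, min_dist: float = 20.0):
        """Keep a sample when the tool moved far enough or the work state changed."""
        keep = [0]
        for i in range(1, len(positions)):
            moved = np.linalg.norm(positions[i] - positions[keep[-1]]) >= min_dist
            switched = active[i] != active[i - 1]
            if moved or switched:
                keep.append(i)
        return keep

    positions = np.array([[0, 0, 0], [2, 0, 0], [25, 0, 0], [26, 0, 0], [60, 0, 0]], float)
    active = np.array([0, 0, 0, 1, 1])
    print(subsample(positions, active))   # indices of the retained samples: [0, 2, 3, 4]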
The step of monitoring the work state is preferably performed by a sound sensor recording a sound signal during the step of performing the work operation for surface treatment of the piece. The sound signal may be analyzed e.g. by a work state computational unit but may also be analyzed e.g. by a microcomputer in the control unit or by the data processing module. In addition to information about the work state itself, the analysis of the sound signal can also provide e.g. information about which sub-activity of a given work operation was performed at a given time, if these sub-activities differ in the sound they produce.
The above shortcomings are also eliminated to a certain extent by a system for performing the method of the present invention, the essence of which lies in the fact that it comprises a sensor unit and a control unit communicatively connected to the sensor unit and adapted for controlling the sensor unit, wherein the sensor unit comprises:
- an attachment element for attachment to the hand work tool,
- an inertial sensor for recording acceleration and angular velocity over time,
- at least one camera for recording visual images of the surrounding environment at different time points, and
- a work state sensor for monitoring the work state of the hand work tool, wherein the work state sensor provides information about whether the hand work tool is activated, wherein the system further comprises a data processing module adapted to receive data from the inertial sensor, the at least one camera, and the work state sensor, as well as calibration data including information about the relative spatial relationship between the reference coordinate systems of the sensor unit, the hand work tool, the stationary robot, and the piece, wherein the data processing module comprises an instruction generator comprising an interface for generating instructions for the stationary robot in a programming language of at least one stationary robot manufacturer.
Communicative connection here means a connection that allows data transmission, whether it is implemented electrically via cable or wirelessly, e.g. via Wi-Fi technology. The control unit and the sensor unit may each include their own housing, but alternatively they may also be stored in one housing and thus be a part of a single additional unit which is attached to the hand work tool by means of the attachment element. As also mentioned above, multiple cameras can be used to achieve better accuracy.
The system of the present invention further preferably comprises at least one first calibration fixture with at least one calibration opening and a set of unique first visual markers, wherein the relative positions of the first visual markers with respect to each calibration opening are known, and wherein the at least one calibration opening is adapted in shape to receive the hand work tool in only one particular position and orientation, and the at least one calibration opening is adapted in shape to receive the robotic work tool in only one particular position and orientation, wherein the system further comprises a second calibration fixture comprising a unique second visual marker, wherein the relative position of the second calibration fixture and the piece is fixed. Thanks to these calibration fixtures and visual markers it is possible to perform the preferable calibration using the visual markers, which is easier, more accurate, and faster than purely mechanical calibration. The work state sensor is preferably a sound sensor, which enables reliable and sufficiently sensitive detection of the work state of the hand work tool. Alternatively, and less preferably, the work state sensor may also be a different sensor operating on a different principle, e.g. a Hall sensor, a flow meter that may, e.g. in the case of a painting process, measure the flow of the paint through a supply hose, or an ammeter that may measure the current flow in the case of powder painting, etc.
The control unit preferably comprises a microcomputer for controlling the sensor unit, a memory for storing data from the sensor unit, and a communication module for communicative connection to the data processing module. This basic arrangement enables efficient and reliable control of data collection and provision of the data to the data processing module.
The system further preferably comprises a remote server communicatively connected to the control unit, wherein the data processing module is preferably part of the remote server. Alternatively, the data processing module does not need to be a part of the remote server and may be e.g. a part of the microcomputer of the control unit, which would mean that this microcomputer would perform the respective steps of processing the data.
The system further preferably comprises a user interface of the data processing module for configuring the instruction generator and also an access device for accessing the user interface of the data processing module, wherein the user interface of the data processing module is communicatively connected to the data processing module. This user interface of the data processing module allows the user to configure the details of how the instructions are to be generated for the stationary robot and also allows the generated instructions to be subsequently downloaded. Using this interface it is possible, for example, to select the data to be processed, configure the data processing parameters (e.g. of the dynamic subsampling, etc.) or select the type (manufacturer) of the stationary robot. This user interface of the data processing module may preferably be a web interface that is part of a remote server. The access device may be e.g. a computer, a mobile phone, or other device with web access. Alternatively, the user interface of the data processing module does not need to be implemented as a web interface but may be any other suitable interface, e.g. a device with respective control elements.
Description of Drawings
A summary of the invention is further clarified using exemplary embodiments thereof, which are described with reference to the accompanying drawings, in which:
fig. 1 schematically shows the system of the present invention in connection with a stationary robot, with emphasis on the arrangement of the sensor unit and the control unit,
fig. 2 schematically shows the system of the present invention in connection with the stationary robot, with indication of the calibration and emphasis on the arrangement of the remote server; and
fig. 3 schematically shows a diagram of the method of the first exemplary embodiment of the present invention.
Exemplary Embodiments of the Invention
The invention will be further clarified using exemplary embodiments with reference to the respective drawings. First, an exemplary embodiment of the system by which the method of generating instructions of the present invention can be performed will be described in detail, and subsequently the individual steps of this method will also be described.
The system for performing the method of the present invention in the first exemplary embodiment comprises a sensor unit 1, a control unit 2 communicatively connected to the sensor unit 1 and adapted to control the sensor unit 1, and also a remote server 21 comprising a data processing module 3. The remote server 21 including the data processing module 3 is communicatively connected to the control unit 2, wherein this communicative connection is provided by the communication module 15 in the control unit 2, as will be described in even greater detail below when describing the arrangement of the control unit 2 in the first exemplary embodiment of the system. As schematically shown in fig. 1, the sensor unit 1 in the first exemplary embodiment comprises an inertial sensor 8, a camera 9, a work state sensor 10, an attachment element 11 for attaching the sensor unit 1 to the hand work tool 6, and an extension 12. For example, the sensor unit 1 comprises a housing accommodating the individual components, wherein the extension 12 is connected to this housing and comprises the attachment element 11 for attachment to the hand work tool 6. For example, the attachment element 11 is adapted in shape for fitting on the hand work tool 6, i.e. in this embodiment it is possible to attach the sensor unit 1 to the hand work tool 6 without using screws.
The sensor unit 1, when attached to the hand work tool 6, enables monitoring of the behavior of the hand work tool 6, and more particularly monitoring of the position and orientation (i.e. rotation) of the hand work tool 6, and also monitoring of the work state of the hand work tool 6.
The monitoring of the position and orientation of the hand work tool 6 is implemented by the combined use of the inertial sensor 8 and the camera 9, in other words, by using data that is recorded by the inertial sensor 8 and visual images from the camera 9. The inertial sensor 8 in the described first exemplary embodiment of the system is an inertial measurement unit (IMU), which comprises a three-axis accelerometer and a three-axis gyroscope. The three-axis accelerometer records acceleration values over time in the three spatial axes and the three-axis gyroscope records angular velocity values over time in the three spatial axes. The camera 9 is a stereo camera in the first described exemplary embodiment, wherein this camera 9 records visual images of the surrounding environment at various time points, i.e., provides time-marked images.
The camera 9 together with the inertial sensor 8 may be collectively referred to as a so-called tracking (or monitoring) camera, wherein in the first exemplary embodiment the tracking camera comprises its own computational unit (not shown in the figures) for computing the position and orientation of its reference coordinate system (i.e. the reference coordinate system C of the sensor unit 1) over time. This, in principle, monitors the position and orientation of the hand work tool 6 (to which the sensor unit 1 is attached), although a transformation using calibration data must still be applied to convert the position and orientation of the tracking camera to the position and orientation of the hand work tool 6, as will be illustrated below. The position and orientation of the tracking camera is computed from the data of the inertial sensor 8 and the camera 9, i.e. from the recorded values of acceleration over time in the three spatial axes, angular velocity over time in the three spatial axes, and based on the visual elements detected in the visual images from the camera 9. To detect the visual elements, the computational unit of the tracking camera comprises a detection module, which means that a visual element detection algorithm is applied to the images from the camera 9. These visual elements are, for example, various visually prominent points in the surrounding environment, e.g. corners or edges visible in the images. The detection of the visual elements is preferable primarily because there is an order of magnitude reduction in the number of points to be considered, typically from thousands of pixels (depending on the resolution of the camera 9) to a few hundred visual elements. Each visual element is marked, spatially mapped, and arranged into a vector of visual elements. After a new image is acquired, the detection module compares the newly detected elements, arranged in the newly constructed visual element vector, with the elements of the previously constructed vector and computes the change in position based on this comparison.
The process described above is known as V-SLAM (Visual Simultaneous Localization and Mapping) technology, wherein the output of the computational unit of the tracking camera performing the V-SLAM is data with six degrees of freedom, namely the position of the tracking camera over time in the three spatial axes and the orientation of the tracking camera over time in the three spatial axes. Thus, in the first described embodiment, the data from the inertial sensor 8 and the camera 9 are fed to the data processing module 3 already in this combined and converted form, i.e. as position and orientation over time in the three spatial axes. In other words, the V-SLAM is used to compute the position and orientation of the tracking camera over time, and thus indirectly to monitor the position and orientation of the hand work tool 6 in the environment that is mapped using the camera 9. Commercially available tracking cameras can be used for these purposes.
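For illustration, one sample of such six-degree-of-freedom data could be represented as a simple record like the following Python sketch, where the field names and the quaternion orientation convention are assumptions made only for the example:

    from dataclasses import dataclass

    @dataclass
    class PoseSample:
        """One tracking camera output sample with six degrees of freedom."""
        t: float    # time stamp in seconds
        x: float    # position of the tracking camera in the three spatial axes
        y: float
        z: float
        q1: float   # orientation expressed as a quaternion
        q2: float
        q3: float
        q4: float

    sample = PoseSample(t=0.05, x=1.203, y=0.410, z=0.987,
                        q1=0.0, q2=0.7071, q3=0.7071, q4=0.0)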
As mentioned above, in addition to the inertial sensor 8, the sensor unit 1 also comprises the work state sensor 10, which is in the first described exemplary embodiment a sound sensor (or microphone). The sound sensor is used to record a sound signal during the step of performing the work operation for surface treatment of the piece 7, wherein this step of performing the work operation for surface treatment of the piece 7 is performed using the hand work tool 6 with the sensor unit 1 fastened, as will also be described below when describing the method of the present invention.
In the first exemplary embodiment, as indicated in fig. 2, the hand work tool 6 is a hand painting gun and the work operation for surface treatment of the piece 7 is industrial painting. This means that the work state sensor 10 provides information about whether the hand painting gun is activated, in other words, whether the hand painting gun is currently painting. This information is essential for generating the robotic instructions, since it is important for the stationary robot 5 to reproduce not only the motion (position and orientation) of the hand work tool 6 during the subsequent automated work operation but also when the hand work tool 6 has been activated, and thus when the robotic work tool 25 is to be activated.
Specifically, the work state of the hand work tool 6 (activated/not activated) is determined by analyzing this sound signal, which may be performed e.g. by the computational unit of this work state sensor 10 (not shown in the figures). For example, the computational unit of the work state sensor 10 is adapted to compare the values of the sound signal at different time points with predetermined threshold values and is also adapted to subsequently determine the work state of the hand work tool 6. For example, if the sound signal is higher than the first threshold value, the work state is indicated as activated, and if the sound signal is lower than the second threshold value, the work state is indicated as not activated. The information about the work state of the hand work tool 6 is fed to the data processing module 3, as will be further described below. In addition to determining the work state, the computational unit of the work state sensor 10 may also compute other information from the sound signal, e.g. about which sub-activity of a given work operation was performed at a given time, if these sub-activities differ in the sound produced by them.
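A minimal sketch of such a two-threshold comparison, with purely illustrative threshold and signal values, could look as follows:

    def work_state(signal, first_threshold=0.6, second_threshold=0.3):
        """Classify the work state over time using two thresholds (hysteresis)."""
        states, active = [], False
        for value in signal:
            if not active and value > first_threshold:
                active = True          # signal above the first threshold: activated
            elif active and value < second_threshold:
                active = False         # signal below the second threshold: not activated
            states.append(active)
        return states

    print(work_state([0.1, 0.2, 0.8, 0.7, 0.5, 0.2, 0.1]))
    # [False, False, True, True, True, False, False]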
In the first exemplary embodiment, the sound sensor is attached to the inner side of the housing of the sensor unit 1, namely to the inner wall of the housing closest to the hand work tool 6 to which the sensor unit 1 is being fastened. The sensor unit 1 further comprises a user interface 19 of the sensor unit 1, such as control buttons and signal diodes, which are used to control the sensor unit 1 and/or the control unit 2 and to interact with the control unit 2 during the recording of data by the sensor unit 1. The function of the user interface 19 of the sensor unit 1 will be further explained below in the description of the method.
The control unit 2 in the first exemplary embodiment of the system, as schematically shown in fig. 1, comprises a microcomputer 13, a memory 14, a communication module 15 for communicative connection to the remote server 21, a battery 16 for powering the microcomputer 13, a charging connector 17 for recharging the battery 16, and also a connector for electrical connection to the sensor unit 1 via a cable. The individual components are housed in the housing of the control unit 2, which is indicated in fig. 1 by the dashed line. The housing of the control unit 2 is further provided with at least one strap 18, which allows the user to carry the control unit 2 on the body, for example over the shoulder or on the back, and to move freely with it. The control unit 2 further also comprises the user interface 20 of the control unit 2, such as control buttons and signal diodes, which are used to control the control unit 2 and/or the sensor unit 1 and to interact with the sensor unit 1 during the recording of data by the sensor unit 1. The function of the user interface 20 of the control unit 2 will be further explained below in the description of the method.
The microcomputer 13 in the control unit 2 is powered by the battery 16 and further powers and controls other components of the control unit 2 (user interface 20 of the control unit 2, memory 14, communication module 15), as well as components of the sensor unit 1. Thus, the microcomputer 13 also specifically controls the inertial sensor 8, the camera 9, and the work state sensor 10 and controls the collection (recording, uploading) of data and implements the storage thereof in the memory 14 with which it works. The microcomputer 13 also controls the communication module 15, wherein the communication module 15 in the described first exemplary embodiment is a Wi-Fi module and the communicative connection of the control unit 2 and the remote server 21 is implemented as a wireless connection using Wi-Fi technology. This is indicated in fig. 1 or fig. 2 by a dashed arrow and also by a graphical signal symbol, since the communicative connection of the control unit 2 and the remote server 21 is used to transmit data recorded by the sensor unit 1 from the control unit 2 to the remote server 21, the so-called cloud. Thus, due to the mutual connection between the control unit 2 and the sensor unit 1, the remote server 21 is indirectly connected to the sensor unit 1 via the control unit 2. In other words, it can also be said that the data recorded by the sensor unit 1 are being uploaded to the remote server 21 via the control unit 2.
The remote server 21, as schematically illustrated in fig. 2, comprises, in the first exemplary embodiment, an uploaded data storage 22, the data processing module 3, and a generated instructions storage 24. The uploaded data are thus transmitted to the uploaded data storage 22 and further processed by the data processing module 3, the output of which is instructions for the stationary robot 5. More specifically, the instructions for the stationary robot 5 (the so-called robotic instructions) form an executable program for automated control of the stationary robot 5. Such programs are subsequently stored in the generated instructions storage 24, where they are ready for uploading to the stationary robot 5. When the program is uploaded into the controller of the stationary robot 5 and activated on the stationary robot 5, it implements automated control of the stationary robot 5 such that the robotic work tool 25 on the stationary robot 5 reproduces the required work operation for surface treatment of the piece 7 as performed by the hand work tool 6 with the sensor unit 1 attached.
As also schematically shown in fig. 2, the data processing module 3 comprises an instruction generator 23 comprising an interface for generating instructions in a programming language of at least one manufacturer of the stationary robots 5. However, in addition to the actual generation of the instructions, the data processing module 3 is also adapted for further processing of the data, which is performed prior to the generation of the instructions using the respective algorithms. Specifically, the data processing module 3 in the first exemplary embodiment comprises a data filtering module, a data time synchronization module, a dynamic subsampling module, and a spatial transformation application module. The significance of these individual steps of further processing is explained in the description of the method below. The data processing module 3 comprises sufficient computational capacity to generate instructions and to perform this further processing of the data.
The input to the data processing module 3, and therefore also to the instruction generator 23, is data from the inertial sensor 8 and the camera 9, wherein in this particular embodiment it is already combined data, i.e. position and orientation over time in the three spatial axes. Another input is also data from the work state sensor 10 comprising information about whether the hand work tool 6 is activated. Still another input is calibration data, which comprises a transformation matrix defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7, as will be described in even greater detail below. Based on this input data, the instruction generator 23 generates instructions for the stationary robot 5, which will also be described in more detail below when describing the method.
In the first exemplary embodiment, the system comprises a user interface of the data processing module 3 and also an access device 4 for accessing this user interface of the data processing module 3. In the first exemplary embodiment, this user interface of the data processing module 3 is a web interface that is a part of the remote server 21 and that allows the user to select the data to be processed, configure the data processing parameters, select e.g. the type (manufacturer) of the stationary robot 5 for which the instructions are to be generated, and then download the generated instructions from the remote server 21, specifically from the generated instructions storage 24. Thus, the access device 4 is communicatively connected to the remote server 21, particularly to at least the instruction generator 23, and to the generated instructions storage 24. The access device 4 is e.g. a computer, a mobile phone, or other device with web access.
In the following part, the method of generating instructions will be described in more detail, wherein this method in the first exemplary embodiment comprises the steps of:
- performing the calibration measurement to obtain calibration data,
- performing a work operation for surface treatment of the piece 7 using the hand work tool 6 with the sensor unit 1 attached,
- monitoring the position and orientation of the hand work tool 6 by the sensor unit 1 during the step of performing the work operation for surface treatment of the piece 7, wherein the step of monitoring the position and orientation of the hand work tool 6 comprises a step of recording acceleration and angular velocity over time by the inertial sensor 8 and a step of recording visual images of the surrounding environment at different time points by the camera 9,
- monitoring the work state of the hand work tool 6 by the work state sensor 10 during the step of performing the work operation for surface treatment of the piece 7, wherein the work state sensor 10 provides information about whether the hand work tool 6 is activated,
- providing calibration data to the data processing module 3, wherein the calibration data provided to the data processing module 3 comprise a transformation matrix defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7, and
- processing the data from the inertial sensor 8, the camera 9, and the work state sensor 10 as well as the calibration data by the data processing module 3, wherein the step of processing the data comprises a step of generating instructions for the stationary robot 5 in a programming language of at least one manufacturer of stationary robots 5.
First, the calibration measurement will be described which, in a first exemplary embodiment, is used to obtain the calibration data, namely to obtain four transformation matrices defining the relative spatial relationship between the reference coordinate systems of the sensor unit 1, the hand work tool 6, the stationary robot 5, and the piece 7.
First, a transformation matrix TGC is obtained defining the relative spatial relationship between the reference coordinate system G of the hand work tool 6 (denoted by the letter G according to the English “gun”) and the reference coordinate system C of the sensor unit 1 (denoted by the letter C according to the English “camera”, which is a part of the sensor unit 1).
This first part of the calibration is implemented using a first calibration fixture 26, which comprises a shape-defined calibration opening. This calibration opening is adapted for the insertion of the tip of the hand work tool 6 in exactly one defined way, i.e. with a specific position and a specific orientation of the hand work tool 6. In other words, the calibration opening together with the hand work tool 6 operate on a lock and key principle. The first calibration fixture 26 is schematically shown in fig. 2, wherein the insertion of the hand work tool 6 is indicated by an arrow leading to this calibration opening. The first calibration fixture 26 further comprises a set of first visual markers, which are visible after insertion of the hand work tool 6 into the calibration opening by the camera 9 that is a part of the sensor unit 1 attached on the hand work tool 6. The relative positions of these first visual markers and the calibration opening are defined and known. The visual markers are unique and each includes its own identifier to distinguish it from the other visual markers.
After inserting the tip of the hand work tool 6 with the fastened sensor unit 1 into the calibration opening and at the moment when the hand work tool 6 is correctly placed, the user gives a signal using the user interface 19 of the sensor unit 1, i.e. for example by pressing a button, to the control unit 2 to store the position and orientation information provided by the sensor unit 1. The sensor unit 1, namely the camera 9, simultaneously detects the first visual markers and determines its relative position and orientation to these individual first visual markers. From the known definition of the geometry of the first calibration fixture 26 and the position and orientation information from the sensor unit 1 stored during this part of the calibration, a transformation matrix TGC is then constructed defining the relative spatial relationship between the reference coordinate system G of the hand work tool 6 and the reference coordinate system C of the sensor unit 1. This first part of the calibration is performed every time the user fastens the sensor unit 1 to the hand work tool 6 in order to demonstrate the work operation for subsequent automation.
Further, a transformation matrix TCR defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5 (denoted by the letter R after the English “robot”), namely the robotic work tool 25, is obtained in an analogous manner.
This second part of the calibration is also implemented using the first calibration fixture 26, however, instead of the hand work tool 6, the robotic work tool 25 is inserted into the calibration opening (guided by the controller of the stationary robot 5), to which the sensor unit 1 is attached for the purposes of this part of the calibration. After the user gives a signal to the control unit 2, the camera 9 detects the first visual markers and measures its relative position and orientation with respect to these first visual markers. From the known definition of the geometry of the first calibration fixture 26 and the position and orientation information from the sensor unit 1 stored during this part of the calibration, a transformation matrix TCR is then constructed defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5, specifically the robotic work tool 25.
Further, a transformation matrix TCP defining the relative spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system P of the piece 7 (denoted by the letter P after the English “product” or “part” for the piece 7) is obtained. This third part of the calibration is implemented by a second calibration fixture 27, which comprises a second visual marker that is located at a fixed relative distance from the piece 7. This fixed relative distance does not need to be known but must remain constant during the calibration and during the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6, i.e. during the demonstration. The second calibration fixture 27 is schematically shown in fig. 2 for the piece 7, which is suspended from a support frame for a work operation, e.g. for painting.
The user performs this part of the calibration by bringing the hand work tool 6 closer to the second calibration fixture 27 and pointing the camera 9 in the sensor unit 1 at the second visual marker. The sensor unit 1 recognizes the second visual marker, since the second visual marker is, like all visual markers used in the calibration, unique, and measures its relative position and orientation with respect thereto. The user then gives a signal using the user interface 19 of the sensor unit 1, for example by pressing a button, to the control unit 2 to store the position and orientation data with respect to the reference coordinate system C of the sensor unit 1 that is provided at that moment by the sensor unit 1. This data, together with the data obtained from the first part of the calibration (between the reference coordinate system G of the hand work tool 6 and the reference coordinate system C of the sensor unit 1), is subsequently used to construct a transformation matrix TCP defining the spatial relationship between the reference coordinate system C of the sensor unit 1 and the reference coordinate system P of the piece 7. This calibration is performed at the beginning of each demonstration of the work operation for which the user wants to generate a robotic program.
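Continuing the sketch above, one plausible reading of this third part treats the frame of the second visual marker as a stand-in for the reference coordinate system P, since the fixed offset between the marker and the piece 7 never has to be known explicitly; this reading, and the variable names below, are assumptions of the sketch.

```python
# T_C_marker2: pose of the second visual marker measured in the camera frame C
# at the moment the user presses the button (the marker frame stands in for P).
T_C_P = T_C_marker2
# With T_GC from the first calibration part, a demonstrated pose of the hand
# work tool 6 can later be expressed relative to the piece 7:
T_P_G = invert(T_C_P) @ invert(T_GC)
```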
Furthermore, a transformation matrix TRP defining the relative spatial relationship between the reference coordinate system R of the stationary robot 5 and the reference coordinate system P of the piece 7 is obtained in an analogous manner.
This fourth part of the calibration is also implemented using the second calibration fixture 27; however, instead of the hand work tool 6, the robotic work tool 25, to which the sensor unit 1 is attached for the purpose of this part of the calibration, is brought closer to the second calibration fixture 27 with the second visual marker (guided by the controller of the stationary robot 5). The relative position of the second calibration fixture 27 and the piece 7 is fixed but does not have to be known; however, it must remain constant during the calibration and during the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6, i.e. during the demonstration. The user thus performs this part of the calibration by aiming the camera 9 at the second visual marker; after the robotic work tool 25 approaches the second visual marker, the camera 9 detects this second visual marker and determines its relative position and orientation with respect thereto. Further, the user notes the position and orientation of the robotic work tool 25 in the reference coordinate system R of the stationary robot 5, using the controller of the stationary robot 5. This data, together with the data obtained from the second part of the calibration (between the reference coordinate system C of the sensor unit 1 and the reference coordinate system R of the stationary robot 5), is subsequently used to construct a transformation matrix TRP defining the relative spatial relationship between the reference coordinate system R of the stationary robot 5 and the reference coordinate system P of the piece 7. This calibration only needs to be performed once, assuming that the piece 7 for which the robotic instructions are generated will subsequently always be in the same place with respect to the stationary robot 5, which will perform an automated work operation on it according to the generated robotic program.
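In the notation of the earlier sketches, this fourth part can be expressed as a single chain of transforms; the marker frame is again assumed to stand in for P, and all names are illustrative rather than taken from the described embodiment.

```python
# T_R_tool25_now: robotic work tool pose reported by the robot controller while
# the camera observes the second marker; T_tool25_C: mounting transform from the
# second calibration part; T_C_marker2_robot: second marker pose in the camera frame.
T_R_P = T_R_tool25_now @ T_tool25_C @ T_C_marker2_robot
```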
At this point, it may be mentioned that the second part of the calibration is necessary because the camera 9 is used for the fourth part of the calibration (the stationary robot 5-the piece 7). Furthermore, it is also worth mentioning that the sensor unit 1 is attached on the robotic work tool 25 only for the purpose of calibration and is no longer attached on the robotic work tool 25 during the performance of the automated work operation.
In the first exemplary embodiment of the method, the step of performing the calibration measurement is followed by the step of performing the work operation for surface treatment of the piece 7 using the hand work tool 6 with the sensor unit 1 attached. During the performance of the work operation for surface treatment of the piece 7, i.e., for example, during the demonstrative painting of the piece 7 with the hand painting gun, the monitoring of the position and orientation and also the monitoring of the work state of the hand work tool 6 is performed by the sensor unit 1, as also described above. This monitoring (i.e. data recording/logging) is initiated by the user, e.g. by means of the user interface 19 of the sensor unit 1 or the user interface 20 of the control unit 2, signaling to the control unit 2 the start of the recording. After performing the work operation for surface treatment of the piece 7 using the hand work tool 6, the user signals the end of the recording in a similar way. During the recording, the data from the sensor unit 1 are stored in the memory 14. After the recording is finished, the stored data are transferred via wireless (Wi-Fi) communication to the remote server 21, where they are processed by the data processing module 3 to generate instructions for the stationary robot 5. The calibration data are also transferred to the input of the data processing module 3.
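A minimal sketch of such a recording step is given below, assuming a hypothetical sensor-unit driver that yields time-stamped inertial, camera and work-state samples; the field layout and the serialization format are assumptions, not part of the described embodiment.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Sample:
    t: float                               # timestamp in seconds
    accel: tuple[float, float, float]      # from the inertial sensor 8
    gyro: tuple[float, float, float]
    image_id: str | None                   # reference to a frame from the camera 9
    tool_active: bool                      # from the work state sensor 10

def record_demonstration(sensor_unit, stop_requested) -> list[Sample]:
    """Log samples between the user's start and stop signals."""
    log: list[Sample] = []
    while not stop_requested():
        s = sensor_unit.read_sample()      # hypothetical driver call
        log.append(Sample(t=time.time(), accel=s.accel, gyro=s.gyro,
                          image_id=s.image_id, tool_active=s.tool_active))
    return log

def dump_for_upload(log: list[Sample], path: str) -> None:
    """Serialize the recording before transfer to the remote server 21."""
    with open(path, "w") as f:
        json.dump([asdict(s) for s in log], f)
```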
In the described first exemplary embodiment of the method, as schematically shown in fig. 3, the processing of the data comprises several sub-steps, namely the step of filtering the data, the step of time synchronizing the data, the step of dynamic subsampling, the step of applying spatial transformation based on the provided calibration data, and subsequently the step of generating instructions for the stationary robot 5. By applying the spatial transformation based on the calibration data, the recorded data are transformed into the reference coordinate system R of the stationary robot 5, which makes it possible to generate instructions expressed in this reference coordinate system R of the stationary robot 5.
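The following Python sketch illustrates one simple way these sub-steps could be realized on pose samples represented as 4x4 matrices; the concrete filtering, synchronization and subsampling strategies shown are stand-ins chosen for brevity, not the specific algorithms of the data processing module 3.

```python
import numpy as np

def moving_average_filter(poses: list[np.ndarray], window: int = 5) -> list[np.ndarray]:
    """Smooth the translations with a moving average (rotations kept as recorded)."""
    out = [p.copy() for p in poses]
    xyz = np.array([p[:3, 3] for p in poses])
    for i in range(len(poses)):
        lo, hi = max(0, i - window // 2), min(len(poses), i + window // 2 + 1)
        out[i][:3, 3] = xyz[lo:hi].mean(axis=0)
    return out

def synchronize(pose_times: np.ndarray, state_times: np.ndarray, states: list[bool]) -> list[bool]:
    """For each pose timestamp, pick the latest work-state sample that is not newer."""
    idx = np.searchsorted(state_times, pose_times, side="right") - 1
    return [states[max(int(i), 0)] for i in idx]

def dynamic_subsample(poses: list[np.ndarray], min_dist: float = 0.005) -> list[np.ndarray]:
    """Keep a pose only when the tool has moved at least min_dist metres since the last kept pose."""
    kept, last = [], None
    for p in poses:
        if last is None or np.linalg.norm(p[:3, 3] - last[:3, 3]) >= min_dist:
            kept.append(p)
            last = p
    return kept

def to_robot_frame(poses_in_piece: list[np.ndarray], T_R_P: np.ndarray) -> list[np.ndarray]:
    """Express poses given in the piece frame P in the robot frame R using the calibration."""
    return [T_R_P @ p for p in poses_in_piece]
```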
The step of generating the instructions is performed by the instruction generator 23, which, in the first described embodiment, is a part of the remote server 21 and which includes an interface for generating instructions for the stationary robot 5 in a programming language of at least one manufacturer of the stationary robots 5. The instruction generator 23 comprises information about the syntax, semantics, required structure and instruction format for the stationary robot 5, including information about which keywords in the programming language of the given manufacturer of the stationary robot 5 are used to write the individual instructions for the stationary robot 5. An executable robotic program is created by having the instruction generator 23 write the individual instructions according to the syntax and semantics rules and using the argument values resulting from the previous data processing.
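As a rough sketch of such a generator, the following Python function writes a RAPID-style .mod module containing PaintL instructions of the kind discussed in the next paragraph; the argument list is deliberately simplified (a real robtarget also carries configuration and external-axis data), and the module, speed, zone and tool names are assumptions.

```python
def write_paint_module(path: str, targets, speed: str = "v200",
                       zone: str = "z10", tool: str = "tPaintGun") -> None:
    """targets: iterable of (x, y, z, q1, q2, q3, q4) expressed in the robot frame R."""
    lines = ["MODULE PaintModule", "  PROC main()"]
    for (x, y, z, q1, q2, q3, q4) in targets:
        lines.append(
            f"    PaintL [[{x:.1f},{y:.1f},{z:.1f}],"
            f"[{q1:.4f},{q2:.4f},{q3:.4f},{q4:.4f}]], {speed}, {zone}, {tool};"
        )
    lines += ["  ENDPROC", "ENDMODULE"]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```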
For example, for an ABB painting robot, the instruction generator 23 creates a .mod file that includes the appropriate MODULE and PROCEDURE sections, and the PROCEDURE section lists the individual instructions on individual lines. These instructions begin with the PaintL keyword and are followed by the argument values of the instructions x, y, z, q1, q2, q3, q4 (and others), in place of which the instruction generator 23 inserts specific numbers that determine where the robotic work tool 25 should move, how fast, etc.

The method and system of the present invention may be further implemented in further exemplary embodiments other than the first exemplary embodiment described in detail above. The individual alternatives by which the further exemplary embodiments may differ are explained in the section with the summary of the invention.

Industrial Applicability
The method and system described above can be used to generate instructions useful in various work operations for surface treatment, such as painting, sanding, polishing, chamfering, blasting, etc. Furthermore, the generated instructions (robotic programs) can be uploaded not only into the controller of an actual stationary robot but also into the controller of a virtual stationary robot, i.e. a digital twin thereof for simulation purposes.
List of Reference Signs
1 sensor unit
2 control unit
3 data processing module
4 access device
5 stationary robot
6 hand work tool
7 piece
8 inertial sensor
9 camera
10 work state sensor
11 attachment element
12 extension
13 microcomputer
14 memory
15 communication module
16 battery
17 charging connector
18 strap
19 user interface of the sensor unit
20 user interface of the control unit
21 remote server
22 uploaded data storage
23 instruction generator
24 generated instructions storage
25 robotic work tool
26 first calibration fixture
27 second calibration fixture

Claims

1. A method of generating instructions for a stationary robot (5) based on a demonstration of a work operation for surface treatment of a piece (7), characterized in that it comprises the steps of:
- performing the work operation for surface treatment of the piece (7) using a hand work tool (6) with a sensor unit (1) attached,
- monitoring the position and orientation of the hand work tool (6) by the sensor unit (1) during the step of performing the work operation for surface treatment of the piece (7), wherein the step of monitoring the position and orientation of the hand work tool (6) comprises a step of recording acceleration and angular velocity over time by an inertial sensor (8) and a step of recording visual images of the surrounding environment at different time points by at least one camera (9),
- monitoring the work state of the hand work tool (6) by a work state sensor (10) during the step of performing the work operation for surface treatment of the piece (7), wherein the work state sensor (10) provides information about whether the hand work tool (6) is activated,
- providing calibration data to a data processing module (3), wherein the calibration data provided to the data processing module (3) comprise information about the relative spatial relationship between the reference coordinate systems of the sensor unit (1), the hand work tool (6), the stationary robot (5), and the piece (7), and
- processing the data from the inertial sensor (8), the at least one camera (9), and the work state sensor (10) as well as the calibration data by the data processing module (3), wherein the step of processing the data comprises a step of generating instructions for the stationary robot (5) in a programming language of at least one manufacturer of the stationary robots (5).
2. The method according to claim 1, characterized in that the provided calibration data comprise at least information about the relative spatial relationship between the reference coordinate system G of the hand work tool (6) and the reference coordinate system C of the sensor unit (1), information about the relative spatial relationship between the reference coordinate system C of the sensor unit (1) and the reference coordinate system P of the piece (7), and information about the relative spatial relationship between the reference coordinate system R of the stationary robot (5) and the reference coordinate system P of the piece (7).
3. The method according to any one of the preceding claims, characterized in that for obtaining the calibration data, which are provided to the data processing module (3), the method comprises a step of performing calibration measurement.
4. The method according to claims 2 and 3, characterized in that unique visual markers are used for the step of performing the calibration measurement, wherein the information about the relative spatial relationship between the reference coordinate systems G and C, the information about the relative spatial relationship between the reference coordinate systems C and P, the information about the relative spatial relationship between the reference coordinate systems R and P, and additionally also the information about the relative spatial relationship between the reference coordinate systems C and R, is obtained by detecting the visual markers by the at least one camera (9) and by determining the relative orientation of the sensor unit (1) with respect to the visual markers.
5. The method according to any one of the preceding claims, characterized in that the step of generating instructions for the stationary robot (5) is performed by an instruction generator (23) comprising information about syntax, semantics, required structure and instruction format for the stationary robot (5) of at least one manufacturer of the stationary robots (5), wherein the instructions are written by the instruction generator (23) in the required structure and format based on the syntax and semantics information and include keywords and argument values that are the result of the step of processing the data by the data processing module (3).
6. The method according to any one of the preceding claims, characterized in that the step of processing the data by the data processing module (3) further comprises, prior to the step of generating the instructions for the stationary robot (5), a step of filtering the data, step of time synchronizing the data, step of dynamic subsampling, and step of applying spatial transformation based on the provided calibration data.
7. The method according to any one of the preceding claims, characterized in that the step of monitoring the work state is performed by a sound sensor recording a sound signal during the step of performing the work operation for surface treatment of the piece (7).
8. A system for performing the method according to any one of the preceding claims 1 to 7, characterized in that it comprises the sensor unit (1) and the control unit (2) communicatively connected to the sensor unit (1) and adapted to control the sensor unit (1), wherein the sensor unit (1) comprises:
- an attachment element (11) for attachment to the hand work tool (6),
- the inertial sensor (8) for recording acceleration and angular velocity over time,
- the at least one camera (9) for recording visual images of the surrounding environment at different time points, and
- a work state sensor (10) for monitoring the work state of the hand work tool (6), wherein the work state sensor (10) provides information about whether the hand work tool (6) is activated, wherein the system further comprises the data processing module (3) adapted to receive data from the inertial sensor (8), the at least one camera (9), and the work state sensor (10), as well as calibration data including information about the relative spatial relationship between the reference coordinate systems of the sensor unit (1), the hand work tool (6), the stationary robot (5), and the piece (7), wherein the data processing module (3) comprises an instruction generator (23) comprising an interface for generating instructions for the stationary robot (5) in a programming language of at least one manufacturer of the stationary robots (5).
9. The system according to claim 8, characterized in that it further comprises at least one first calibration fixture (26) with at least one calibration opening and a set of unique first visual markers, wherein the relative positions of the first visual markers with respect to each calibration opening are known, and wherein the at least one calibration opening is adapted in shape to receive the hand work tool (6) in only one particular position and orientation, and the at least one calibration opening is adapted in shape to receive the robotic work tool (25) in only one particular position and orientation, wherein the system further comprises a second calibration fixture (27) comprising a unique second visual marker, wherein the relative position of the second calibration fixture (27) and the piece (7) is fixed.
10. The system according to any one of the preceding claims 8 to 9, characterized in that the work state sensor (10) is a sound sensor.
11. The system according to any one of the preceding claims 8 to 10, characterized in that the control unit (2) comprises a microcomputer (13) for controlling the sensor unit (1), a memory (14) for storing data from the sensor unit (1), and a communication module (15) for communicative connection to the data processing module (3).
12. The system according to any one of the preceding claims 8 to 11, characterized in that it further comprises a remote server (21) communicatively connected to the control unit (2), wherein the data processing module (3) is a part of the remote server (21).
13. The system according to any one of the preceding claims 8 to 12, characterized in that it further comprises a user interface of the data processing module (3) for configuring the instruction generator (23) and also an access device (4) for accessing the user interface of the data processing module (3), wherein the user interface of the data processing module (3) is communicatively connected to the data processing module (3).