
WO2021261023A1 - Robot control system, control program, and control method - Google Patents


Info

Publication number
WO2021261023A1
WO2021261023A1 · PCT/JP2021/008669 · JP2021008669W
Authority
WO
WIPO (PCT)
Prior art keywords
robot
work
control
moving mechanism
positioning error
Prior art date
Legal status
Ceased
Application number
PCT/JP2021/008669
Other languages
English (en)
Japanese (ja)
Inventor
征彦 仲野
嵩史 大倉
圭 安田
Current Assignee
Omron Corp
Original Assignee
Omron Corp
Omron Tateisi Electronics Co
Priority date
Filing date
Publication date
Application filed by Omron Corp, Omron Tateisi Electronics Co filed Critical Omron Corp
Publication of WO2021261023A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J9/00 Programme-controlled manipulators
    • B25J9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements
    • B25J9/16 Programme controls

Definitions

  • This technology relates to robot control systems, control programs, and control methods.
  • Robots are used in various applications.
  • As an example, a robot used in combination with a visual sensor for assembling or mounting a component is known.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2001-036295) discloses a component mounting device that captures real images of a mounting component and a mounting substrate with an image pickup unit, superimposes them, and adjusts the relative position of the component while visually confirming the captured image.
  • Patent Document 2 (Japanese Patent Application Laid-Open No. 10-224095) discloses an electronic component mounting method for a high-speed electronic component mounting device that can mount components on a printed circuit board with high accuracy even when the influence of vibration becomes large.
  • Patent Document 3 discloses an electronic component mounting method that can secure the relative position between a plurality of components with high accuracy.
  • Patent Documents 1 to 3 all employ a method of measuring the positional error of parts that occurs in assembly or mounting in advance and correcting for it. Such a method can correct only statically generated position errors; it cannot deal with dynamic position errors that can occur due to various factors.
  • the purpose of this technology is to provide a robot control system that enables more accurate positioning control.
  • According to an aspect of this technology, a robot control system includes: a robot that causes a first work gripped by a hand to approach a second work; a moving mechanism that is mechanically connected to the robot and the hand and that produces a displacement between the robot and the hand; a first control unit that gives a control command to the robot so that the first work approaches the second work; an error calculation unit that calculates a positioning error occurring when the robot causes the first work to approach the second work; and a second control unit that gives a control command to the moving mechanism so as to compensate for the positioning error calculated by the error calculation unit.
  • According to this configuration, the positioning error that occurs when the robot moving the first work is operated at high speed is calculated, and a control command based on the calculated positioning error is given to the moving mechanism mechanically connected to the hand that grips the first work.
  • The error calculation unit may calculate the deviation of the first work from its target position as the positioning error.
  • The second control unit may give a control command to the moving mechanism so as to generate a displacement that cancels the deviation of the first work from the target position. According to this configuration, the deviation of the first work from its target position can be compensated in each control cycle.
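As an illustrative sketch (not the patent's implementation), the per-cycle position-control compensation described above can be expressed as follows; the function name, the millimeter units, and the stroke limit of the moving mechanism are assumptions introduced for the example:

```python
import numpy as np

def compensation_command(target_pos, measured_pos, max_stroke=5.0):
    """Per-cycle position-control compensation: output the deviation of
    the work from its target position directly as the displacement the
    moving mechanism should produce, clamped to a hypothetical per-axis
    stroke limit (mm)."""
    deviation = np.asarray(target_pos, dtype=float) - np.asarray(measured_pos, dtype=float)
    # Clamp so the command never exceeds the mechanism's travel range.
    return np.clip(deviation, -max_stroke, max_stroke)
```

Called once per control cycle with the latest target and measured positions, this yields a displacement that cancels the deviation within the assumed stroke.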
  • The error calculation unit may calculate the positioning error due to vibration generated in the robot based on the state values of the movable parts of the robot. According to this configuration, the component caused by vibration, which is a factor in the positioning error when the robot is moved at high speed, can be appropriately calculated.
  • The error calculation unit may calculate the positioning error due to deflection occurring in the robot based on the temporal change of the state values of the movable parts of the robot. According to this configuration, the component caused by deflection, which is a factor in the positioning error when the robot is moved at high speed, can be appropriately calculated.
  • the robot control system may further include an image processing device that optically detects the positions of the first work and the second work.
  • The error calculation unit may calculate the positioning error based on the positions of the first work and the second work detected by the image processing device. According to this configuration, position control can be realized without providing encoders in the robot and the moving mechanism.
  • The error calculation unit may calculate the temporal change in the position of the first work as the positioning error.
  • The second control unit may give a control command to the moving mechanism so as to generate a displacement that cancels the temporal change in the position of the first work. According to this configuration, positioning errors such as periodic position fluctuations occurring in the robot or the first work can be compensated for.
  • The second control unit may enable the control command for compensating for the positioning error only when the calculated magnitude of the positioning error satisfies a predetermined condition.
  • According to this configuration, since the control command for compensating for the positioning error is valid only when the calculated magnitude of the positioning error satisfies the predetermined condition, excessive control commands to the moving mechanism can be suppressed.
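One plausible form of such a "predetermined condition" is sketched below; the deadband and plausibility limit are hypothetical values, not taken from the patent, and serve only to show how gating suppresses excessive commands:

```python
import math

def gated_compensation(deviation, deadband=0.05, limit=2.0):
    """Enable the compensation command only when the error magnitude
    satisfies a condition: here, above a deadband (ignore negligible
    errors) and below a plausibility limit (reject implausibly large
    measurements). Both thresholds (mm) are illustrative assumptions."""
    magnitude = math.sqrt(sum(d * d for d in deviation))
    if magnitude < deadband or magnitude > limit:
        return [0.0] * len(deviation)   # compensation disabled this cycle
    return list(deviation)              # compensation enabled
```

With this gate, a noise-level error or a glitch in the measurement produces no displacement command at all.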
  • The first control unit, the error calculation unit, and the second control unit may execute their processing synchronously at a predetermined cycle. According to this configuration, since the control commands for the robot and the moving mechanism are calculated in synchronization, the control accuracy when the robot and the moving mechanism operate in coordination can be improved.
  • According to another aspect of this technology, a control program is provided that is executed by a computer of a robot control system including a robot that causes a first work gripped by a hand to approach a second work, and a moving mechanism that is mechanically connected to the robot and the hand and produces a displacement between the robot and the hand.
  • The control program causes the computer to execute: a step of giving a control command to the robot so that the first work approaches the second work; a step of calculating the positioning error that occurs when the robot causes the first work to approach the second work; and a step of giving a control command to the moving mechanism so as to compensate for the calculated positioning error.
  • According to yet another aspect of this technology, a control method is provided that is executed by a robot control system including a robot that causes a first work gripped by a hand to approach a second work, and a moving mechanism that is mechanically connected to the robot and the hand and produces a displacement between the robot and the hand.
  • The control method includes: a step of giving a control command to the robot so that the first work approaches the second work; a step of calculating the positioning error that occurs when the robot causes the first work to approach the second work; and a step of giving a control command to the moving mechanism so as to compensate for the calculated positioning error.
  • FIG. 1 is a schematic diagram showing an outline of the robot control system 1 according to the present embodiment.
  • the robot control system 1 includes a robot 200 that causes the first work 50 gripped by the hand 210 to approach the second work 60.
  • the second work 60 may be arranged on the work table 80.
  • the robot control system 1 includes a moving mechanism 300 mechanically connected to the robot 200 and the hand 210.
  • the moving mechanism 300 causes a displacement between the robot 200 and the hand 210.
  • the robot control system 1 has a control module 30 for controlling the robot control system 1.
  • the control module 30 may be realized in any mounting form.
  • control module 30 includes a first control module 32, an error calculation module 34, and a second control module 36.
  • The first control module 32 is the control logic in charge of controlling the robot 200 and gives a control command to the robot 200 so that the first work 50 approaches the second work 60. The first control module 32 also acquires state values (for example, encoder information indicating the position of each joint) from the robot 200.
  • The error calculation module 34 calculates the positioning error (in FIG. 1, simply referred to as "error") that occurs when the robot 200 causes the first work 50 to approach the second work 60.
  • The second control module 36 is the control logic in charge of controlling the moving mechanism 300; it gives control commands to the moving mechanism 300 and acquires state values (for example, encoder information indicating the position of each axis) from the moving mechanism 300.
  • the second control module 36 gives a control command to the moving mechanism 300 so as to compensate for the positioning error calculated by the error calculation module 34.
  • As methods of compensating for the positioning error, two approaches are assumed: a method (position control) that outputs the position error occurring in the robot 200 (or the first work 50 held by the hand 210) directly as the compensation amount, and a method (damping control) that outputs the temporal change of the position occurring in the robot 200 (or the first work 50 held by the hand 210) as the compensation amount.
  • The damping control is intended to suppress sudden changes in speed occurring in the hand 210 (or the first work 50 held by the hand 210).
  • In the damping control, the compensation amount may be calculated according to the time derivative (first derivative and/or higher-order derivatives) of the position occurring in the robot 200 (or the first work 50 held by the hand 210).
  • In this way, the moving mechanism 300 is controlled so as to calculate the position and/or the temporal change of the position while the robot 200 conveys the first work 50, and to compensate for the positioning error based on the calculation result. This makes it possible to realize a robot control system 1 capable of more accurate positioning control.
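The damping-control variant can be sketched as follows. The gain, the control period, and the backward-difference derivative estimate are illustrative assumptions, not details taken from the patent:

```python
class DampingCompensator:
    """Damping-control sketch: derive the compensation amount from the
    time derivative of the hand position, estimated by backward
    difference over the control period dt, so the moving mechanism
    opposes sudden velocity changes rather than the raw position error.
    Gain and dt are illustrative values."""

    def __init__(self, gain=0.5, dt=0.001):
        self.gain = gain
        self.dt = dt
        self.prev = None

    def update(self, pos):
        if self.prev is None:            # first cycle: no derivative yet
            self.prev = list(pos)
            return [0.0] * len(pos)
        vel = [(p - q) / self.dt for p, q in zip(pos, self.prev)]
        self.prev = list(pos)
        # Command a displacement opposing the estimated velocity over
        # one control period (vibration suppression).
        return [-self.gain * v * self.dt for v in vel]
```

Feeding the measured hand position in every cycle yields a command proportional to the first derivative of position, which is the damping behavior described above.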
  • FIG. 2 is a schematic diagram illustrating the overall configuration of the robot control system 1 according to the present embodiment.
  • FIG. 2 shows, as an example, an application for assembling two parts.
  • the robot control system 1 includes a robot 200 that grips and moves the first work 50.
  • the robot 200 assembles the first work 50 to the second work 60 arranged on the work table 80.
  • the first work 50 includes an electronic component 52 having a pair of pins 54.
  • the second work 60 includes a substrate 62 and an electronic component 64 arranged on the substrate 62.
  • The electronic component 64 is provided with a pair of holes 66 into which the pair of pins 54 are inserted.
  • a positioning marker 68 is provided on the substrate 62.
  • the camera 400 is arranged so that the first work 50 is included in the visual field range, and the positioning marker 68 is optically recognized by the image pickup by the camera 400.
  • As the robot 200, a vertical articulated robot is typically used.
  • the robot 200 includes a plurality of links 202 and joints 204 connecting the links 202 to each other.
  • The joint 204, sometimes referred to as an axis, is driven by a drive source such as a servomotor.
  • The joint 204 of the robot 200 is mechanically coupled to a drive source (not shown), and its relative or absolute position can be detected by the drive source or by a sensor (typically an encoder) attached to the joint 204.
  • The robot 200 is not limited to a vertical articulated robot; any robot such as a horizontal articulated (SCARA) robot or a parallel link robot can be used.
  • At the tip of the robot 200, the moving mechanism 300 and the hand 210 mechanically connected to the moving mechanism 300 are provided.
  • the hand 210 is an example of a tool attached to the robot 200, and can grip an arbitrary work.
  • The moving mechanism 300 generates a displacement for compensating for the positioning error, as described later. Any mechanism may be adopted as long as it can compensate for the positioning error.
  • FIG. 2 shows, as an example of the moving mechanism 300, an orthogonal mechanism that generates displacements in a plurality of axial directions (for example, three axes of X-axis, Y-axis, and Z-axis) that are orthogonal to each other.
  • the moving mechanism 300 includes movable axes 312, 314, and 316 that can change their positions in directions orthogonal to each other.
  • By moving along its corresponding axial direction, each of the movable axes 312, 314, and 316 can move the mechanically connected hand 210 in any of the three axial directions (X-axis, Y-axis, Z-axis).
  • The movable shafts 312, 314, and 316 of the moving mechanism 300 are mechanically coupled to servomotors 330 (see FIG. 5 and the like), and their relative or absolute positions can be detected by sensors (typically encoders) attached to the servomotors 330 or to the movable shafts 312, 314, and 316.
  • FIG. 3 is a diagram for explaining a problem that arises in an assembly application using a robot.
  • To insert the pair of pins 54 extending from the first work 50 including the electronic component 52 into the pair of holes 66 provided in the second work 60, a horizontal or diagonal approach is required.
  • For this reason, a vertical articulated robot having a high degree of freedom of movement is required.
  • FIG. 4 is a diagram for explaining another problem that arises in an assembly application using a robot.
  • When the robot is extended, the link 202 bends due to the robot's own weight and the weight of the gripped work, and it is difficult to secure sufficient positioning accuracy for the hand 210 attached to the tip.
  • Therefore, the robot control system 1 according to the present embodiment adopts, in combination, the robot 200 having a relatively high degree of freedom of movement and the moving mechanism 300 mechanically connected to the hand gripping the work.
  • This allows the robot 200 to be operated at high speed and in a wide variety of ways, so that various assembly processes can be realized at high speed and with high accuracy.
  • FIG. 5 is a schematic diagram illustrating the system configuration of the robot control system 1 according to the present embodiment.
  • The robot control system 1 includes a control device 100 and, networked with the control device 100 via a field network 10, a robot controller 250, a servo controller 350, and an image processing device 450.
  • the control device 100 exchanges data with a device connected to the field network 10 and executes a process as described later.
  • the control device 100 may be typically realized by a PLC (programmable logic controller).
  • The robot controller 250 is in charge of controlling the robot 200. More specifically, the robot controller 250 functions as an interface with the robot 200: in accordance with commands from the control device 100, it outputs commands for driving the robot 200, and it acquires the state values of the robot 200 and outputs them to the control device 100.
  • The servo controller 350 is in charge of controlling the servomotors 330 that drive the axes of the moving mechanism 300. More specifically, the servo controller 350 functions as an interface with the moving mechanism 300: in accordance with commands from the control device 100, it outputs a command for driving one of the axes constituting the moving mechanism 300 to the corresponding servomotor 330, and it acquires the state value of that servomotor 330 and outputs it to the control device 100.
  • the image processing device 450 executes various image recognition processes on the image captured by the camera 400.
  • The image processing device 450 detects the position of the second work 60 by performing a search process for the marker 68 and the like. Further, the image processing device 450 can also detect the position of the first work 50 when the first work 50 is included in the field of view of the camera 400. In this way, the image processing device 450 can optically detect the positions of the first work 50 and the second work 60.
  • As the protocol of the field network 10, an industrial network protocol such as EtherCAT (registered trademark) or EtherNet/IP can be used; in the present embodiment, EtherCAT is adopted as the protocol.
  • Data can be exchanged between the control device 100 and the devices connected to the field network 10 at a fixed cycle of, for example, several hundred μs to several ms.
  • Thereby, the robot 200 and the moving mechanism 300 included in the robot control system 1 can be controlled at high speed and with high accuracy.
  • the control device 100 may be connected to the display device 600 and the server device 700 via the upper network 20.
  • For the upper network 20, an industrial network protocol such as EtherNet/IP can be used.
  • the control device 100 may be connected to a support device 500 for installing a user program executed by the control device 100 and performing various settings.
  • FIG. 6 is a schematic diagram showing a hardware configuration example of the control device 100 constituting the robot control system 1 according to the present embodiment.
  • The control device 100 includes a processor 102, a main memory 104, a storage 110, a memory card interface 112, an upper network controller 106, a field network controller 108, a local bus controller 116, and a USB controller 120 that provides a USB (Universal Serial Bus) interface. These components are connected via the processor bus 118.
  • The processor 102 corresponds to an arithmetic processing unit that executes control operations and is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like. Specifically, the processor 102 reads a program stored in the storage 110, expands it in the main memory 104, and executes it to realize control operations for the controlled object.
  • The main memory 104 is composed of a volatile storage device such as a DRAM (Dynamic Random Access Memory) or an SRAM (Static Random Access Memory).
  • the storage 110 is composed of, for example, a non-volatile storage device such as an SSD (Solid State Drive) or an HDD (Hard Disk Drive).
  • the storage 110 stores a system program 1102 for realizing basic functions, an IEC program 1104 and an application program 1106 created according to a control target, and the like.
  • the IEC program 1104 includes a group of instructions necessary for realizing the assembly process in the robot control system 1 according to the present embodiment.
  • the IEC program 1104 may typically include sequence and motion instructions.
  • The IEC program 1104 may be written in any language specified in IEC 61131-3, defined by the International Electrotechnical Commission (IEC). However, the IEC program 1104 may also include a program described in a manufacturer's own language other than the languages specified in IEC 61131-3.
  • the application program 1106 includes instructions for controlling the operation of the robot 200 and / or the moving mechanism 300.
  • The application program 1106 may include instructions written in a predetermined programming language (for example, a programming language for robot control such as the V+ language, or a programming language for NC control such as G-code).
  • The control module 30 (the first control module 32, the error calculation module 34, and the second control module 36) shown in FIG. 1 may be realized by the processor 102 executing the IEC program 1104 and/or the application program 1106.
  • the memory card interface 112 accepts a memory card 114, which is an example of a removable storage medium.
  • the memory card interface 112 can read and write arbitrary data to and from the memory card 114.
  • The upper network controller 106 exchanges data with arbitrary information processing devices (the display device 600, the server device 700, and the like shown in FIG. 5) via the upper network 20.
  • the field network controller 108 exchanges data with each device via the field network 10.
  • the field network controller 108 may function as a communication master of the field network 10.
  • the local bus controller 116 exchanges data with an arbitrary functional unit 130 included in the control device 100 via the local bus 122.
  • the functional unit 130 is, for example, an analog I / O unit that is in charge of input and / or output of an analog signal, a digital I / O unit that is in charge of input and / or output of a digital signal, a counter unit that receives pulses from an encoder, and the like. And so on.
  • the USB controller 120 exchanges data with an arbitrary information processing device (support device 500, etc.) via a USB connection.
  • FIG. 7 is a schematic diagram showing a hardware configuration example of the robot controller 250 constituting the robot control system 1 according to the present embodiment.
  • the robot controller 250 includes a field network controller 252 and a control processing circuit 260.
  • the field network controller 252 mainly exchanges data with the control device 100 via the field network 10.
  • the control processing circuit 260 executes arithmetic processing necessary for driving the robot 200.
  • the control processing circuit 260 includes a processor 262, a main memory 264, a storage 270, and an interface circuit 268.
  • the processor 262 executes a control operation for driving the robot 200.
  • the main memory 264 is composed of, for example, a volatile storage device such as a DRAM or SRAM.
  • the storage 270 is composed of, for example, a non-volatile storage device such as an SSD or an HDD.
  • the storage 270 stores a system program 272 for realizing control for driving the robot 200.
  • the system program 272 includes an instruction for executing a control operation related to the operation of the robot 200 and an instruction for an interface with the robot 200.
  • FIG. 8 is a schematic diagram showing a hardware configuration example of the servo controller 350 constituting the robot control system 1 according to the present embodiment.
  • the servo controller 350 includes a field network controller 352, a control processing circuit 360, and a drive circuit 380.
  • the field network controller 352 mainly exchanges data with the control device 100 via the field network 10.
  • the control processing circuit 360 executes arithmetic processing necessary for controlling the servomotor 330 that drives the moving mechanism 300.
  • the control processing circuit 360 includes a processor 362, a main memory 364, and a storage 370.
  • the processor 362 executes the control calculation related to the servomotor 330 that drives the moving mechanism 300.
  • the main memory 364 is composed of, for example, a volatile storage device such as a DRAM or SRAM.
  • the storage 370 is composed of, for example, a non-volatile storage device such as an SSD or an HDD.
  • the storage 370 stores a system program 372 for realizing drive control of the servomotor 330.
  • The system program 372 includes instructions for executing control operations related to the operation of the moving mechanism 300 and instructions for the interface with the moving mechanism 300.
  • the drive circuit 380 includes a converter circuit, an inverter circuit, and the like, generates electric power having a specified voltage, current, and phase according to a command value calculated by the control processing circuit 360, and supplies the electric power to the servomotor 330.
  • the servomotor 330 is mechanically coupled to any of the axes constituting the moving mechanism 300.
  • As the servomotor 330, a motor having characteristics suited to the moving mechanism 300 can be adopted.
  • The motor is not limited to a servomotor: an induction motor, a synchronous motor, a permanent magnet motor, or a reluctance motor may be adopted, and not only a rotary motor but also a linear motor may be adopted.
  • FIG. 9 is a schematic diagram showing a hardware configuration example of the image processing device 450 constituting the robot control system 1 according to the present embodiment.
  • The image processing device 450 includes a processor 452, a main memory 454, a storage 460, a memory card interface 462, an upper network controller 456, a field network controller 458, a USB controller 470, and a camera interface 466. These components are connected via the processor bus 468.
  • the processor 452 corresponds to an arithmetic processing unit that executes image processing, and is composed of a CPU, a GPU, and the like. Specifically, the processor 452 reads a program stored in the storage 460, expands it in the main memory 454, and executes it to realize arbitrary image processing.
  • the main memory 454 is composed of a volatile storage device such as DRAM or SRAM.
  • the storage 460 is composed of, for example, a non-volatile storage device such as an SSD or an HDD.
  • the storage 460 stores a system program 4602 for realizing basic functions, an image processing program 4604 created according to a control target, and the like.
  • the memory card interface 462 accepts a memory card 464, which is an example of a removable storage medium.
  • the upper network controller 456 exchanges data with an arbitrary information processing device via the upper network.
  • the field network controller 458 exchanges data with each device via the field network 10.
  • the USB controller 470 exchanges data with an arbitrary information processing device via a USB connection.
  • the camera interface 466 acquires the image captured by the camera 400 and gives various commands to the camera 400.
  • the support device 500 constituting the robot control system 1 according to the present embodiment may be realized by using a general-purpose personal computer as an example. Since the basic hardware configuration example of the support device 500 is well known, detailed description thereof will not be given here.
  • the display device 600 constituting the robot control system 1 according to the present embodiment may be realized by using a general-purpose personal computer as an example. Since the basic hardware configuration example of the display device 600 is well known, detailed description thereof will not be given here.
  • the server device 700 constituting the robot control system 1 according to the present embodiment may be realized by using a general-purpose personal computer as an example. Since the basic hardware configuration example of the server device 700 is well known, detailed description thereof will not be given here.
  • FIGS. 6 to 9 show configuration examples in which one or more processors provide the necessary functions by executing programs, but some or all of these functions may instead be implemented using dedicated hardware circuits (for example, an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array)).
  • FIGS. 10 to 12 are diagrams for explaining the assembly process using the robot control system 1 according to the present embodiment.
  • First, the position where the second work 60 is arranged is specified, and the relative position between the first work 50 and the second work 60 is determined. More specifically, the second work 60 is imaged by the camera 400 and the positioning marker 68 provided on the substrate 62 of the second work 60 is recognized, whereby the position of the second work 60 arranged on the stage plate 310 is determined.
  • Next, the trajectory 70 for making the first work 50 approach the second work 60, that is, the trajectory 70 for combining the first work 50 gripped by the robot 200 with the second work 60, is calculated. Then, the assembly process using the robot 200 and the moving mechanism 300 is started.
  • the robot 200 moves along the calculated trajectory 70.
  • the moving mechanism 300 compensates for the positioning error that occurs when the robot 200 moves along the trajectory 70.
  • More specifically, the positioning error generated in the first work 50 is sequentially calculated, and the movable shafts 312, 314, and 316 of the moving mechanism 300 are sequentially driven so as to compensate for the calculated positioning error.
  • In this way, the position of the second work 60 arranged on the stage plate 310 is adjusted so as to compensate for the positioning error caused in the first work 50.
  • That is, the position of the second work 60 is adjusted so that the positioning error due to the vibration and deflection caused by moving the robot 200 at relatively high speed is absorbed. This absorbs deviations from the predetermined relative position between the first work 50 and the second work 60.
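The per-cycle interplay of trajectory motion and error compensation can be illustrated with a toy closed loop. The trajectory, the sinusoidal vibration model, and every numeric value below are invented for illustration and are not taken from the patent:

```python
import math

def run_assembly(n_cycles=200, dt=0.001, vib_amp=0.2, vib_freq=25.0):
    """Toy closed loop: each control cycle the robot advances along a
    straight trajectory, a vibration term perturbs the hand, and the
    moving mechanism is commanded to cancel the measured deviation from
    the trajectory point. Returns the worst residual error without and
    with compensation. All values (mm, s, Hz) are illustrative."""
    worst_raw = worst_comp = 0.0
    for k in range(n_cycles):
        t = k * dt
        target = t * 100.0                                   # trajectory point
        vibration = vib_amp * math.sin(2 * math.pi * vib_freq * t)
        hand = target + vibration                            # tip with vibration
        compensation = target - hand                         # per-cycle cancellation
        worst_raw = max(worst_raw, abs(hand - target))
        worst_comp = max(worst_comp, abs(hand + compensation - target))
    return worst_raw, worst_comp
```

In this idealized model the compensated residual is zero by construction; the point is the structure of the loop, where the error is recomputed and cancelled every control cycle.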
  • Although FIGS. 10 to 12 show a configuration example in which the camera 400 and the image processing device 450 are used to optically recognize the position of the positioning marker 68, the camera 400 and the image processing device 450 are not indispensable: if the position where the second work 60 is arranged is predetermined, the camera 400 and the image processing device 450 may be omitted.
  • In the robot control system 1, a plurality of devices cooperate to realize the processing.
  • The position information managed by each device is often defined in mutually independent coordinate systems. Therefore, positioning control may be realized using a common reference coordinate system.
  • FIG. 13 is a diagram showing an example of a coordinate system defined in the robot control system 1 according to the present embodiment. With reference to FIG. 13, the positioning control of the entire robot control system 1 is realized by a common reference coordinate system.
  • the tip position of the robot 200 (the position where the robot 200 and the moving mechanism 300 are mechanically connected) is defined by the robot coordinate system defined based on the installation position of the robot 200.
  • the position of the moving mechanism 300 connected to the tip of the robot 200 (the position where the moving mechanism 300 and the hand 210 are mechanically connected) is defined by the hand coordinate system defined based on the tip position of the robot 200.
  • the position of the first work 50 can be calculated by the combination of the robot position (robot coordinate system) and the movement mechanism position (hand coordinate system).
  • the position of the second work 60 is also calculated by the camera 400 and the image processing device 450.
  • the calculated position of the second work 60 is defined by the camera coordinate system (or image coordinate system) defined with reference to the inside of the image captured by the camera 400. That is, the position of the second work 60 can be defined by the camera coordinate system.
  • after converting the position specified in each coordinate system as described above into a position specified in the common reference coordinate system, the relative position may be calculated and the positioning control may be performed.
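As a concrete illustration of this conversion, the sketch below converts positions defined in the robot coordinate system and the camera coordinate system into a common reference coordinate system using 2-D homogeneous transforms, and then computes the relative position used for positioning control. The transform parameters (offsets, camera orientation) are illustrative assumptions, not calibration values from the actual system.

```python
import numpy as np

def make_transform(theta, tx, ty):
    """2-D homogeneous transform: rotation by theta, then translation (tx, ty)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

def to_reference(p_local, T_local_to_ref):
    """Convert a point given in a local coordinate system to the reference system."""
    p = np.array([p_local[0], p_local[1], 1.0])
    return (T_local_to_ref @ p)[:2]

# Illustrative "calibration" results (assumed values):
T_robot_to_ref = make_transform(0.0, 100.0, 50.0)      # robot base offset in the reference frame
T_camera_to_ref = make_transform(np.pi, 300.0, 200.0)  # camera assumed mounted rotated 180 deg

# First work in the robot coordinate system, second work in the camera coordinate system:
p_work1_ref = to_reference((20.0, 10.0), T_robot_to_ref)
p_work2_ref = to_reference((5.0, -3.0), T_camera_to_ref)

# Relative position in the common reference coordinate system:
relative = p_work2_ref - p_work1_ref
```

The same pattern extends to 3-D (4x4 homogeneous matrices) and to chained frames, e.g. composing the robot coordinate system with the hand coordinate system to obtain the position of the first work.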
  • FIG. 14 is a diagram for explaining an example of position information exchanged in the robot control system 1 according to the present embodiment.
  • the robot controller 250 acquires (1) the robot position (robot coordinate system) from the robot 200 and periodically sends it to the control device 100.
  • Each of the servo controllers 350 acquires the state value of the corresponding servomotor 330 and periodically sends it to the control device 100.
  • the set of the state values of each servomotor 330 constituting the moving mechanism 300 is (2) the moving mechanism position (hand coordinate system).
  • the image processing device 450 identifies the position where the second work 60 is arranged by image recognition processing based on the image captured by the camera 400, and periodically sends (3) the position of the second work 60 (camera coordinate system) to the control device 100.
  • the control device 100 converts each position into a position in the reference coordinate system by using conversion formulas acquired in advance based on the positional relationship of the robot arrangement, calibration, and the like, and then executes the processing necessary for the positioning control.
  • the moving mechanism 300 adopted in the robot control system 1 according to the present embodiment is not limited to an XYZ stage that can move in three axial directions; it may be an X stage that can move in one axial direction or an XY stage that can move in two axial directions. Alternatively, a Cartesian robot that can move in more axial directions may be adopted. Further, as the moving mechanism 300, a mechanism capable of generating a displacement along one or a plurality of rotation axes may be adopted.
  • FIG. 15 is a schematic diagram showing a configuration example of the moving mechanism 300 adopted in the robot control system 1 according to the present embodiment.
  • a movable shaft 312 that moves along the X axis
  • a movable shaft 314 that moves along the Y axis
  • a movable shaft 316 that moves along the Z axis.
  • the movable shaft 316 may be configured to be rotatable around the Z axis.
  • the configuration of the moving mechanism 300 may be determined according to the intended use.
  • <I. Processing procedure> Next, the processing procedure in the robot control system 1 according to the present embodiment will be described.
  • the deviation from the target position of the first work 50 is calculated as a positioning error.
  • the control device 100 gives a control command to the moving mechanism 300 so as to generate a displacement that cancels the deviation of the first work 50 from the target position.
  • FIG. 16 is a flowchart showing a processing procedure of an assembly process in the robot control system 1 according to the present embodiment. Each step shown in FIG. 16 is typically realized by the processor 102 of the control device 100 executing a program. The series of processes shown in FIG. 16 are repeatedly executed in a predetermined cycle (control cycle).
  • the control device 100 detects the position of the second work 60 arranged on the stage plate 310 (step S2), and calculates the trajectory 70 for inserting the first work 50 into the second work 60 (step S4). Then, the following processes of steps S10 to S26 are repeated until the first work 50 is inserted into the second work 60.
  • control device 100 acquires the encoder information of each joint of the robot 200 and the encoder information of each axis of the moving mechanism 300 (step S10).
  • the control device 100 calculates the current position (theoretical value) of the first work 50 held by the hand 210 of the robot 200 based on the acquired encoder information (step S12).
  • the "theoretical value” is a value calculated based on the encoder information of each joint of the robot 200, and does not reflect the positioning error caused by the deflection described later.
  • the control device 100 calculates the correction amount of the moving mechanism 300 for vibration, corresponding to the positioning error caused in the robot 200 by the vibration (step S14).
  • the calculated positioning error corresponds to the deviation of the first work 50 from the target position.
  • the calculated correction amount of the moving mechanism 300 corresponds to a control command for generating a displacement that cancels the deviation of the first work 50 from the target position.
  • the control device 100 calculates the positioning error caused by the vibration generated in the robot 200 based on the state values (for example, encoder information) of the movable portions (typically the joints) of the robot 200.
  • the control device 100 calculates the amount of deflection generated in each link 202 of the robot 200 based on the acquired encoder information (step S16).
  • the control device 100 calculates the current position (actual value) of the first work 50 based on the calculated amount of deflection (step S18).
  • the control device 100 calculates a correction amount of the movement mechanism 300 due to the deflection corresponding to the positioning error caused in the robot 200 due to the deflection (step S20).
  • the calculated positioning error corresponds to the deviation of the first work 50 from the target position.
  • the calculated correction amount of the moving mechanism 300 corresponds to a control command for generating a displacement that cancels the deviation of the first work 50 from the target position.
  • control device 100 calculates the positioning error caused by the deflection caused in the robot 200.
  • the control device 100 determines whether or not the combined result of the correction amount of the moving mechanism 300 due to vibration and the correction amount of the moving mechanism 300 due to deflection is within a predetermined threshold value (step S22). That is, the control device 100 calculates the positioning error that occurs when the robot 200 causes the first work 50 to approach the second work 60.
  • when the combined result of the correction amount of the moving mechanism 300 due to vibration and the correction amount of the moving mechanism 300 due to deflection exceeds the predetermined threshold value (NO in step S22), the control device 100 updates the target position of the moving mechanism 300 so as to compensate for the error between the target position of the first work 50 and the current position (actual value) (step S24).
  • if the error between the target position of the first work 50 and the current position (actual value) is within the predetermined threshold value (YES in step S22), the process of step S24 is skipped.
  • the control device 100 outputs the target position of the next cycle of the robot 200 to the robot controller 250, and outputs the target position of the next cycle of the moving mechanism 300 to the servo controller 350 (step S26). That is, the control device 100 gives a control command to the robot 200 so that the first work 50 approaches the second work 60 along the trajectory 70. Further, the control device 100 gives a control command to the moving mechanism 300 so as to compensate for the calculated positioning error (specifically, when the determination in step S22 is NO).
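The loop of steps S10 to S26 can be summarized in a short sketch. The scalar error model and the threshold value below are illustrative assumptions; the actual system works with multi-axis positions and with the correction amounts computed in steps S14 and S20.

```python
THRESHOLD = 0.05  # assumed tolerance for enabling the compensating command (step S22)

def control_cycle(robot_target, vibration_err, deflection_err):
    """One simplified control cycle of the FIG. 16 procedure.

    The robot keeps following its trajectory (step S26), while a stage
    correction is issued only when the combined positioning error from
    vibration (step S14) and deflection (step S20) exceeds THRESHOLD
    (steps S22/S24)."""
    combined_err = vibration_err + deflection_err
    if abs(combined_err) > THRESHOLD:         # NO in step S22
        stage_correction = -combined_err      # step S24: cancel the error
    else:                                     # YES in step S22
        stage_correction = 0.0                # step S24 skipped
    return robot_target, stage_correction     # step S26 outputs both targets

robot_cmd, stage_cmd = control_cycle(10.0, 0.05, 0.03)
```

Calling the function once per control cycle with freshly computed error estimates reproduces the repeated structure of the flowchart.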
  • in step S2, the image processing device 450 detects the position (camera coordinate system) of the second work 60 by performing a search process for the positioning marker 68 on the image captured by the camera 400.
  • the control device 100 converts the position of the second work 60 (camera coordinate system) into a position in the reference coordinate system and calculates the initial position P_w2(0) of the second work 60. That is, the image processing device 450 optically detects the position of the second work 60 supported by the moving mechanism 300.
  • in step S4, the control device 100 calculates the initial position P_w1(0) of the first work 50 based on the initial position of the robot 200, and, from the calculated initial position P_w1(0) and the initial position P_w2(0), calculates the trajectory 70 for inserting the first work 50 into the second work 60.
  • the trajectory 70 may be defined as a set of points (in the reference coordinate system) to be passed by the first work 50 held by the robot 200, or may be defined using a function defined in the reference coordinate system.
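A minimal sketch of the first representation (a set of via points) is shown below. The straight-line interpolation is an assumption made only for illustration, since the actual trajectory planner is not described here.

```python
import numpy as np

def via_points(p_start, p_goal, n_points):
    """Trajectory 70 as a set of points (reference coordinate system)
    to be passed by the first work: here, a straight line from the
    initial position to the insertion position."""
    p_start = np.asarray(p_start, dtype=float)
    p_goal = np.asarray(p_goal, dtype=float)
    return [p_start + s * (p_goal - p_start)
            for s in np.linspace(0.0, 1.0, n_points)]

# Illustrative positions: start 100 units above, descend to the insertion point.
traj = via_points((0.0, 0.0, 100.0), (50.0, 20.0, 0.0), 5)
```

The functional representation mentioned in the text would replace the list of points with, e.g., a parametric curve evaluated at each control cycle.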
  • the target position of the robot 200 at time t, which is the current cycle, is denoted R_d(t), and the current position of the robot 200 is denoted R(t).
  • the control device 100 calculates the current position (theoretical value) R(t) of the robot 200 based on the encoder information of the robot 200 (the encoder value E_Ri(t) of each joint i), and further calculates the current position (theoretical value) P_w1(t) of the first work 50. At the same time, the control device 100 calculates the current position S(t) of the moving mechanism 300 based on the encoder information of the moving mechanism 300 (the encoder value E_Si(t) of each axis i).
  • the function f takes, as input variables, the encoder value E_Ri(t) of each joint and the time rate of change of the encoder value E_Ri(t) of each joint.
  • the control device 100 calculates, from the torque τ_i(t) calculated for each joint, the deflection amount δ_j(t) generated in each link 202. Finally, the control device 100 combines the deflection amounts δ_j(t) generated in the respective links 202 to calculate the correction amount Δ_f(t) of the moving mechanism 300 caused by the deflection.
  • the positioning error due to the deflection may be calculated based on the temporal change of the state values (for example, encoder information) of the movable portions (typically the joints) of the robot 200.
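As an illustration of the deflection calculation in steps S16 to S20, the sketch below estimates a per-link deflection from the joint torque using a simple cantilever beam model and sums the contributions. Both the model (delta = tau * L^2 / (2 E I)) and the numeric parameters are assumptions introduced here; the actual system composes the link deflections δ_j(t) geometrically rather than as a scalar sum.

```python
def link_deflection(torque, length, stiffness_ei):
    """Tip deflection of a link modeled as a cantilever loaded by an
    end moment equal to the joint torque: delta = M * L^2 / (2 * E * I)."""
    return torque * length ** 2 / (2.0 * stiffness_ei)

def total_deflection(torques, lengths, stiffness_ei):
    """Combine the per-link deflections delta_j(t) into a single scalar
    correction (the real system combines them as spatial displacements)."""
    return sum(link_deflection(t, l, stiffness_ei)
               for t, l in zip(torques, lengths))

# Two links with illustrative torques [N*m], lengths [m], and stiffness EI [N*m^2]:
delta = total_deflection(torques=[2.0, 1.0], lengths=[0.5, 0.4],
                         stiffness_ei=1000.0)
```

The resulting deflection estimate plays the role of the correction amount Δ_f(t) that the moving mechanism 300 is driven to cancel.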
  • based on the trajectory 70 calculated in advance, the control device 100 calculates, from the target position P_dw1(t) of the first work 50 at time t, the target position P_dw1(t+1) of the first work 50 at time t+1, which is the next cycle, and further calculates the target position R_d(t+1) of the robot 200. In this way, the control device 100 calculates a new target position of the robot 200 based on the current position of the first work 50.
  • Positioning control based on camera image
  • in the above, a processing example was shown in which positioning control is performed using state values (for example, encoder information) acquired from the robot 200 and the moving mechanism 300; however, positioning control may instead be performed using the positions of the first work 50 and the second work 60 optically recognized by imaging with the camera 400.
  • the control device 100 calculates the positioning error based on the position of the first work 50 and the position of the second work 60 detected by the image processing device 450.
  • below, an implementation example adopting such camera-based position control is shown.
  • FIG. 17 is a flowchart showing another processing procedure of the assembly processing in the robot control system 1 according to the present embodiment.
  • Each step shown in FIG. 17 is typically realized by the processor 102 of the control device 100 executing a program.
  • the series of processes shown in FIG. 17 are repeatedly executed in a predetermined cycle (control cycle).
  • the processing procedure shown in FIG. 17 is obtained by changing steps S10 to S26 included in the processing procedure shown in FIG. 16 to steps S30 to S40.
  • control device 100 acquires the positions of the first work 50 and the second work 60 detected based on the image captured by the camera 400 (step S30).
  • the control device 100 calculates the positioning error that occurs in the robot 200 based on the acquired positions of the first work 50 and the second work 60 (step S32), and calculates the correction amount of the moving mechanism 300 that compensates for the calculated positioning error (step S34).
  • the control device 100 determines whether or not the calculated correction amount of the moving mechanism 300 is within a predetermined threshold value (step S36).
  • when the calculated correction amount of the moving mechanism 300 exceeds the predetermined threshold value (NO in step S36), the control device 100 updates the target position of the moving mechanism 300 for compensating for the positioning error occurring in the robot 200 (step S38).
  • if the calculated correction amount of the moving mechanism 300 is within the predetermined threshold value (YES in step S36), the process of step S38 is skipped.
  • control device 100 outputs the target position of the next cycle of the robot 200 to the robot controller 250, and outputs the target position of the next cycle of the moving mechanism 300 to the servo controller 350 (step S40).
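The camera-based loop of steps S30 to S40 can be sketched as follows. The threshold value and the planar (2-D) positions are illustrative assumptions; the detected positions stand in for the image-recognition results acquired in step S30.

```python
import numpy as np

CORRECTION_THRESHOLD = 0.1  # assumed threshold for step S36

def camera_cycle(p_work1, p_work2, relative_target):
    """One simplified cycle of the FIG. 17 procedure: the positioning
    error is computed directly from the optically detected positions
    (step S32), and the stage correction is enabled only when it
    exceeds the threshold (steps S34 to S38)."""
    error = (np.asarray(p_work1, float) - np.asarray(p_work2, float)
             - np.asarray(relative_target, float))
    correction = -error                                     # step S34
    if np.linalg.norm(correction) <= CORRECTION_THRESHOLD:  # YES in step S36
        correction = np.zeros_like(correction)              # step S38 skipped
    return correction

# Detected positions 0.3 units off the desired relative position:
corr = camera_cycle((10.0, 5.0), (9.5, 5.0), (0.2, 0.0))
```

Because the error is measured optically, this variant needs no kinematic model of the robot; it simply drives the stage to restore the desired relative position.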
  • a temporal change in the position of the first work 50 is calculated as a positioning error.
  • the control device 100 gives a control command to the moving mechanism 300 so as to generate a displacement that cancels the temporal change in the position of the first work 50.
  • the following processing example adds damping control to the implementation example adopting the position control shown in FIG. 16.
  • FIG. 18 is a flowchart showing still another processing procedure of the assembly processing in the robot control system 1 according to the present embodiment.
  • Each step shown in FIG. 18 is typically realized by the processor 102 of the control device 100 executing a program.
  • the series of processes shown in FIG. 18 are repeatedly executed in a predetermined cycle (control cycle).
  • the processing procedure shown in FIG. 18 is obtained by adding steps S50 to S56 to the processing procedure shown in FIG.
  • control device 100 calculates the moving speed of the robot 200 (step S50). In particular, the control device 100 calculates the moving speed of the hand 210 attached to the tip of the robot 200. Then, the control device 100 determines whether or not the calculated movement speed is within a predetermined threshold value (step S52). That is, the control device 100 determines whether or not the calculated temporal change in the position of the first work 50 satisfies a predetermined condition.
  • when the calculated movement speed exceeds the predetermined threshold value (NO in step S52), the control device 100 calculates the compensation amount corresponding to the damping control based on the calculated movement speed (step S54). Then, the control device 100 updates the target position of the moving mechanism 300 based on the calculated compensation amount (step S56). At this time, the control device 100 may set a target speed in addition to, or instead of, the target position of the moving mechanism 300.
  • if the calculated movement speed is within the predetermined threshold value (YES in step S52), the processes of steps S54 and S56 are skipped.
  • control device 100 outputs the target position of the next cycle of the robot 200 to the robot controller 250, and outputs the target position of the next cycle of the moving mechanism 300 to the servo controller 350 (step S26).
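Steps S50 to S56 can be sketched as a speed-dependent compensation. The speed-proportional law and both constants are illustrative assumptions; the sketch only shows the branch structure of the flowchart (compensate when the hand moves faster than the threshold, otherwise skip).

```python
SPEED_THRESHOLD = 50.0  # assumed threshold for step S52
DAMPING_GAIN = 0.002    # assumed gain mapping speed to a position offset

def damping_compensation(hand_speed):
    """Steps S50 to S56 (simplified): when the moving speed of the
    hand 210 exceeds the threshold, return a speed-proportional offset
    to be added to the target position of the moving mechanism 300."""
    if abs(hand_speed) <= SPEED_THRESHOLD:   # YES in step S52
        return 0.0                           # steps S54 and S56 skipped
    return -DAMPING_GAIN * hand_speed        # step S54: compensation amount

offset = damping_compensation(120.0)
```

As the text notes, the same compensation could alternatively be issued as a target speed for the moving mechanism rather than a position offset.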
  • as processing examples, position control using state values (FIG. 16), position control using images captured by the camera 400 (FIG. 17), and position control combined with damping control using state values (FIG. 18) have been described above; these may be combined as appropriate.
  • for example, the processing procedure using state values (FIG. 16) and the position control using images captured by the camera 400 (FIG. 17) may be combined. More specifically, the processing procedure using state values (see FIG. 16) may be executed until the first work 50 reaches the field of view of the camera 400, after which the position control using images captured by the camera 400 (see FIG. 17) may be executed.
  • further, although FIG. 18 shows a processing example in which damping control is added to the position control using state values (FIG. 16), a processing example in which damping control is added to the position control using images captured by the camera 400 (FIG. 17) may also be adopted.
  • FIG. 18 shows an example of processing in which damping control is added to the position control using state values (FIG. 16), but the position control using state values may be omitted and only the damping control using state values may be implemented.
  • similarly, in the position control using images captured by the camera 400 (FIG. 17), the position control may be omitted and only the damping control using images captured by the camera 400 may be implemented.
  • the position control and the damping control can be implemented independently of each other according to the conditions for combining the first work 50 and the second work 60.
  • in the processing examples described above, the control device 100 enables the control command for compensating for the positioning error when the calculated magnitude of the positioning error satisfies a predetermined condition (in this example, exceeding the threshold value; step S22 of FIG. 16 and step S36 of FIG. 17). Alternatively, the control command for compensating for the positioning error may be enabled at all times, regardless of whether or not the predetermined condition is satisfied.
  • similarly, the control device 100 enables the control command for compensating for the positioning error when the calculated temporal change in the position of the first work 50 satisfies a predetermined condition (in this example, exceeding the threshold value). Alternatively, this control command may also be enabled at all times, regardless of whether or not the predetermined condition is satisfied.
  • the first control module 32, the error calculation module 34, and the second control module 36 included in the control module 30 of FIG. 1 execute processing in synchronization with a predetermined cycle.
  • FIG. 19 is a diagram showing a part of variations of the configuration example of the robot control system 1 according to the present embodiment.
  • FIG. 19 shows the five basic functions of the robot control system 1 by using the symbols “LD”, “RC”, “RA”, “MC”, and “SA”.
  • LD includes a function of executing a control operation related to the assembly process as shown in FIG. 16 above.
  • the control device 100 is in charge of the "LD". More specifically, the IEC program 1104 executed by the processor 102 of the control device 100 includes instructions necessary for realizing "LD".
  • “LD” includes a function corresponding to the error calculation module 34 of FIG.
  • RC includes a function to execute a control operation related to the operation of the robot 200.
  • the control calculation related to the operation of the robot 200 includes the calculation of the trajectory of the robot 200, the calculation of the target angle of each joint included in the robot 200 at each time, and the like.
  • the "RC” may be realized by the application program 1106 stored in the control device 100 and the system program 272 stored in the robot controller 250.
  • “RA” includes functions related to the interface with the robot 200. Specifically, “RA” includes a function of converting the calculation results of the “RC” function into the values (voltages, etc.) required for the actual operation of the robot 200 and outputting them, and a function of outputting the data (pulse values, etc.) obtained from the robot 200 to the “RC” function. “RA” may be realized by the system program 272 stored in the robot controller 250.
  • the “RC” and “RA” include functions corresponding to the first control module 32 in FIG.
  • the “MC” includes a function of executing a control operation related to the operation of the moving mechanism 300.
  • the control calculation related to the operation of the moving mechanism 300 includes the calculation of the trajectory of the moving mechanism 300, the calculation of the target angle or the target speed of each axis included in the moving mechanism 300 at each time, and the like.
  • the "MC” may be realized by the application program 1106 stored in the control device 100 and the system program 372 stored in the servo controller 350.
  • “SA” includes functions related to the interface with the moving mechanism 300. Specifically, “SA” includes a function of converting the calculation results of the “MC” function into the values (voltages, etc.) required for the operation of the actual moving mechanism 300 and outputting them, and a function of outputting the data (pulse values, etc.) obtained from the moving mechanism 300 to the “MC” function. “SA” may be realized by the system program 372 stored in the servo controller 350.
  • FIG. 19 shows 16 types of configuration examples as an example.
  • the configuration example number “1” is an implementation example corresponding to the robot control system 1 described above: the control device 100 is in charge of the positioning control, the control device 100 and the robot controller 250 are in charge of controlling the robot 200, and the control device 100 and the servo controller 350 are in charge of controlling the moving mechanism 300.
  • the same function may be shared by a plurality of devices.
  • the configuration example number “2” means a configuration example in which the control device 100 and the robot controller 250 are integrated, and for example, the robot controller 250 may be incorporated into the control device 100 and mounted.
  • alternatively, the control device 100, the robot controller 250, and the servo controller 350 may be integrated into a single configuration.
  • the mounting example shown in FIG. 19 is only an example; the system may be mounted using, for example, a plurality of control devices. Further, the image processing device 450 may be configured independently of, or integrated with, the control device 100.
  • the robot control system 1 may adopt any mounting form as long as the required functions can be realized by any method.
  • [Configuration 1] A robot control system (1) comprising: a robot (200) that causes a first work (50) gripped by a hand (210) to approach a second work (60); a moving mechanism (300) that is mechanically connected to the robot and the hand and generates a displacement between the robot and the hand; a first control unit (32; 100) that gives a control command to the robot so that the first work approaches the second work; an error calculation unit (34; 100) that calculates a positioning error that occurs when the robot causes the first work to approach the second work; and a second control unit (36; 100) that gives a control command to the moving mechanism so as to compensate for the positioning error calculated by the error calculation unit.
  • [Configuration 2] The robot control system according to configuration 1, wherein the error calculation unit calculates the deviation of the first work from its target position as the positioning error.
  • [Configuration 3] The robot control system according to configuration 1 or 2, wherein the error calculation unit calculates the positioning error caused by vibration generated in the robot based on state values of movable portions of the robot.
  • an image processing device for optically detecting the positions of the first work and the second work is provided.
  • the error calculation unit calculates a temporal change in the position of the first work as the positioning error.
  • The robot control system according to any one of configurations 1 to 6, wherein the second control unit validates the control command for compensating for the positioning error when the calculated magnitude of the positioning error satisfies a predetermined condition.
  • the robot control system according to any one of configurations 1 to 7, wherein the first control unit, the error calculation unit, and the second control unit execute processing in synchronization with a predetermined cycle.
  • A control program for a robot control system (1) that includes a robot (200) that causes a first work (50) gripped by a hand (210) to approach a second work (60), and a moving mechanism (300) that is mechanically connected to the robot and the hand and generates a displacement between the robot and the hand, the control program causing a computer to execute: a step (S26) of giving a control command to the robot so that the first work approaches the second work; steps (S14, S20) of calculating a positioning error that occurs when the robot causes the first work to approach the second work; and steps (S24, S26) of giving a control command to the moving mechanism so as to compensate for the calculated positioning error.
  • A control method executed by a robot control system (1) that includes a robot (200) that causes a first work (50) gripped by a hand (210) to approach a second work (60), and a moving mechanism (300) that is mechanically connected to the robot and the hand and generates a displacement between the robot and the hand, the control method comprising: a step (S26) of giving a control command to the robot so that the first work approaches the second work; steps (S14, S20) of calculating a positioning error that occurs when the robot causes the first work to approach the second work; and steps (S24, S26) of giving a control command to the moving mechanism so as to compensate for the calculated positioning error.
  • positioning errors (caused by vibration and / or caused by deflection) that occur when the robot 200 that conveys the first work 50 is moved at high speed are sequentially calculated.
  • a control command is given to generate an appropriate displacement between the hand 210 holding the first work 50 and the robot 200 so as to compensate for the calculated positioning error.
  • more accurate positioning control can be realized in an application for assembling parts.
  • 1 robot control system 10 field network, 20 upper network, 30 control module, 32 1st control module, 34 error calculation module, 36 2nd control module, 50 1st work, 52, 64 electronic parts, 54 pins, 60th 2 work, 62 board, 66 hole, 68 marker, 70 orbit, 80 work table, 100 controller, 102,262,362,452 processor, 104,264,364,454 main memory, 106,456 upper network controller, 108 , 252,352,458 field network controller, 110,270,370,460 storage, 112,462 memory card interface, 114,464 memory card, 116 local bus controller, 118,468 processor bus, 120,470 USB controller, 122 Local bus, 130 functional unit, 200 robot, 202 link, 204 joint, 210 hand, 250 robot controller, 260, 360 control processing circuit, 268 interface circuit, 272,372,1102,4602 system program, 300 mobile mechanism, 310 stage Plate, 312,314,316 movable shaft, 330 servo motor, 350 servo controller, 380


Abstract

This robot control system includes: a robot that causes a first work gripped by a hand to approach a second work; a moving mechanism that is mechanically connected to the robot and the hand and generates a displacement between the robot and the hand; a first control unit that gives a control command to the robot so as to cause the first work to approach the second work; an error calculation unit that calculates a positioning error that occurs when the robot causes the first work to approach the second work; and a second control unit that gives a control command to the moving mechanism so as to compensate for the positioning error calculated by the error calculation unit.
PCT/JP2021/008669 2020-06-25 2021-03-05 Système de commande de robot, programme de commande et procédé de commande Ceased WO2021261023A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020109513A JP2022006927A (ja) 2020-06-25 2020-06-25 ロボット制御システム、制御プログラムおよび制御方法
JP2020-109513 2020-06-25

Publications (1)

Publication Number Publication Date
WO2021261023A1 true WO2021261023A1 (fr) 2021-12-30

Family

ID=79282235

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008669 Ceased WO2021261023A1 (fr) 2020-06-25 2021-03-05 Système de commande de robot, programme de commande et procédé de commande

Country Status (2)

Country Link
JP (1) JP2022006927A (fr)
WO (1) WO2021261023A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024176719A1 (fr) * 2023-02-22 2024-08-29 パナソニックIpマネジメント株式会社 Système d'entraînement

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6430472B1 (en) * 1999-12-20 2002-08-06 Servo-Robot Inc. Robot feature tracking devices and methods
WO2020017426A1 (fr) * 2018-07-20 2020-01-23 オムロン株式会社 Système de commande, procédé de commande de système de commande, et programme de système de commande

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6430472B1 (en) * 1999-12-20 2002-08-06 Servo-Robot Inc. Robot feature tracking devices and methods
WO2020017426A1 (fr) * 2018-07-20 2020-01-23 オムロン株式会社 Système de commande, procédé de commande de système de commande, et programme de système de commande

Also Published As

Publication number Publication date
JP2022006927A (ja) 2022-01-13


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21829264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21829264

Country of ref document: EP

Kind code of ref document: A1