
WO2025104846A1 - Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device - Google Patents

Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device

Info

Publication number
WO2025104846A1
Authority
WO
WIPO (PCT)
Prior art keywords
teaching device
data
processor
teaching
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/041121
Other languages
English (en)
Japanese (ja)
Inventor
開 下妻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Priority to PCT/JP2023/041121 priority Critical patent/WO2025104846A1/fr
Priority to TW113139076A priority patent/TW202521315A/zh
Publication of WO2025104846A1 publication Critical patent/WO2025104846A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Definitions

  • This disclosure relates to a security device, a control device, a teaching device, a method, and a computer program for preventing misuse of a teaching device.
  • Patent Document 1 discloses a technique for ensuring the safety of robotic work.
  • in one aspect of the present disclosure, a security device for preventing misuse of a teaching device that teaches a robot how to operate includes a data acquisition unit that acquires usage status data representing the usage status of the teaching device, a usage determination unit that determines whether the teaching device is in use or not based on the usage status data acquired by the data acquisition unit, and a logout execution unit that, when the usage determination unit determines that the teaching device is not in use, transitions the operation stage of the teaching device to a logout stage that prohibits operation of the robot through the teaching device and requests authentication from the user.
  • in another aspect of the present disclosure, a method for preventing misuse of a teaching device that teaches a robot how to operate involves a processor acquiring usage status data that indicates the usage status of the teaching device, determining whether the teaching device is in use based on the acquired usage status data, and, if the processor determines that the teaching device is not in use, transitioning the operation stage of the teaching device to a logout stage that prohibits operation of the robot through the teaching device and requests authentication from the user.
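  • Taken together, the aspects above amount to a watchdog loop: poll the usage status data, decide whether the device is in use, and force a logout that blocks robot operation until the user authenticates again. The following minimal Python sketch illustrates only that loop; the names (Stage, acquire_usage_data, is_in_use, authenticate) and the 3-second poll period are illustrative assumptions, not details of the application.

```python
import time
from enum import Enum, auto

class Stage(Enum):
    LOGOUT = auto()  # logout stage OP1: robot operation prohibited, authentication required
    LOGIN = auto()   # login stage OP2: robot operation through the teaching device permitted

def security_loop(acquire_usage_data, is_in_use, authenticate, poll_period_s=3.0):
    """Watchdog sketch of the logout-on-non-use behavior described above."""
    stage = Stage.LOGOUT
    while True:
        if stage is Stage.LOGOUT:
            # Request authentication; robot operation stays blocked until it succeeds.
            if authenticate():
                stage = Stage.LOGIN
        else:
            # Periodically check the usage status data D while logged in.
            if not is_in_use(acquire_usage_data()):
                stage = Stage.LOGOUT  # forced logout on detected non-use
        time.sleep(poll_period_s)
```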
  • FIG. 1 is a schematic diagram of a robot system according to an embodiment.
  • FIG. 2 is a block diagram of the robot system according to the embodiment.
  • FIG. 3 is a front view of the teaching device shown in FIG. 1.
  • FIG. 4 is a rear view of the teaching device shown in FIG. 3.
  • FIG. 5 is a flowchart illustrating a security function according to an embodiment.
  • FIG. 6 is a flowchart showing an example of the flow of step S1 in FIG. 5.
  • FIG. 7 is a flowchart showing an example of the flow of step S2 in FIG. 5.
  • FIG. 8 is a block diagram of a robot system according to another embodiment.
  • FIG. 9 is a front view of the teaching device shown in FIG. 8.
  • FIG. 10 is a flowchart showing another example of the flow of step S2 in FIG. 5.
  • FIG. 11 is a block diagram of a robot system according to yet another embodiment.
  • FIG. 12 is a flowchart showing yet another example of the flow of step S2 in FIG. 5.
  • FIG. 13 is a flowchart showing the process of step S43 in FIG. 12.
  • FIG. 14 is a flowchart showing the process of step S44 in FIG. 12.
  • FIG. 15 is a block diagram showing other functions of the robot system shown in FIG. 11.
  • FIG. 16 is an example of image data for setting a judgment criterion.
  • FIG. 17 is another example of image data for setting a judgment criterion.
  • FIG. 18 is an example of image data for selecting whether a security function is enabled or disabled.
  • FIG. 19 is a block diagram showing further functions of the robot system shown in FIG. 11.
  • the robot system 10 includes a robot 12, a teaching device 14, and a control device 16.
  • the robot 12 is, for example, a vertical articulated robot, and has an end effector 12A that performs a predetermined task on a workpiece (workpiece handling, welding, laser processing, cutting, etc.), and a movement mechanism 12B that moves the end effector 12A.
  • the teaching device 14 operates the robot 12 via the control device 16, teaches the robot 12 the movements required for a task, and generates an operation program for the task. More specifically, as shown in FIG. 2, the teaching device 14 is a computer having a processor 20, a memory 22, an I/O interface 24, a display device 26, an input device 28, a vibration sensor 30, an attitude sensor 32, and a biosensor 34.
  • the teaching device 14 may be any type of computer, such as a teaching pendant or a notebook or tablet PC.
  • the processor 20 has a CPU or GPU, etc., and is communicatively connected to the memory 22, I/O interface 24, display device 26, input device 28, vibration sensor 30, attitude sensor 32, and biosensor 34 via a bus 36.
  • the memory 22 has a RAM or ROM, etc., and temporarily or permanently stores various data.
  • the memory 22 may be a non-transitory computer-readable recording medium, such as a semiconductor memory, a magnetic recording medium, or an optical recording medium.
  • the I/O interface 24 has, for example, an Ethernet (registered trademark) port, a USB port, an optical fiber connector, or an HDMI (registered trademark) terminal, and communicates data with external devices via wired or wireless communication under instructions from the processor 20.
  • the display device 26 has a display such as an LCD or LED, and visibly displays various data under instructions from the processor 20.
  • the input device 28 has push buttons, switches, a touch panel, or the like, and accepts data input from an operator.
  • the vibration sensor 30 detects the vibration V of the teaching device 14.
  • the vibration sensor 30 has, for example, an acceleration sensor built into the teaching device 14, and detects the change over time in the acceleration a of the teaching device 14 as the vibration V of the teaching device 14.
  • the detection data Dv (time-series data of the acceleration a) of the vibration V detected by the vibration sensor 30 is stored in the memory 22.
  • the vibration sensor 30 may also detect the change over time in the jerk or speed of the teaching device 14 as the vibration V.
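  • As a concrete illustration of how this acceleration time series can feed the non-use decision made later in step S22 (no vibration detected over a period T1), here is a hedged Python sketch; the sample format, the threshold name a_th, and the full-window check are assumptions, not details of the application.

```python
def not_in_use_by_vibration(accel_samples, a_th, t1_s):
    """Criterion STv sketch: True when no acceleration sample exceeded a_th
    during the most recent t1_s seconds.

    accel_samples: list of (timestamp_s, acceleration) pairs, oldest first.
    """
    if not accel_samples:
        return False  # no detection data Dv yet; do not force a logout
    latest_t = accel_samples[-1][0]
    if latest_t - accel_samples[0][0] < t1_s:
        return False  # history does not yet cover the period T1
    window = [a for t, a in accel_samples if latest_t - t <= t1_s]
    return all(abs(a) <= a_th for a in window)
```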
  • the attitude sensor 32 detects the attitude O of the teaching device 14.
  • the attitude sensor 32 has a gyro sensor built into the teaching device 14, and detects the angle θx from the vertical downward direction in the positive direction of the x-axis of the two-dimensional orthogonal coordinate system C shown in Figures 3 and 4, and the angle θz from the vertical downward direction in the positive direction of the z-axis of the coordinate system C, as detection data Do of the attitude O of the teaching device 14.
  • the detection data Do (angles θx and θz) of the attitude O detected by the attitude sensor 32 is stored in the memory 22.
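  • The attitude check used later in step S23 can be sketched the same way. The illustrative Python below models the non-use attitude Ou as either axis angle lying within a band around the vertical downward direction, matching the ranges [-θxth, θxth] and [-θzth, θzth] described further on; the function name and the signed-angle convention are assumptions.

```python
def is_non_use_attitude(theta_x_deg, theta_z_deg, theta_x_th, theta_z_th):
    """Criterion STo sketch: True when the attitude O matches the non-use attitude Ou.

    theta_x_deg / theta_z_deg: angles of the device's x- and z-axes measured
    from the vertical downward direction. Ou is modeled here as either angle
    falling within its band, covering the display-facing-down and upside-down cases.
    """
    return abs(theta_x_deg) <= theta_x_th or abs(theta_z_deg) <= theta_z_th
```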
  • the biosensor 34 detects the user's biometric information B.
  • the biosensor 34 has a capacitive, optical, or ultrasonic fingerprint sensor 34A, and detects the user's fingerprint information as the biometric information B.
  • the fingerprint sensor 34A may be provided in a surface area 38 (FIG. 3) that the user's thumb comes into contact with when the user holds the teaching device 14.
  • Fingerprint sensor 34A may also be provided on an emergency stop button 37 provided on the front side of teaching device 14. This emergency stop button 37 is for emergency stopping the operation of robot 12. Fingerprint sensor 34A may also be provided on an enable switch 39 provided on the rear side (FIG. 4) of teaching device 14. This enable switch 39 can be pressed in three stages, and when in the second stage, it allows the operation of robot 12, while when in the first and third stages, it prohibits the operation of robot 12.
  • the biometric sensor 34 has a visible light or infrared camera 34B and detects an image of the user's face or iris as biometric information B.
  • the camera 34B may be provided on the front side of the teaching device 14, near the display device 26.
  • the biometric information B (fingerprint information, face or iris image) detected by the biometric sensor 34 (fingerprint sensor 34A, camera 34B) is stored in the memory 22.
  • the control device 16 controls the operation of the robot 12.
  • the control device 16 is a computer having a processor 40, memory 42, I/O interface 44, display device 46, and input device 48.
  • the configurations of the processor 40, memory 42, I/O interface 44, display device 46, and input device 48 are similar to those of the processor 20, memory 22, I/O interface 24, display device 26, and input device 28 described above, so duplicated explanations will be omitted.
  • the user operates the input device 28 of the teaching device 14 to operate the robot 12 and teach the robot 12 how to operate. At this time, the user may leave the teaching device 14 in an arbitrary location and interrupt the teaching operation. In such a case, an unauthorized third party may operate the teaching device 14 and cause the robot 12 to operate.
  • the robot system 10 executes a security function to prevent such misuse of the teaching device 14. The security function of the robot system 10 will be described below with reference to FIG. 5.
  • the processor 40 of the control device 16 executes the flow of FIG. 5.
  • the processor 40 starts the flow of FIG. 5 when the teaching device 14 is started.
  • when the flow of FIG. 5 starts, the operation stage OP of the teaching device 14 is the logout stage OP1.
  • the processor 20 of the teaching device 14 requests authentication from the user.
  • the processor 20 starts the biosensor 34 and starts detecting bioinformation B.
  • the processor 20 may generate image data ID1 of the logout stage OP1 that requests authentication from the user, and display it on the display device 26.
  • in step S11, the processor 40 determines whether or not user authentication has been completed.
  • the processor 40 executes biometric authentication to authenticate the user. Specifically, the processor 20 of the teaching device 14 acquires the user's biometric information B (fingerprint information, face or iris image) through the biometric sensor 34 (fingerprint sensor 34A, camera 34B) and supplies it to the control device 16.
  • templates Bt of biometric information B of multiple users who have the authority to operate the robot are registered in a user database and stored in the memory 42 of the control device 16.
  • the processor 40 of the control device 16 performs biometric authentication by matching the acquired biometric information B with the templates Bt.
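  • A hedged sketch of this matching step follows. A real system would use a dedicated fingerprint or face matcher; the match_score function, its [0, 1] score scale, and the 0.9 acceptance threshold are illustrative assumptions.

```python
def authenticate_user(acquired_b, templates_bt, match_score, threshold=0.9):
    """Return the matching user ID, or None if biometric authentication fails.

    templates_bt: dict mapping user ID -> registered template Bt.
    match_score: assumed similarity function returning a value in [0, 1].
    """
    best_user, best_score = None, 0.0
    for user_id, template in templates_bt.items():
        score = match_score(acquired_b, template)
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user if best_score >= threshold else None
```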
  • the processor 40 functions as an authentication execution unit 62 (FIG. 2) that performs biometric authentication of the user in the logout stage OP1.
  • in step S12, the processor 40 determines whether or not the operation of the teaching device 14 has ended. If the processor 40 determines YES, it ends the flow of FIG. 6, thereby ending the flow of FIG. 5. On the other hand, if the processor 40 determines NO, it returns to step S11.
  • in step S13, the processor 40 transitions the operation stage OP of the teaching device 14 to the login stage OP2 (so-called login). Specifically, the processor 40 transmits a login command CM1 to the teaching device 14, and in accordance with the login command CM1, the processor 20 of the teaching device 14 transitions to the login stage OP2.
  • the processor 40 of the control device 16 functions as a login execution unit 64 (FIG. 2) that transitions the operation stage OP of the teaching device 14 from the logout stage OP1 to the login stage OP2 when biometric authentication is completed.
  • the processor 20 of the teaching device 14 may stop the detection of the biometric information B by the biometric sensor 34 when the device transitions to the login stage OP2.
  • the processor 20 may also generate image data ID2 for the login stage OP2 (i.e., a graphical user interface (GUI) for the teaching operation) and display the image data on the display device 26.
  • after step S13, the processor 40 of the control device 16 executes the flow of the login stage OP2 in step S2.
  • This step S2 will be described with reference to FIG. 7.
  • in step S21, the processor 40 starts the operation of acquiring the usage status data D.
  • the processor 40 acquires the detection data Dv of the vibration sensor 30 and the detection data Do of the attitude sensor 32 as the usage status data D.
  • while the user is using the teaching device 14 while holding it in his or her hand, a vibration V occurs in the teaching device 14.
  • on the other hand, if the attitude O of the teaching device 14 is such that the display device 26 faces vertically downward or the teaching device 14 is upside down, it is highly likely that the user has left the teaching device 14 without using it. In this way, the vibration V and attitude O of the teaching device 14 are closely related to the usage state of the teaching device 14.
  • in step S25, the processor 40 determines that the teaching device 14 is not in use.
  • the processor 40 may set the above-mentioned flag FL to "invalid.”
  • the processor 40 executes steps S22 and S23 based on the usage status data D (detection data Dv and Do) acquired after the start of step S21, and determines whether the teaching device 14 is in use or not. Therefore, the processor 40 functions as a usage determination unit 68 (FIG. 2) that determines whether the teaching device 14 is in use or not based on the usage status data D.
  • in step S26, the processor 40 determines whether the operation of the teaching device 14 has ended (specifically, whether the power has been turned off). If the processor 40 determines YES, it ends the flow in FIG. 7, thereby ending the flow in FIG. 5, whereas if it determines NO, it returns to step S22. Thus, while the processor 40 determines NO in step S22, S23, or S26, it repeats the loop of steps S22 to S24 and S26. The processor 40 may repeatedly execute this loop at a predetermined period τ (e.g., 3 seconds).
  • in step S27, the processor 40 transitions the operation stage OP of the teaching device 14 from the login stage OP2 to the logout stage OP1 (so-called logout). Specifically, the processor 40 transmits a logout command CM3 to the teaching device 14. In accordance with the logout command CM3, the processor 20 of the teaching device 14 transitions to the logout stage OP1, thereby prohibiting any operation of the robot 12 through the input device 28 of the teaching device 14.
  • the processor 20 also displays the image data ID1 of the logout stage OP1 on the display device 26, activates the biometric sensor 34 to start detecting biometric information B, and requests authentication (biometric authentication) from the user. Meanwhile, the processor 20 stops detection of the usage status data D (detection data Dv and Do) by the vibration sensor 30 and the attitude sensor 32.
  • thus, when the processor 40 of the control device 16 determines in step S25 that the teaching device 14 is not in use, it functions as a logout execution unit 70 (FIG. 2) that transitions the operation stage OP of the teaching device 14 to the logout stage OP1. After step S27, the processor 40 proceeds to step S1.
  • the processor 40 may transmit a non-use signal SG2 indicating that the teaching device 14 is not in use.
  • the processor 40 may display, in the image data ID1 of the logout stage OP1 described above, an indication that the device has automatically transitioned to the logout stage OP1 due to non-use.
  • the processor 40 functions as the authentication execution unit 62, login execution unit 64, data acquisition unit 66, usage determination unit 68, and logout execution unit 70 to execute the security function for preventing misuse of the teaching device 14. Therefore, the authentication execution unit 62, login execution unit 64, data acquisition unit 66, usage determination unit 68, and logout execution unit 70 constitute a security device 60 (FIG. 2) for preventing misuse of the teaching device 14.
  • the data acquisition unit 66 acquires the usage status data D (detection data Dv, Do) that indicates the usage status of the teaching device 14 (step S21).
  • the usage determination unit 68 also determines whether the teaching device 14 is in use based on the usage status data D acquired by the data acquisition unit 66 (steps S22 and S23).
  • when the usage determination unit 68 determines that the teaching device 14 is not in use, the logout execution unit 70 transitions the operation stage OP of the teaching device 14 to the logout stage OP1, in which operation of the robot 12 through the teaching device 14 is prohibited and authentication is requested from the user (step S27).
  • This configuration reliably prevents an unauthorized third party from misusing the teaching device 14 while it is left unattended by the user.
  • the data acquisition unit 66 acquires the detection data Dv of the vibration sensor 30 that detects the vibration V of the teaching device 14 as the usage status data D, and the usage determination unit 68 determines that the teaching device 14 is not in use (step S25) if the vibration V (in this embodiment, the acceleration a) is not detected for a predetermined period T1 (if YES is determined in step S22).
  • the usage determination unit 68 thus determines whether the teaching device 14 is being used or not according to a determination criterion STv related to the vibration V, namely whether or not the vibration V (acceleration a) has been detected over the period T1. With this configuration, it is possible to determine with high accuracy whether the teaching device 14 is being used, based on the vibration V, which is closely related to the usage state of the teaching device 14.
  • the data acquisition unit 66 acquires detection data Do from the attitude sensor 32 that detects the attitude O of the teaching device 14 as the usage status data D, and the usage determination unit 68 determines that the teaching device 14 is not in use (step S25) if the attitude O is a predetermined attitude Ou that indicates that the teaching device 14 is not in use (if YES is determined in step S23).
  • the usage determination unit 68 determines whether the teaching device 14 is being used or not according to the determination criterion STo for the attitude O, namely whether or not the attitude O is the attitude Ou.
  • the above-mentioned predetermined attitude Ou is an attitude in which the display device 26 provided on the teaching device 14 faces vertically downward, or an attitude in which the teaching device 14 is upside down. With this configuration, it is possible to more reliably determine whether the teaching device 14 is not being used.
  • the authentication execution unit 62 executes biometric authentication to authenticate the user in the logout stage OP1 (step S11). Then, when the biometric authentication by the authentication execution unit 62 is completed (when the result of step S11 is YES), the login execution unit 64 transitions the operation stage OP from the logout stage OP1 to the login stage OP2, which permits operation of the robot 12 through the teaching device 14 (step S13).
  • when the processor 40 judges NO in step S22, it executes step S23 to judge whether the teaching device 14 is in use or not.
  • various industrial machines such as the robot 12, belt conveyors, and heavy machinery are in operation at the work site, and this may cause minute vibrations to occur constantly there. In such a case, even if the teaching device 14 is left unused, a vibration V of the teaching device 14 will occur.
  • step S23 may be executed before step S22, and step S22 may be executed when step S23 is judged as NO.
  • alternatively, one of steps S22 and S23 may be omitted.
  • in this case, the processor 40 acquires only one of the detection data Dv of the vibration V and the detection data Do of the attitude O in step S21. That is, the vibration sensor 30 or the attitude sensor 32 can be omitted from the teaching device 14.
  • the threshold θxth or θzth of the range [-θxth, θxth] or [-θzth, +θzth] that defines the attitude Ou indicating non-use may be set to any value.
  • the processor 40 may request input of a password through the image data ID1, and authenticate the user using the input password. That is, in this case, the biosensor 34 may be omitted from the teaching device 14. Also, the authentication execution unit 62 and the login execution unit 64 may be omitted from the security device 60. In this case, the functions of the authentication execution unit 62 and the login execution unit 64 may be implemented in an external device (such as a PC). For example, the biosensor 34 may be provided in the external device, and the external device may take on the functions of the authentication execution unit 62 and the login execution unit 64.
  • the teaching device 82 differs from the teaching device 14 described above in that it further includes a tactile sensor 84 and an optical sensor 86.
  • the tactile sensor 84 and the optical sensor 86 are communicatively connected to the processor 20 via the bus 36.
  • the tactile sensor 84 detects the tactile sensation H of the teaching device 82.
  • the tactile sensor 84 has, for example, a capacitive or piezoelectric force sensor, and is provided in a portion where the user's hand touches when the user holds the teaching device 82.
  • the tactile sensor 84 is provided on the side surface 88 of the teaching device 82 as shown in FIG. 9.
  • the tactile sensor 84 may be provided on the enable switch 39 (FIG. 4) on the rear side of the teaching device 82.
  • the tactile sensor 84 detects the tactile sensation H when the user touches the teaching device 82. Detection data Dh of the tactile sensation H detected by the tactile sensor 84 is stored in the memory 22.
  • the optical sensor 86 detects the brightness B (luminous flux, luminous intensity, luminance, illuminance, etc.) around the teaching device 82.
  • the optical sensor 86 has, for example, a photoelectric conversion element (photodiode, phototransistor, etc.) and is provided on the front side of the teaching device 82, near the screen of the display device 26.
  • the detection data Db of the brightness B detected by the optical sensor 86 is stored in the memory 22.
  • the processor 40 of the control device 16 executes the flow of FIG. 10 as step S2 in FIG. 5. Note that in the flow of FIG. 10, the same step numbers are used for processes that are similar to those in the flow of FIG. 7, and duplicated explanations will be omitted.
  • in step S31, the processor 40 functions as the data acquisition unit 66 and starts the operation of acquiring the usage status data D.
  • the processor 40 acquires, as the usage status data D, in addition to the above-mentioned detection data Dv and Do, detection data Dh of the tactile sense H detected by the tactile sensor 84 and detection data Db of the brightness B detected by the optical sensor 86.
  • while the user is using the teaching device 82, the user's hand touches the teaching device 82, generating a tactile sensation H.
  • on the other hand, while the teaching device 82 is left unused, no tactile sensation H is generated in the teaching device 82.
  • the tactile sensation H and the surrounding brightness B of the teaching device 82 are closely related to the usage state of the teaching device 82. Therefore, the detection data Dh of the tactile sensation H detected by the tactile sensor 84 and the detection data Db of the brightness B detected by the optical sensor 86 constitute usage state data D that represents the usage state of the teaching device 82.
  • the processor 40 of the control device 16 transmits a data acquisition command CM2 to the teaching device 82, and in response to the data acquisition command CM2, the processor 20 of the teaching device 82 activates the vibration sensor 30, the attitude sensor 32, the tactile sensor 84, and the optical sensor 86.
  • the processor 20 then periodically acquires the detection data Dv, Do, Dh, and Db, and sequentially supplies them to the control device 16.
  • the processor 40 of the control device 16 periodically acquires the detection data Dv, Do, Dh, and Db as usage status data D, and stores them in the memory 42. After step S31, the processor 40 sequentially executes the above-mentioned steps S22 and S23.
  • in step S32, the processor 40 functions as the usage determination unit 68 and determines whether or not a tactile sensation H on the teaching device 82 has gone undetected for a predetermined period T3. Specifically, the processor 40 refers to the detection data Dh acquired after the start of step S31, and determines YES if the tactile sensation H indicated in the detection data Dh has not been detected for the predetermined period T3. If the result of step S32 is YES, the processor 40 proceeds to step S25, whereas if the result is NO, the processor 40 proceeds to step S33.
  • the data acquisition unit 66 acquires the detection data Dh of the tactile sensor 84 that detects the tactile sensation H on the teaching device 82 as the usage status data D, and the usage determination unit 68 determines that the teaching device 82 is not in use (step S25) if the tactile sensation H is not detected for a predetermined period T3 (if YES is determined in step S32).
  • the usage determination unit 68 thus determines whether the teaching device 82 is being used or not according to a determination criterion STh for the tactile sensation H, namely whether the tactile sensation H has gone undetected over the period T3. With this configuration, it is possible to determine with even greater accuracy whether the teaching device 82 is being used, based on the tactile sensation H, which is closely related to the usage state of the teaching device 82.
  • the data acquisition unit 66 acquires, as the usage status data D, detection data Db of the optical sensor 86 that detects the brightness B of the surroundings of the teaching device 82, and the usage determination unit 68 determines that the teaching device 82 is not in use when the brightness B has been equal to or less than a predetermined threshold value Bth for a predetermined period T4 (when YES is determined in step S33).
  • the usage determination unit 68 determines whether the teaching device 82 is being used or not in accordance with a determination criterion STb regarding the brightness B, namely whether the surrounding brightness B has been equal to or lower than the threshold Bth over the period T4. According to this configuration, it is possible to determine with even higher accuracy whether the teaching device 82 is being used or not, in accordance with the surrounding brightness B, which is closely related to the usage state of the teaching device 82.
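  • For illustration, the two additional criteria (step S32: no tactile sensation H over the period T3; step S33: brightness B at or below the threshold Bth throughout the period T4) might be sketched as follows; the data formats and names are assumptions.

```python
def not_in_use_by_touch(touch_timestamps_s, now_s, t3_s):
    """Criterion STh sketch: True when no touch has been detected for t3_s seconds."""
    last_touch = max(touch_timestamps_s, default=None)
    return last_touch is None or now_s - last_touch >= t3_s

def not_in_use_by_brightness(brightness_samples, b_th, t4_s):
    """Criterion STb sketch: True when brightness stayed at or below b_th
    throughout the last t4_s seconds.

    brightness_samples: list of (timestamp_s, brightness) pairs, oldest first.
    """
    if not brightness_samples:
        return False
    latest_t = brightness_samples[-1][0]
    if latest_t - brightness_samples[0][0] < t4_s:
        return False  # history does not yet cover the period T4
    window = [b for t, b in brightness_samples if latest_t - t <= t4_s]
    return all(b <= b_th for b in window)
```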
  • steps S22, S23, S32, and S33 may be executed in any order.
  • the processor 40 may execute steps S32 → S22 → S23 → S33 in that order.
  • at least one of steps S22, S23, S32, and S33 may be omitted from the flow of FIG. 10.
  • steps S22, S23, and S33 may be omitted from the flow of FIG. 10, and the processor 40 may execute only step S32 after step S31.
  • in this case, the processor 40 acquires, in step S31, only the detection data Dh of the tactile sensor 84 as the usage status data D. That is, the vibration sensor 30, the attitude sensor 32, and the optical sensor 86 can be omitted from the teaching device 82.
  • the robot system 90 differs from the above-described robot system 80 in that it further includes a force sensor 92.
  • the force sensor 92 detects an external force F applied to the robot 12.
  • the force sensor 92 has a six-axis force sensor provided at one position of the robot 12.
  • the force sensor 92 has a torque sensor provided on a servo motor (not shown) that drives each joint of the robot 12.
  • the force sensor 92 supplies detection data Df of the detected external force F to the control device 16.
  • the processor 40 of the control device 16 operates the robot 12 in a plurality of operation modes DM.
  • the operation modes DM include, for example, a jog teach mode DM1 and a direct teach mode DM2.
  • the jog teach mode DM1 is an operation mode DM that jogs the robot 12 in response to an input operation to the input device 28 of the teaching device 82.
  • the user performs an input operation on the input device 28 of the teaching device 82, and the processor 40 of the control device 16 jogs the robot 12 in response to the input to the input device 28.
  • direct teach mode DM2 is an operation mode DM in which the robot 12 moves in the direction of an external force F according to the external force F applied to the robot 12.
  • the user applies an external force F to the robot 12 in any direction.
  • the processor 40 of the control device 16 identifies the direction of the applied external force F based on the detection data Df of the force sensor 92, and moves the robot 12 in the direction of the external force F.
  • the processor 40 of the control device 16 executes the flow of FIG. 12 as step S2 in FIG. 5.
  • in step S41, the processor 40 functions as the data acquisition unit 66 and starts the operation of acquiring the usage status data D.
  • the processor 40 acquires the operation mode data Dd1 or Dd2 as the usage status data D in addition to the above-mentioned detection data Dv, Do, Dh, and Db.
  • in step S42, the processor 40 determines whether or not the first operation mode DM1 has been selected. Specifically, if the processor 40 has acquired the operation mode data Dd1 indicating the jog teach mode DM1 in step S41, the result is YES, and the process proceeds to step S43. On the other hand, if the processor 40 has acquired the operation mode data Dd2 indicating the direct teach mode DM2, the result is NO, and the process proceeds to step S44.
  • in step S43, the processor 40 executes a first judgment process.
  • the flow of this step S43 is shown in Figure 13. Note that in the flow shown in Figure 13, processes similar to those in the flow of Figure 10 are given the same step numbers, and duplicated explanations will be omitted.
  • the processor 40 sequentially executes the above-mentioned steps S32, S24 to S26, and if it judges YES in step S26, it ends the flows of Figures 13 and 12, and thus ends the flow of Figure 5.
  • on the other hand, while the processor 40 determines NO in step S32 or S26, it repeats the loop of steps S32, S24, and S26.
  • the processor 40 thereby functions as the usage determination unit 68 and determines whether the teaching device 82 is in use or not according to the determination criterion STh for the tactile sensation H (whether or not the tactile sensation H has gone undetected for the period T3).
  • in step S44, the processor 40 executes a second judgment process.
  • the flow of this step S44 is shown in FIG. 14. Note that in the flow shown in FIG. 14, the same step numbers are used for processes similar to those in the flow in FIG. 10, and duplicated explanations will be omitted.
  • the processor 40 sequentially executes the above-mentioned steps S22, S23, S33, S24 to S27, and if the result of the determination in step S26 is YES, the flows of FIG. 14 and FIG. 12 are terminated, and the flow of FIG. 5 is thereby terminated.
  • in step S44, the processor 40 functions as the usage determination unit 68 and determines whether the teaching device 82 is in use or not in accordance with the determination criterion STv for the vibration V (whether or not the vibration V has been detected over the period T1), the determination criterion STo for the attitude O (whether or not the attitude O has become the attitude Ou), and the determination criterion STb for the surrounding brightness B (whether or not the surrounding brightness B has remained at or below the threshold Bth over the period T4).
  • the data acquisition unit 66 acquires the operation mode data Dd1 or Dd2 indicating the operation mode DM of the robot 12 as the usage status data D. Then, when the data acquisition unit 66 acquires the operation mode data Dd1 indicating the first operation mode (specifically, the jog teach mode) DM1, the usage determination unit 68 determines whether or not the teaching device 82 is in use according to the first judgment criterion STh (step S32 in FIG. 13).
  • on the other hand, when the data acquisition unit 66 acquires the operation mode data Dd2 indicating the second operation mode (specifically, the direct teach mode) DM2, the usage determination unit 68 determines whether the teaching device 82 is in use or not according to the second judgment criteria STv, STo, and STb, which are different from the first judgment criterion STh (steps S22, S23, and S33 in FIG. 14).
  • when executing the jog teach mode DM1, the user performs the teaching work by holding and operating the teaching device 82 in his or her hand, so it is possible to determine with high accuracy whether the teaching device 82 is being used or not simply by monitoring the tactile sensation H on the teaching device 82.
  • in the direct teach mode DM2, the user directly operates the robot 12 while holding the teaching device 82 or placing it in a specified location. In this direct teach mode DM2, it is possible to determine with high accuracy whether the teaching device 82 is being used or not by monitoring the vibration V, attitude O, and surrounding brightness B of the teaching device 82.
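  • An illustrative sketch of this mode-dependent dispatch (steps S42 to S44) follows, reusing the criterion functions sketched earlier; the enum and the dictionary of zero-argument checks are assumed structure, not the application's.

```python
from enum import Enum, auto

class TeachMode(Enum):
    JOG = auto()     # DM1: robot jogged from the teaching device's input device
    DIRECT = auto()  # DM2: robot moved by an external force F applied to it

def not_in_use(mode, checks):
    """Step S42 dispatch sketch: pick the criterion set by operation mode.

    checks: dict of zero-argument criterion functions, e.g. the sketches above
    wrapped with their current data ('touch', 'vibration', 'attitude', 'brightness').
    """
    if mode is TeachMode.JOG:
        # First judgment process (step S43): the tactile criterion STh alone.
        return checks["touch"]()
    # Second judgment process (step S44): criteria STv, STo, and STb.
    return checks["vibration"]() or checks["attitude"]() or checks["brightness"]()
```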
  • step S22, S23, or S33 in Figure 14 may be applied to step S43 in Figure 13, and step S22, S23, or S33 may be executed before or after step S32.
  • step S32 in Figure 13 may be applied to step S44 in Figure 14.
  • the judgment criteria in steps S43 and S44 may be arbitrarily set by the user.
  • the robot system 90 executes the function of setting the judgment criterion ST. As described above, minute vibrations may occur at the work site.
  • the processor 40 of the control device 16 automatically sets the judgment criterion STv for the vibration V in step S22 based on the detection data Dv of the vibration sensor 30.
  • the processor 20 of the teaching device 82 displays image data 200 shown in FIG. 16 on the display device 26 in response to an input operation by a user to the input device 28 for automatically setting the judgment criterion STv.
  • the image data 200 is a GUI for automatically setting the judgment criterion STv, and includes a detection start button image 202 and a judgment criterion result image 204.
  • the user places the teaching device 82 at any position in the work site where micro-vibrations are occurring, and operates the input device 28 to operate the detection start button image 202 on the image.
  • the processor 20 activates the vibration sensor 30 and detects detection data Dv (acceleration a) over a predetermined period T5 (e.g., 10 seconds).
  • the processor 20 supplies the acquired detection data Dv to the control device 16.
  • the processor 40 of the control device 16 functions as the data acquisition unit 66, acquires the detection data Dv from the teaching device 82, and automatically sets a threshold ath as the judgment criterion STv based on the detection data Dv.
  • the processor 40 obtains the average value (or effective value) of the acceleration a indicated in the detection data Dv detected during the period T5, and automatically sets the average value (or effective value) as the threshold ath.
  • the processor 40 then supplies data of the automatically set threshold ath to the teaching device 82, and the processor 20 of the teaching device 82 displays the acquired threshold ath on the judgment criterion result image 204 of the image data 200.
  • the processor 40 of the control device 16 can thus automatically set the judgment criterion STv (threshold ath). Therefore, the processor 40 functions as a judgment criterion setting unit 102 (FIG. 15) that automatically sets the judgment criterion STv based on the detection data Dv.
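  • A hedged sketch of this automatic setting follows; whether the average is taken over absolute values, and the choice between the average and the effective (RMS) value, are left open by the description and assumed here.

```python
import math

def auto_set_vibration_threshold(accel_samples, use_rms=False):
    """Sketch of the FIG. 16 flow: derive the criterion STv threshold from the
    ambient micro-vibration recorded over the period T5 (e.g. 10 seconds)
    with the teaching device left at rest at the work site.

    accel_samples: accelerations a from the detection data Dv.
    """
    if not accel_samples:
        raise ValueError("no detection data Dv acquired over the period T5")
    if use_rms:
        return math.sqrt(sum(a * a for a in accel_samples) / len(accel_samples))
    return sum(abs(a) for a in accel_samples) / len(accel_samples)
```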
  • the processor 40 of the control device 16 generates image data 210 for setting the judgment criteria ST in response to an input operation by the user to the input device 48 for setting the judgment criteria ST, and displays the image data 210 on the display device 46.
  • An example of this image data 210 is shown in FIG. 17.
  • the image data 210 shown in FIG. 17 is a GUI for setting the judgment criteria STv, STo, and STb for the judgments in steps S22, S23, and S33 described above.
  • the processor 40 functions as the image generation unit 104 (FIG. 15) that generates the image data 210.
  • the image data 210 includes a vibration input image 212, attitude input images 214 and 216, and a brightness input image 218.
  • the vibration input image 212 is a GUI for inputting the threshold ath as the criterion STv for the vibration V.
  • the attitude input image 214 is a GUI for inputting the threshold θxth that defines the above-mentioned range [-θxth, θxth] of the angle θx as the criterion STo for the attitude O.
  • the attitude input image 216 is a GUI for inputting the threshold θzth that defines the above-mentioned range [-θzth, θzth] of the angle θz as the criterion STo for the attitude O.
  • the processor 40 may generate, as a GUI for setting the determination criterion STo for the attitude O, image data ID3 of a three-dimensional virtual space in which a teaching device model 82M that models the teaching device 82 and the coordinate system C (FIG. 9) are arranged.
  • the user may then be able to set the range [-θxth, θxth] and the range [-θzth, θzth] based on the x-axis and z-axis of the coordinate system C displayed in the image data ID3.
  • the brightness input image 218 is a GUI for inputting the threshold Bth (luminous flux [lm] in the example of FIG. 17) as the judgment criterion STb of the brightness B.
  • the user operates the input device 48 to input numerical values into the vibration input image 212, the attitude input images 214 and 216, and the brightness input image 218.
  • the processor 40 accepts an input IP1 for setting the judgment criterion STv (threshold ath), the judgment criterion STo (thresholds θxth and θzth), and the judgment criterion STb (threshold Bth) through the vibration input image 212, the attitude input images 214 and 216, and the brightness input image 218.
  • the processor 40 thus functions as the input receiving unit 106 (FIG. 15) that receives the input IP1 for setting the judgment criteria STv, STo, and STb.
  • the processor 40 also functions as the judgment criterion setting unit 102 and, according to the input IP1, sets the threshold ath, the thresholds θxth and θzth, and the threshold Bth as the judgment criteria STv, STo, and STb regarding the usage status data D (detection data Dv, Do, Db) used for the judgments in the above-mentioned steps S22, S23, and S33.
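  • For illustration, the thresholds entered through the image data 210 could be collected and validated as a single record, as in the sketch below; the field names, the raw_fields layout, and the non-negativity check are assumptions.

```python
from dataclasses import dataclass

@dataclass
class JudgmentCriteria:
    """Thresholds entered through the image data 210 (input IP1); names assumed."""
    a_th: float        # criterion STv: vibration threshold
    theta_x_th: float  # criterion STo: half-width of the angle range for theta_x
    theta_z_th: float  # criterion STo: half-width of the angle range for theta_z
    b_th: float        # criterion STb: brightness threshold (e.g. luminous flux in lm)

def accept_input_ip1(raw_fields):
    """Validate the numeric GUI fields and build the judgment criteria ST.

    raw_fields: dict whose keys match the JudgmentCriteria field names.
    """
    values = {key: float(text) for key, text in raw_fields.items()}
    if any(v < 0 for v in values.values()):
        raise ValueError("thresholds must be non-negative")
    return JudgmentCriteria(**values)
```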
  • the processor 40 functions as the authentication execution unit 62, login execution unit 64, data acquisition unit 66, usage determination unit 68, logout execution unit 70, judgment criterion setting unit 102, image generation unit 104, and input receiving unit 106 to execute the security function for preventing misuse of the teaching device 82. Therefore, the authentication execution unit 62, login execution unit 64, data acquisition unit 66, usage determination unit 68, logout execution unit 70, judgment criterion setting unit 102, image generation unit 104, and input receiving unit 106 constitute a security device 100 (FIG. 15) for preventing misuse of the teaching device 82.
  • the judgment criterion setting unit 102 automatically sets, based on the detection data Dv of the vibration V, the judgment criterion STv regarding the vibration V used for the judgment by the usage determination unit 68 (step S22).
  • with this configuration, it is possible to set a judgment criterion STv that takes into account the actual micro-vibrations at a work site where such vibrations occur, allowing the judgment regarding the vibration V to be performed with higher accuracy.
  • the image generation unit 104 generates the image data 210 for setting the judgment criteria STv, STo, and STb related to the usage status data D (detection data Dv, Do, Db) for the judgments by the usage determination unit 68 (steps S22, S23, S33). Furthermore, the input receiving unit 106 accepts the input IP1 for setting the judgment criteria STv, STo, and STb through the image data 210. With this configuration, the user can easily set the desired judgment criteria STv, STo, and STb while visually checking the image data 210.
  • the processor 40 may function as the input receiving unit 106 and further receive an input IP2 for enabling or disabling a security function that executes a transition to the logout stage OP1.
  • the processor 40 may function as the image generating unit 104 and generate image data 220 for selecting whether to enable or disable the security function, and display the image data 220 on the display device 46.
  • An example of this image data 220 is shown in FIG. 18.
  • the image data 220 shown in FIG. 18 includes a setting button image 222.
  • the user can operate the input device 48 to operate the setting button image 222 on the image, thereby selecting whether to enable or disable the security function.
  • the processor 40 functions as the input receiving unit 106, and receives an input IP2 for enabling or disabling the security function via the setting button image 222.
  • when the processor 40 receives an input IP2 that enables the security function, it executes the flow of FIG. 5 after starting the teaching device 82. On the other hand, when the processor 40 receives an input IP2 that disables the security function, it does not execute the flow of FIG. 5 after starting the teaching device 82. In this case, operation of the robot 12 through the teaching device 82 is permitted regardless of the usage state. According to this embodiment, the user can freely select whether or not to execute the security function depending on the task.
  • the processor 40 may execute the security function described with reference to the flow in FIG. 5, the criterion setting function described with reference to FIGS. 16 and 17, and the security selection function described with reference to FIG. 18 in accordance with a computer program PG pre-stored in the memory 42.
  • the functions of the security device 60 or 100 (the authentication execution unit 62, login execution unit 64, data acquisition unit 66, usage determination unit 68, logout execution unit 70, judgment criterion setting unit 102, image generation unit 104, and input receiving unit 106) executed by the processor 40 may be functional modules realized by the computer program PG.
  • in the embodiments described above, the functions of the security device 60 or 100 are implemented in the control device 16, and the processor 40 of the control device 16 executes them.
  • the functions of the security devices 60 or 100 may be implemented in the teaching device 14 or 82.
  • the processor 20 of the teaching device 14 or 82 functions as the security device 60 or 100, and executes the flow of FIG. 5 according to the computer program PG stored in the memory 22.
  • the processor 20 also functions as the image generation unit 104, and generates the above-mentioned image data ID1, ID2, ID3, 200, 210 or 220, and displays it on the display device 26.
  • alternatively, some of the functions of the security device 60 or 100 may be implemented in the teaching device 14 or 82, and the other functions may be implemented in the control device 16.
  • such a configuration is shown in FIG. 19.
  • the functions of the authentication execution unit 62, data acquisition unit 66, usage determination unit 68, determination criterion setting unit 102, image generation unit 104, and input acceptance unit 106 of the security device 100 are implemented in the teaching device 82.
  • the functions of the login execution unit 64 and logout execution unit 70 of the security device 100 are implemented in the control device 16.
  • the processor 20 of the teaching device 82 and the processor 40 of the control device 16 communicate with each other and cooperate to execute the flow of FIG. 5.
  • the processor 20 of the teaching device 82 executes steps S11 and S12, and if step S11 is judged as YES, the processor 20 transmits an authentication completion signal SG3 to the control device 16.
  • the processor 40 of the control device 16 functions as the login execution unit 64 to execute step S13 and transition the operation stage OP of the teaching device 82 to the login stage OP2.
  • when the processor 20 of the teaching device 82 executes steps S21 to S26 and S31 to S33 in FIG. 7, FIG. 10, FIG. 13, or FIG. 14 and determines in step S25 that the device is not in use, it transmits a non-use signal SG2 to the control device 16.
  • the processor 40 of the control device 16 functions as the logout execution unit 70 to execute step S27 and transition the operation stage OP of the teaching device 82 to the logout stage OP1.
  • in the embodiments described above, the detection data Dv of the vibration V, the detection data Do of the attitude O, the detection data Dh of the tactile sensation H, and the detection data Db of the brightness B are acquired as the usage status data D.
  • however, any parameter that changes depending on the usage state of the teaching device 14 or 82, such as the temperature of the teaching device 14 or 82, may be acquired as the usage status data D.
  • the attitude Ou indicating that the teaching device 14 or 82 is not in use is not limited to an attitude in which the display device 26 faces vertically downward or an attitude in which the teaching device 14 or 82 is upside down, and may be any attitude determined by the user.
  • Aspect 1 A security device 60, 100 for preventing misuse of a teaching device 14, 82 that teaches a robot 12 how to operate, comprising: a data acquisition unit 66 that acquires usage status data D representing the usage status of the teaching device 14, 82; a usage determination unit 68 that determines whether the teaching device 14, 82 is in use or not based on the usage status data D acquired by the data acquisition unit 66; and a logout execution unit 70 that, when the usage determination unit 68 determines that the teaching device 14, 82 is not in use, transitions the operation stage OP of the teaching device 14, 82 to a logout stage OP1 that prohibits operation of the robot 12 through the teaching device 14, 82 and requests authentication from the user.
  • the data acquisition unit 66 acquires detection data Dv of the sensor 30 that detects the vibration V of the teaching device 14, 82 as the usage status data D, and the usage determination unit 68 determines that the teaching device 14, 82 is not in use if the vibration V is not detected for a predetermined period T1, in the security device 60, 100 described in aspect 1.
  • Aspect 7 The security device 60, 100 according to any one of Aspects 1 to 6, wherein the data acquisition unit 66 acquires, as the usage status data D, detection data Db of a sensor 86 that detects the brightness B of the surroundings of the teaching device 82, and the usage determination unit 68 determines that the teaching device 82 is not in use when the brightness B is equal to or less than a predetermined threshold value Bth for a predetermined period T4.
  • a security device 100 according to any one of aspects 1 to 8, further comprising an image generation unit 104 that generates image data 210 for setting a judgment criterion ST regarding usage status data D for judgment by the usage judgment unit 68, and an input acceptance unit 106 that accepts an input IP1 for setting the judgment criterion ST through the image data 210.
  • a security device 100 according to any one of aspects 1 to 10 further comprising an input receiving unit 106 that receives an input IP2 for enabling or disabling a security function that causes the logout execution unit 70 to execute a transition to the logout phase OP1.
  • Aspect 13 A method for preventing misuse of a teaching device 14, 82 that teaches a robot 12 how to operate, wherein a processor 20, 40 acquires usage status data D representing the usage status of the teaching device 14, 82, determines whether the teaching device 14, 82 is in use or not based on the acquired usage status data D, and, if it determines that the teaching device 14, 82 is not in use, transitions the operation stage OP of the teaching device 14, 82 to a logout stage OP1 that prohibits operation of the robot 12 through the teaching device 14, 82 and requests authentication from the user.
  • a computer program PG that causes a processor 20, 40 to execute the method described in aspect 13.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

A third party who is not authorized to operate a robot may misuse a teaching device of the robot. A technique for preventing such misuse of a teaching device has conventionally been required. A security device 60 for preventing misuse of a teaching device 14 comprises: a data acquisition unit 66 that acquires usage status data representing a usage status of the teaching device 14; a usage determination unit 68 that determines whether or not the teaching device 14 is in use on the basis of the usage status data acquired by the data acquisition unit 66; and a logout execution unit 70 that, when the usage determination unit 68 determines that the teaching device 14 is not in use, transitions the operation stage of the teaching device 14 to a logout stage in which operation of the robot 12 through the teaching device 14 is prohibited and authentication is requested of the user.
PCT/JP2023/041121 2023-11-15 2023-11-15 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device Pending WO2025104846A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2023/041121 WO2025104846A1 (fr) 2023-11-15 2023-11-15 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device
TW113139076A TW202521315A (zh) 2024-10-15 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/041121 WO2025104846A1 (fr) 2023-11-15 2023-11-15 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device

Publications (1)

Publication Number Publication Date
WO2025104846A1 true WO2025104846A1 (fr) 2025-05-22

Family

ID=95742722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/041121 Pending WO2025104846A1 (fr) 2023-11-15 2023-11-15 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device

Country Status (2)

Country Link
TW (1) TW202521315A (fr)
WO (1) WO2025104846A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007272608A (ja) * 2006-03-31 2007-10-18 Brother Ind Ltd 情報処理装置及びプログラム
JP2008080475A (ja) * 2006-08-29 2008-04-10 Daihen Corp ロボット制御システム
JP2021089575A (ja) * 2019-12-04 2021-06-10 株式会社デンソーウェーブ ロボット教示用認証システム

Also Published As

Publication number Publication date
TW202521315A (zh) 2025-06-01

Similar Documents

Publication Publication Date Title
US12296485B2 (en) Robot arm with adaptive three-dimensional boundary in free-drive
JP6973119B2 Robot control device and robot system
CN100393485C Industrial robot
CN103802117A Robot system
DK201901238A1 (en) Maintaining free-drive mode of robot arm for period of time
JP7734701B2 Battery level display on request
CN107803847A Human-machine cooperative robot
US20160301692A1 (en) Information processing apparatus
JP2016215303A Robot system, robot system control method, and monitoring console
CN111897263A Smart glasses control method and device, storage medium, and electronic device
CN108687748A Robot system with external force display function, processing device, and teaching operation panel
JP6058378B2 Robot control system
US20170140491A1 (en) Operation advance indication apparatus
CN105975082A Handle controller applied to a virtual reality head-mounted device
KR20200124915A Electronic device and method for performing biometric authentication and intelligent agent functions based on user input in the electronic device
JP2020019127A Cooperative operation support device
WO2025104846A1 Security device, control device, teaching device, method, and computer program for preventing misuse of a teaching device
TWI788607B Human-machine interaction system and human-machine interaction method
US11485023B2 (en) Robot controlling method using portable device including touchscreen
JP2024056066A Device for switching tools having a function of selecting an operation mode, teaching device, control device, robot system, and method
CN106200954B Virtual reality system and control method for virtual reality glasses
WO2023233545A1 Machine operation system using wired communication and wireless communication
CN114019818B Smart home device
JP2025142828A Robot operating device
JP7547843B2 Information processing device and program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23958852

Country of ref document: EP

Kind code of ref document: A1