WO2025197687A1 - Remote control system, remote control device, remote control method, and program
Info
- Publication number
- WO2025197687A1 (PCT/JP2025/009065)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- operator
- end effector
- unit
- finger
- joint angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J3/00—Manipulators of leader-follower type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
Definitions
- the present invention relates to a remote control system, a remote control device, a remote control method, and a program.
- This application claims priority based on Japanese Patent Application No. 2024-044918, filed March 21, 2024, the contents of which are incorporated herein by reference.
- the robots that perform tasks have, for example, an arm and a robotic hand. Furthermore, the robotic hand may have two or more fingers.
- a taxonomy is, for example, a classification of grasps. With conventional technology, however, grasps whose human joint angles are nearly identical can still differ in manipulation method, so the operability of a robot's multi-fingered hand can sometimes be poor.
- a remote control system is a control system in which an operator remotely controls an end effector having fingers with joints, the remote control system including: a first detection unit that detects the position of the operator's fingertip; a second detection unit that detects the joint angles of the operator's fingers; a determination unit that determines the operation content of the operator; a priority determination unit that determines whether the priority to be given when calculating the joint angles of the fingers of the end effector is the position of the operator's fingertip or the joint angle of the operator, depending on the operation content determined by the determination unit; a calculation unit that performs optimization calculation of the joint angle including the priority determined by the priority determination unit; and a control unit that outputs the joint angle calculated by the calculation unit to the end effector as a control command.
- the calculation unit may calculate a first cost function relating the fingertip positions of the end effector to the fingertip positions of each of the operator's fingers based on the priority, and the control unit may perform optimization calculations based on the calculated first cost function to calculate the joint angles of the fingers of the end effector.
- the calculation unit may calculate a second cost function relating the angle of each joint of the end effector to the angle of each joint of each of the operator's fingers based on the priority, and the control unit may perform an optimization calculation based on the calculated second cost function to calculate the joint angles of the finger portions of the end effector.
- the calculation unit may calculate the second cost function by multiplying either the joint angles of the finger portions of the end effector or the joint angles of the operator's fingers by weights related to the ease of bending of the fingers of the end effector and of the operator.
- the first detection unit may detect the distances between the operator's fingers, the priority determination unit may calculate a third cost function relating the distances between the fingers of the end effector to the distances between the operator's fingers, and the control unit may perform optimization calculations using the third cost function to calculate the joint angles of the fingers of the end effector.
- the first detection unit may detect the position of the operator's fingertip based on the distance from the center of the operator's palm to the fingertip, and the shape of the operator's hand and fingers when the fingertip position is detected may be a predetermined gripping shape.
- a remote control device is a device for remotely controlling an end effector having fingers with joints when the end effector is remotely controlled by an operator, and includes: a first detection unit that detects the position of the operator's fingertip; a second detection unit that detects the joint angles of the operator's fingers; a determination unit that determines the operation content of the operator; a priority determination unit that determines whether the priority to be given when calculating the joint angles of the fingers of the end effector is the position of the operator's fingertip or the joint angle of the operator, depending on the operation content determined by the determination unit; a calculation unit that performs optimization calculations for the joint angles including the priorities determined by the priority determination unit; and a control unit that outputs the joint angles calculated by the calculation unit to the end effector as control commands.
- a remote control method is a method for remotely operating an end effector having fingers with joints when the end effector is remotely operated by an operator, wherein a first detection unit detects the position of the operator's fingertip, a second detection unit detects the joint angles of the operator's fingers, a determination unit determines the operation content of the operator, a priority determination unit determines whether the priority to be given when calculating the joint angles of the fingers of the end effector is the position of the operator's fingertip or the joint angle of the operator based on the determined operation content, a calculation unit performs optimization calculations for the joint angles including the determined priority, and a control unit outputs the calculated joint angles to the end effector as control commands.
- a program according to one aspect of the present invention is a program that, when an operator remotely controls an end effector having fingers with joints, causes a computer of a remote control device that remotely controls the end effector to detect the position of the operator's fingertip, detect the joint angles of the operator's fingers, determine the operation content of the operator, determine whether the priority to be given when calculating the joint angles of the fingers of the end effector should be the position of the operator's fingertip or the joint angle of the operator based on the determined operation content, perform an optimization calculation of the joint angle including the determined priority, and output the calculated joint angle to the end effector as a control command.
- This aspect of the present invention can improve the operability of a robot's multi-fingered hand.
- FIG. 1 is a diagram for explaining an overview of remote control and operation of a robot.
- FIG. 2 is a diagram showing examples of grasp classification according to the GRASP taxonomy.
- FIG. 3 is a diagram showing an example of work using the fingertips and an example of work using the finger pads and the palm.
- FIG. 4 is a diagram illustrating an example of the configuration of a remote control system according to an embodiment of the present invention.
- FIG. 5 is a diagram illustrating an example of information stored in a storage unit.
- FIG. 6 is a diagram illustrating a first mapping example of fingertip positions.
- FIG. 7 is a diagram illustrating a second mapping example of fingertip positions.
- FIG. 8 is a diagram illustrating an example of mapping of joint angles.
- FIG. 9 is a flowchart of a processing procedure of a remote control device according to the embodiment.
- "based on XX" means "based on at least XX," and includes cases where something is based on other elements in addition to XX.
- "based on XX" is not limited to cases where XX is used directly, and also includes cases where something is based on the result of calculation or processing performed on XX.
- "XX" is an arbitrary element (for example, arbitrary information).
- FIG. 1 is a diagram for explaining an overview of remote control and operation of a robot.
- an operator Us wears, for example, an image display device 4 on his/her head and an operation input unit 5 (5L, 5R) on his/her hand.
- An environmental sensor 3 is installed in the robot workspace. Note that the environmental sensor 3 may be attached to the robot 2.
- the robot 2 also includes an arm 21 (21L, 21R) and an end effector 22 (22L, 22R) (robot hand). Note that the end effector 22 has two or more fingers.
- the target object obj is, for example, a plastic bottle or a glass bottle, and includes a body and a cap.
- an example of the task is to attach the cap to the body and close it, or to remove the cap from the body and open it. In such a task, the robot 2 needs to hold the body in its palm and pinch the cap with its fingers.
- FIG. 2 is a diagram showing an example of grasp classification according to the GRASP taxonomy. As shown in FIG. 2, grasps are classified into columns according to their assignment to power, intermediate, and precision grasps, and into rows according to their opposition type and virtual-finger assignment. The columns are further divided according to the position of the thumb, which is either abducted or adducted. Note that the grasp classification of the taxonomy shown in FIG. 2 is an example and is not limiting.
- FIG. 3 shows an example of an operation using the fingertips and an example of an operation using the pads of the fingers and the palm.
- the image g10 is an example of a task using fingertips, for example, picking up a screw, which is a first target object obj1, with the fingertips and moving it onto a second target object obj2, which is the target to attach the screw to.
- Such object manipulation using the fingertips is classified as Tip Pinch (No. 24 in FIG. 2) in the taxonomy.
- the image g20 is an example of a task using the finger pads and palm, in which a spherical target object obj is grasped with the finger pads and held against the palm.
- Such object manipulation using the finger pads is classified as Palmar Pinch (No. 9 in FIG. 2) in the taxonomy. In Palmar Pinch and Tip Pinch, the joint angles of the human hand are almost the same, but the object manipulation methods differ.
- for this reason, in this embodiment, the mapping method to be applied is switched depending on the work status, work content, and the like.
- FIG. 4 is a diagram showing an example of the configuration of the remote control system according to this embodiment.
- the remote control system 1 includes, for example, a robot 2, an environmental sensor 3, an image display device 4, an operation input unit 5, a remote control device 6, and a measurement sensor 7.
- the robot 2 includes, for example, an arm 21, an end effector 22, a sensor 23, a drive unit 24, and a communication unit 25.
- the image display device 4 includes, for example, a display unit 41 and a line-of-sight detection unit 42.
- the operation input unit 5 includes, for example, an operation detection unit 51.
- the remote control device 6 includes, for example, an acquisition unit 61, a measurement unit 62 (first detection unit, second detection unit), a determination unit 63, a priority determination unit 64, a calculation unit 65, a control unit 66, an image generation unit 67, an output unit 68, and a storage unit 69.
- the environmental sensor 3 is, for example, an RGB-D imaging device that acquires depth (D) information in addition to RGB (red, green, blue) images.
- the environmental sensor 3 is installed in the robot workspace.
- the environmental sensor 3 may also be installed in the remote control space.
- the environmental sensor 3 may also be a combination of a distance sensor and an RGB imaging device.
- the environmental sensor 3 is connected to the remote control device 6 via a wired or wireless network.
- the environmental sensor 3 is equipped with a communication unit (not shown) and outputs acquired data to the remote control device 6.
- the image display device 4 is, for example, an HMD (head mounted display).
- the image display device 4 and the remote control device 6 are connected via a wired or wireless network.
- the display unit 41 displays images required for remote operation in the robot workspace, which are output by the remote control device 6.
- the line-of-sight detection unit 42 detects the line of sight of the operator. Note that the image display device 4 does not necessarily have to include the line-of-sight detection unit 42.
- the image display device 4 acquires an image from the remote control device 6 and outputs the detected line-of-sight information to the remote control device 6.
- the operation input unit 5 and the remote control device 6 are connected via a wired or wireless network.
- the operation detection unit 51 is, for example, a data glove, and detects the movement and position of the operator's hand and fingers.
- the operation input unit 5 includes a communication unit (not shown) and outputs the detected data to the remote control device 6.
- the measurement sensor 7 is, for example, an RGB-D imaging device that acquires depth (D) information in addition to RGB (red, green, blue) images.
- the measurement sensor 7 may also be a combination of a distance sensor and an RGB imaging device.
- the measurement sensor 7 is installed, for example, in the remote control space.
- the measurement sensor 7 measures data about the operator's hand (finger length, spacing between fingertips, etc.).
- the measurement sensor 7 is connected to the remote control device 6 via a wired or wireless network.
- the measurement sensor 7 is equipped with a communication unit (not shown) and outputs the measurement data to the remote control device 6.
- the robot 2 includes at least an arm 21 and an end effector 22.
- the robot 2 may be a dual-arm robot or a bipedal walking robot, and may include a head, a body, etc.
- the robot 2 is connected to a remote control device 6 via a wired or wireless network.
- One end of the arm 21 is connected to, for example, the shoulder via a joint, and the other end is connected to the end effector 22 via a joint.
- the end effector 22 is a multi-fingered hand equipped with two or more fingers 221 (221-1, ..., 221-n (n is an integer greater than or equal to two)).
- an example with five fingers 221 is described, but the number of fingers 221 is not limited to this and may be two or more.
- Each finger 221 has at least two joints.
- Sensors 23 are attached to each joint of the arm 21 and the end effector 22. Examples of sensors 23 include encoders, acceleration sensors, force sensors, and six-axis sensors.
- the drive unit 24 drives the arm 21 and end effector 22 in response to control commands received from the remote control device 6.
- Actuators are attached to each joint of the arm 21 and end effector 22.
- the communication unit 25 receives control commands from the remote control device 6 and outputs the detection values of the sensor 23 to the remote control device 6.
- the remote control device 6 operates the robot 2 using information acquired from the environmental sensor 3, the operation input unit 5, and the measurement sensor 7.
- the acquisition unit 61 acquires measurement data from the measurement sensor 7.
- the acquisition unit 61 acquires data detected by the sensor 23 from the robot 2.
- the acquisition unit 61 acquires data detected from the environmental sensor 3.
- the acquisition unit 61 acquires information related to detected operations from the operation input unit 5.
- the measurement unit 62 acquires measurement data before remote operation begins, and uses the acquired measurement data to measure the length of the operator's fingers, the distance between the fingertips when the hand is open, and the like. Note that these measurements may be made in advance and stored in the storage unit 69. When measuring, the operator, for example, extends their fingers and turns the palm toward the measurement sensor 7. The finger lengths and other quantities to be measured are described later.
- the measurement unit 62 measures the distance between the operator's fingers using the measurement values of the measurement sensor 7, and measures the fingertip position of each finger.
- the determination unit 63 determines the type of work the operator is about to perform. Examples of such work include delicate tasks using the fingertips, or tasks using the pads of the fingers and palm.
- the determination unit 63 estimates the operation target and the operation intention based on the information acquired from the environmental sensor 3 and the operation instructions acquired from the operation input unit 5, using a method such as that described in JP 2022-157101 A.
- the priority determination unit 64 determines whether the priority to be given when calculating the joint angles of the fingers of the end effector 22 is the operator's fingertip position or the operator's joint angle, depending on the determination result of the determination unit 63.
- for example, in the case of a task using the fingertips, the calculation unit 65 calculates weights (e.g., cost functions) related to the fingertip positions from the operator's fingertip positions. In the case of a task using the finger pads and palm, the calculation unit 65 calculates weights (e.g., cost functions) related to the joint angles from the joint angles of the operator's fingers. Note that the weights may be calculated by the priority determination unit 64 according to the priorities. The calculation unit 65 performs optimization calculations to calculate the joint angles for the robot 2.
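As a rough illustration of how the determined priority could translate into weights for the cost functions, a minimal Python sketch follows; the task labels, function name, and numeric values are assumptions of this sketch, and α, β, and γ denote the weights of the first to third cost functions introduced with equation (1) below.

```python
def select_weights(task_content: str) -> tuple[float, float, float]:
    """Return illustrative weights (alpha, beta, gamma) for the first to
    third cost functions according to the judged task content.

    The numeric values are assumptions for illustration only: fingertip
    tasks emphasize the fingertip-position (first) and inter-finger
    distance (third) terms, while pad-and-palm tasks emphasize the
    joint-angle (second) term.
    """
    if task_content == "work using fingertips":
        return 1.0, 0.1, 1.0
    if task_content == "work using finger pads and palms":
        return 0.1, 1.0, 0.1
    raise ValueError(f"unknown task content: {task_content}")
```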
- the control unit 66 generates control commands using the joint angles of the robot 2 calculated by the calculation unit 65, and outputs the generated control commands to the robot 2 via the output unit 68.
- the control unit 66 may also perform control to support the operator's operation based on the estimated operation intention, using, for example, the method described in Japanese Patent Application No. 2023-045616.
- the image generation unit 67 uses, for example, data acquired from the environmental sensor 3 to generate an image required for remote control to be displayed on the image display device 4.
- the image required for remote control is, for example, an image including a finger and a target object.
- the image generation unit 67 outputs the generated image data to the image display device 4 via the output unit 68.
- the output unit 68 outputs the image data generated by the image generation unit 67 to the image display device 4.
- the output unit 68 outputs the control commands generated by the control unit 66 to the robot 2.
- the storage unit 69 stores programs, thresholds, formulas, etc. used by each part of the remote control device 6.
- the storage unit 69 stores mapping data in association with task-specific information, as shown in FIG. 5.
- FIG. 5 is a diagram showing an example of information stored in the storage unit 69.
- each device may be equipped with a power source.
- FIG. 6 is a diagram showing a first mapping example of fingertip positions.
- the fingertip positions are represented by the distance from the origin O of the palm.
- the origin O of the palm position may be arbitrarily set by the measurement unit 62, or may be set by the measurement unit 62 by determining the center of the palm.
- the shape of the operator's hand during measurement as shown in FIG. 6 may be determined in advance and displayed, for example, on the image display device 4.
- the hand shape during measurement is, for example, the shape of the hand when grasping an object, with the thumb and index finger facing each other.
- the measurement unit 62 measures d_hOI, which relates to the lengths of the operator's fingers, and d_rOI, which relates to the lengths of the fingers of the robot 2, using the measurement values of the measurement sensor 7.
- the measurement unit 62 then calculates the first cost function Cost_do using the measured d_rOI and d_hOI.
- the measurement unit 62 may obtain the first cost function by calculating, for each finger, the difference between the distance from the palm origin of the robot 2 to the fingertip and the distance from the palm origin of the operator to the fingertip, and then summing the differences over the fingers.
- alternatively, the measurement unit 62 may obtain the first cost function by calculating, for each finger, the ratio between the distance from the palm origin of the robot 2 to the fingertip and the distance from the palm origin of the operator to the fingertip, and then averaging the ratios over the fingers.
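As a non-limiting illustration, a minimal Python sketch of the two variants of the first cost function described above is given below; the function names and the per-finger array layout are assumptions of this sketch, not notation defined in this disclosure.

```python
import numpy as np

def cost_do_diff(d_r: np.ndarray, d_h: np.ndarray) -> float:
    """Difference-sum variant of the first cost function Cost_do.

    d_r: distance from the palm origin O of the robot 2 to each fingertip.
    d_h: distance from the operator's palm origin O to each fingertip.
    Both arrays hold one value per finger, in the same finger order.
    """
    return float(np.sum(np.abs(d_r - d_h)))

def cost_do_ratio(d_r: np.ndarray, d_h: np.ndarray) -> float:
    """Ratio-average variant of the first cost function Cost_do."""
    return float(np.mean(d_r / d_h))
```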
- FIG. 7 is a diagram showing a second example of mapping fingertip positions.
- the example in FIG. 7 is an example of measuring the distance between each pair of fingertips.
- the distances between each finger are shown as examples of the distances from the tip of the thumb to the tip of the index finger, the tip of the thumb to the tip of the middle finger, the tip of the thumb to the tip of the ring finger, the tip of the thumb to the tip of the little finger, the tip of the middle finger to the tip of the index finger, the tip of the ring finger to the tip of the middle finger, and the tip of the little finger to the tip of the ring finger.
- the fingers used as the reference for measuring the distance may be different.
- "from the tip of the middle finger to the tip of the index finger" may also be "from the tip of the index finger to the tip of the middle finger," as long as it is the distance between the fingertips.
- the measurement unit 62 measures d_hTI, which is the distance between the operator's fingertips, and d_rTI, which is the corresponding distance between the fingertips of the robot 2, using the measurement values of the measurement sensor 7. Then, the measurement unit 62 calculates the third cost function Cost_df using the measured inter-finger distances d_rTI and d_hTI. Note that the measurement unit 62 may obtain the third cost function by calculating the difference between each inter-finger distance of the robot 2 and the corresponding inter-finger distance of the operator, and summing up the differences. Alternatively, the measurement unit 62 may obtain the third cost function by calculating the ratio between each inter-finger distance of the robot 2 and the corresponding inter-finger distance of the operator, and averaging the ratios.
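Analogously, a minimal sketch of the third cost function over the seven fingertip distances listed above might look as follows; the finger indexing and the pair list are assumptions of this sketch.

```python
import numpy as np

# Assumed finger order: 0 = thumb, 1 = index, 2 = middle, 3 = ring, 4 = little.
# The seven fingertip pairs follow the list of distances given above.
FINGERTIP_PAIRS = [(0, 1), (0, 2), (0, 3), (0, 4), (2, 1), (3, 2), (4, 3)]

def cost_df_diff(p_r: np.ndarray, p_h: np.ndarray) -> float:
    """Difference-sum variant of the third cost function Cost_df.

    p_r, p_h: (5, 3) arrays of fingertip positions of the robot 2 and the
    operator, respectively, one 3D point per finger.
    """
    d_r = np.array([np.linalg.norm(p_r[i] - p_r[j]) for i, j in FINGERTIP_PAIRS])
    d_h = np.array([np.linalg.norm(p_h[i] - p_h[j]) for i, j in FINGERTIP_PAIRS])
    return float(np.sum(np.abs(d_r - d_h)))
```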
- FIG. 8 is a diagram showing an example of mapping of joint angles.
- the example in FIG. 8 shows the shape of the hand and fingers when pinching an object with the thumb and index finger. Note that the shape of the operator's hand during measurement as shown in FIG. 8 may be determined in advance and displayed on the image display device 4, for example.
- the measurement unit 62 calculates the angle of each joint of each finger using the results of measurement by the measurement sensor 7. For example, joint angle θ_hI0 is the angle of the third joint of the operator's index finger, joint angle θ_hI1 is the angle of the second joint of the operator's index finger, and joint angle θ_hI2 is the angle of the first joint of the operator's index finger.
- likewise, joint angle θ_rI0 is the angle of the third joint of the robot's index finger, joint angle θ_rI1 is the angle of the second joint of the robot's index finger, and joint angle θ_rI2 is the angle of the first joint of the robot's index finger.
- the measurement unit 62 measures each joint angle of the operator and each joint angle of the robot 2 using the measurement values of the measurement sensor 7. Then, the measurement unit 62 calculates the second cost function Cost_fs using the measured joint angles of the operator and the robot for each finger. The measurement unit 62 may calculate the second cost function by calculating the difference between each joint angle of each finger of the robot 2 and the corresponding joint angle of the operator, and then summing the differences over the joints of each finger. Alternatively, the measurement unit 62 may calculate the second cost function by calculating the ratio between each joint angle of each finger of the robot 2 and the corresponding joint angle of the operator, and then averaging the ratios. The measurement unit 62 may also calculate the second cost function by multiplying the joint angles of the robot 2 or the joint angles of the operator by a weight that expresses the difference in ease of bending between the fingers of the robot 2 and those of the operator.
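A minimal sketch of the second cost function, including the optional ease-of-bending weights, is given below; the flattened joint ordering and the choice to place the weight on the robot-side angles (one of the two options above) are assumptions of this sketch.

```python
import numpy as np

def cost_fs_diff(theta_r: np.ndarray, theta_h: np.ndarray,
                 bend_w: np.ndarray = None) -> float:
    """Difference-sum variant of the second cost function Cost_fs.

    theta_r, theta_h: flattened arrays of joint angles, one entry per joint
    per finger in matching order (e.g. theta_rI0, theta_rI1, theta_rI2, ...).
    bend_w: optional per-joint weights expressing the difference in ease of
    bending between the fingers of the robot 2 and those of the operator.
    """
    if bend_w is None:
        bend_w = np.ones_like(theta_r)
    return float(np.sum(np.abs(bend_w * theta_r - theta_h)))
```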
- FIG. 9 is a flowchart of the processing procedure of the remote control device according to this embodiment.
- Step S1 The measurement unit 62 measures the distance between the operator's fingers using the measurement values of the measurement sensor 7, and measures the fingertip position of each finger.
- the measurement unit 62 measures each joint angle of each finger using the measurement values of the measurement sensor 7. Note that these processes may be performed in advance and stored in the storage unit 69. Furthermore, the measurement unit 62 may measure the fingertip positions and joint angles of the robot 2 when measuring the operator's fingertip positions and joint angles, or may measure them in advance and store them in the storage unit 69.
- Step S2 The determination unit 63 estimates and acquires the work content using the data acquired from the environmental sensor 3.
- the operator may select or input the work content by operating the operation input unit 5 or the like.
- the acquisition unit 61 may acquire the selected or input work content.
- the selection may be made by displaying work content candidates on the display unit 41 of the image display device 4 and selecting by line of sight.
- the operator may input or select the work content by operating a keyboard (not shown) or the like connected to the remote control device 6.
- Step S3 The determination unit 63 determines whether the work content is, for example, "work using fingertips" or "work using finger pads and palms." If the work content is "work using fingertips," the determination unit 63 proceeds to step S4; if it is "work using finger pads and palms," it proceeds to step S5.
- Step S4 The priority determination unit 64 determines the operator's fingertip position as the priority when calculating the joint angles of the fingers of the end effector, based on the determination result of the determination unit 63.
- the calculation unit 65 also calculates a weight related to the fingertip positions (e.g., the first cost function) from the operator's fingertip positions.
- in addition, the calculation unit 65 calculates a weight related to the distances between the operator's fingers (e.g., the third cost function). After processing, the calculation unit 65 proceeds to step S6.
- Step S5 The priority determination unit 64 determines the operator's joint angles as the priority when calculating the joint angles of the fingers of the end effector, based on the determination result of the determination unit 63.
- the calculation unit 65 calculates a weight related to the joint angles (e.g., the second cost function) from the joint angles of the operator's fingers. After processing, the calculation unit 65 proceeds to step S6.
- Step S6 The calculation unit 65 uses the weights calculated in step S4 or step S5 to calculate the joint angles of the robot 2, for example by performing optimization calculations.
- Step S7 The control unit 66 generates control commands for the robot 2 based on the calculated joint angles.
- Step S8 The control unit 66 outputs the generated control command to the robot 2 via the output unit 68 to control the operation of the robot 2.
- the joint angles of the robot 2 are determined by calculating, for example, the weight of the evaluation function that prioritizes the position of the operator's fingertips, or the weight of the evaluation function that prioritizes the joint angles of the operator's fingers, depending on the task content.
- the weights are, for example, the weights α, β, and γ applied to the first to third cost functions, respectively, as in the following equation (1):

  Cost = α · Cost_do + β · Cost_fs + γ · Cost_df … (1)

- the calculation unit 65 changes which cost function dominates by changing these weights α, β, and γ depending on the task content.
- the calculation unit 65 then performs an optimization calculation to minimize the sum cost of the weighted cost functions of equation (1), and determines the joint angles of each finger of the robot 2. For example, in the case of pinching tasks using the fingertips, the weight α or γ, or the weights α and γ, are changed. Furthermore, for example, in tasks using the palm, the weight β is changed.
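Putting the pieces together, a minimal sketch of an optimization over equation (1) is given below, reusing the cost_do_diff, cost_fs_diff, and cost_df_diff sketches above; the forward-kinematics callable, the palm-origin coordinate frame, and the choice of a derivative-free solver are assumptions of this sketch, not details given in this disclosure.

```python
import numpy as np
from scipy.optimize import minimize

def solve_joint_angles(theta_init, alpha, beta, gamma,
                       fk_fingertips, d_h, theta_h, p_h):
    """Minimize the weighted sum cost of equation (1) over the joint angles
    of the end effector 22 and return the optimized joint angles.

    fk_fingertips: hypothetical forward kinematics mapping joint angles to a
    (5, 3) array of fingertip positions expressed in a palm-origin frame.
    d_h, theta_h, p_h: the operator's palm-origin distances, joint angles,
    and fingertip positions, respectively.
    """
    def total_cost(theta):
        p_r = fk_fingertips(theta)            # robot fingertip positions
        d_r = np.linalg.norm(p_r, axis=1)     # distances from palm origin O
        return (alpha * cost_do_diff(d_r, d_h)
                + beta * cost_fs_diff(theta, theta_h)
                + gamma * cost_df_diff(p_r, p_h))

    # The cost uses absolute values, so a derivative-free local solver is a
    # safe illustrative choice here.
    result = minimize(total_cost, theta_init, method="Powell")
    return result.x  # joint angles output to the robot 2 as a control command
```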
- steps S2 to S8 may be performed, for example, at predetermined time intervals, and the weighting may be changed when the task content changes.
- Step S1 may also be performed at predetermined time intervals.
- in the above example, the weight is switched depending on whether the task is "work using fingertips" or "work using finger pads and palms," but the switching is not limited to this.
- the task is not limited to "work using fingertips" and "work using finger pads and palms," and may be other task content as shown in FIG. 2; the number of tasks to be switched between is not limited to two and may be three or more.
- the weights of the fingertip positions and joint angles are changed when calculating the joint angles of the robot 2 depending on the operation content.
- the robot 2 described above has one arm, but it may also be a dual-arm robot with two arms.
- the arm and end effector to be used may be selected depending on the task and controlled using the above-mentioned method, or each arm may be controlled using the above-mentioned method so that the two end effectors work in coordination.
- a program for implementing all or part of the functions of the remote control device 6 of the present invention may be recorded on a computer-readable recording medium, and the program may be loaded into a computer system and executed to perform all or part of the processing performed by the remote control device 6.
- the term "computer system" includes an OS and hardware such as peripheral devices. It also includes a WWW system equipped with a website provision environment (or display environment).
- the term "computer-readable recording medium" refers to portable media such as floppy disks, magneto-optical disks, ROMs, and CD-ROMs, as well as storage devices such as hard disks built into computer systems.
- computer-readable recording medium also includes devices that retain a program for a certain period of time, such as volatile memory (RAM) within a computer system that acts as a server or client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- some or all of these components may be realized by hardware (including circuitry) using LSIs (Large Scale Integration) such as ASICs (Application Specific Integrated Circuits), FPGAs (Field-Programmable Gate Arrays), GPUs (Graphics Processing Units), and SOCs (System On Chips), or may be realized by a combination of software and hardware.
- the above program may be transmitted from a computer system that stores the program in a storage device or the like to another computer system via a transmission medium, or by transmission waves in the transmission medium.
- the "transmission medium” that transmits the program refers to a medium that has the function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
- the above program may also be one that realizes some of the above-mentioned functions. Furthermore, it may be a so-called differential file (differential program) that can realize the above-mentioned functions in combination with a program already recorded in the computer system.
Abstract
A remote control system remotely controls an end effector having a finger portion with joints, and comprises: a first detection unit that detects the position of an operator's fingertip; a second detection unit that detects a joint angle of a finger of the operator; a determination unit that determines the content of the operator's operation; a priority determination unit that decides whether priority should be given to the position of the operator's fingertip or to the joint angle of the operator's finger when calculating a joint angle of the finger portion of the end effector, according to the operation content determined by the determination unit; a calculation unit that performs a joint angle optimization calculation including the priority decided by the priority determination unit; and a control unit that outputs the joint angle calculated by the calculation unit to the end effector as a control command.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024-044918 | 2024-03-21 | ||
| JP2024044918A JP2025144969A (ja) | 2024-03-21 | 2024-03-21 | 遠隔制御システム、遠隔制御装置、遠隔制御方法、およびプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025197687A1 (fr) | 2025-09-25 |
Family
ID=97139154
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/009065 Pending WO2025197687A1 (fr) | 2024-03-21 | 2025-03-11 | Système de commande à distance, dispositif de commande à distance, procédé de commande à distance et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2025144969A (fr) |
| WO (1) | WO2025197687A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011110620A (ja) * | 2009-11-24 | 2011-06-09 | Toyota Industries Corp | ロボットの動作を制御する方法およびロボットシステム |
| JP2023107569A (ja) * | 2022-01-24 | 2023-08-03 | 本田技研工業株式会社 | 遠隔操作補助装置、遠隔操作補助方法、およびプログラム |
| JP2023131033A (ja) * | 2022-03-08 | 2023-09-21 | 本田技研工業株式会社 | 遠隔制御システム |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2025144969A (ja) | 2025-10-03 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25773508; Country of ref document: EP; Kind code of ref document: A1 |