WO2024105777A1 - Control device and computer - Google Patents
- Publication number
- WO2024105777A1 (PCT application PCT/JP2022/042393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- effector
- constraint
- constraints
- user
- screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
Definitions
- This disclosure relates to a control device and a computer.
- in industrial robots, the user generally teaches the robot to make it perform the desired operation.
- the robot's control device generates a path based on the instruction and moves the robot.
- a mode in which the robot moves without user operation based on preset operation commands is called automatic operation mode or AUTO mode.
- the user performs an operation called jog operation using a portable operation panel.
- the user must take care to ensure that the robot moves safely while closely observing the robot, effector, workpiece, and other objects.
- the user must also take care of the posture of the effector so that it can perform its function.
- in industrial robot operation, cycle time is generally a priority.
- constraints may be set on the robot's movements, such as only allowing rotations around an axis perpendicular to the ground.
- a function is generally known in which an operable area or an inaccessible area is set in advance so that the robot does not interfere with the surrounding environment, and the robot is operated only within the area where no interference occurs.
- a function is also known in which a detailed interference calculation is performed using a 3D model of the robot and the surrounding environment. See, for example, Patent Document 1.
- a technique is also known for generating a path so that a protrusion of an effector does not point toward a person or the like. For example, see Patent Document 2.
- As one example, if there is an appropriate range for the posture, position, etc. of the effector for the effector to perform its function safely, it is desirable for the robot's path to satisfy this range. As one example, if path generation and jog operation are not performed reflecting the properties of the effector, undesirable situations such as dropping the target such as a workpiece may occur. As one example, effectors attached to the tip of the robot are diverse, such as hands and suction cups for handling objects, welding torches, and inspection scanners, and it is desirable for the robot to operate in accordance with the effector. As one example, settings that fix the posture of the effector to a specific state narrow the options for path generation and jog operation, are inefficient, and may worsen the cycle time. There is therefore a demand for technology that allows settings according to the type of effector, the function required of the effector, the type of target, the type of work, and the functions required for the work.
- the control device of the first aspect of the present disclosure includes a processor and a memory unit that stores effector constraints, which are constraints on changes in at least one of the position and orientation of a robot's effector as viewed from a predetermined reference coordinate, and the processor causes the robot to perform an action constrained by the effector constraints that are set based on input from a user or an external device.
- the control device of the second aspect of the present disclosure includes a processor, a memory unit, and a display device that displays a setting screen for effector constraints, which are constraints on changes in at least one of the position and orientation of a robot's effector as viewed from a predetermined reference coordinate, and the setting screen is for setting the effector constraints based at least on user input.
- the computer of the third aspect of the present disclosure includes a processor, a storage unit, and a display device that displays a setting screen for effector constraints, which are constraints on changes in at least one of the position and orientation of a robot's effector as viewed from a predetermined reference coordinate, and the setting screen is for setting the effector constraints based at least on user input, and the processor performs a simulation to cause the robot model to perform an operation based on the effector constraints and determines whether the operation satisfies a standard.
- FIG. 1 is a schematic diagram of a robot system including a robot according to an embodiment.
- FIG. 2 is a block diagram showing the configuration of a control device for a robot according to the present embodiment.
- A schematic diagram of various effectors attached to the robot of the present embodiment.
- FIGS. 5A to 5C are schematic diagrams illustrating the operation of an effector attached to the robot of the present embodiment.
- FIG. 5 is a diagram showing an example of effector constraints set in the control device of the present embodiment.
- A block diagram showing an example of functions of the control device according to the present embodiment.
- The remaining figures are examples of screens displayed by the control device of the present embodiment.
- the control device 1 is provided for controlling an arm 10A of a robot 10 (FIG. 1).
- the robot 10 is not limited to a specific type, but the robot 10 of this embodiment is a multi-joint robot having six axes.
- the robot 10 may be a multi-joint robot having five or fewer axes or seven or more axes, a horizontal multi-joint robot, a multi-link robot, or the like.
- the robot 10 or its arm 10A may be supported by a traveling device such as a linear guide, an AGV (Automatic Guided Vehicle), a vehicle, a walking robot, or the like.
- the robot 10 may be a collaborative robot that can avoid contact or proximity with surrounding people, objects, etc. by using well-known sensors such as visual sensors and force sensors.
- the arm 10A has a number of movable parts 12 connected to each other by joints, and a number of servo motors 11 that drive each of the movable parts 12 (Figs. 1 and 2).
- Each servo motor 11 has an operating position detection device, such as an encoder 11A, for detecting its operating position.
- the control device 1 receives the detection value of the encoder 11A.
- an effector 30 such as a hand or tool is attached to the tip of an arm 10A, and the arm 10A is part of a robot system that performs work on an object 2, which is a work target on a transport device, for example.
- the operations are well-known operations such as removing object 2, processing object 2, and attaching parts to object 2.
- Processing object 2 is well-known processing such as machining, painting, and cleaning.
- the transport device can be a conveyor, an AGV (Automatic Guided Vehicle), or anything that can move object 2 such as a car under manufacture.
- in the case of a car under manufacture, the chassis, tires, motor, etc. function as the transport device, and the object 2, which is the body on the chassis or the like, is transported.
- Object 2 can be various objects such as industrial products, goods including food, parts of goods, parts of structures, animals, parts of animals, parts of people, etc.
- the effector 30 may be a dedicated hand, suction cup, etc. for handling items.
- the effector 30 may also be equipped with a wide variety of devices, such as tools for assembly processes, guns for spot welding, torches for arc welding, scanners for inspection systems, etc. In this way, the effector 30 is not limited to a specific effector.
- when the effector 30 has a moving part, such as a finger of a hand, the effector 30 is equipped with a servo motor 31 that drives the moving part (FIG. 2).
- the servo motor 31 has an operating position detection device for detecting its operating position, and an example of the operating position detection device is an encoder. The detection value of the operating position detection device is transmitted to the control device 1.
- Various types of servo motors such as rotary motors and linear motors can be used as each of the servo motors 11 and 31.
- the effector 30 is usually attached to the tip of the arm 10A, but may also be attached to the longitudinal middle part or base end of the arm 10A.
- a hand that grasps the target 2 as the effector 30 or a hand that attracts the target 2 using a suction cup, magnet, electromagnet, etc. is often used.
- the effector 30 may be a container or a flat tray on which the target 2 is placed.
- the effector 30 may also be a box or a basket in which the target 2 is placed.
- the effector 30 may have limited appropriate postures for functioning as an effector.
- the effector 30, which is a hand using a suction cup, a magnet, or an electromagnet, may not be able to hold the target 2 reliably if it cannot attract the target 2 from a predetermined direction such as above.
- when placing the target 2 on the effector 30, which is a tray, the user must take care to prevent the target 2 from falling off.
- the control device 1 has a processor 21 having one or more processor elements such as a CPU, a microcomputer, an image processor, etc., and a display device 22.
- the control device 1 also has a storage unit 23 having a non-volatile storage, ROM, RAM, etc.
- the control device 1 also has a servo controller 24 corresponding to each of the servo motors 11 of the robot 10, and a servo controller 25 corresponding to the servo motor 31 of the effector 30.
- the control device 1 also has an input unit 26 connected to the control device 1 by wire or wirelessly.
- the input unit 26 is an input device such as a portable operation panel that can be carried by the user.
- the input unit 26 may also be a tablet computer. In the case of a portable operation panel, a tablet computer, etc., input is performed using a touch screen function.
- the portable operation panel or tablet computer may also have a display device 22.
- the memory unit 23 stores a system program 23A, which performs the basic functions of the control device 1.
- the memory unit 23 also stores one or more operation programs 23B.
- the operation program 23B includes multiple commands, information, etc. for operating the robot.
- the operation program 23B includes at least information on the coordinates and posture of multiple teaching points, commands related to movements between teaching points, etc.
- the storage unit 23 also stores a control program 23C, a path generation program 23D, etc.
- the control program 23C is a known feedback program, feedforward program, etc.
- the control device 1 generates a path based on the operation program 23B using the path generation program 23D, and generates control commands using the control program 23C to move along the path, thereby controlling the arm 10A.
- for the position and posture of the arm 10A of the robot 10, it is common to specify, as the teaching points, etc., coordinates as viewed from a robot reference coordinate system 101 (FIG. 1) that serves as a reference that does not move in space.
- the position and posture of a coordinate system set on a flange surface (mechanical interface) at the tip of the arm 10A are generally specified as the teaching points, etc.
- alternatively, the position and posture of an effector coordinate system 102 (FIG. 1) are specified as the teaching points, etc.
- the coordinate system set at the tip of the arm 10A, and likewise the coordinate system set on the flange surface, can also be treated as the effector coordinate system 102.
- a reference coordinate system 101 and an effector coordinate system 102 that does not move relative to the effector 30 are set.
- the effector coordinate system 102 may also be called by other names such as a tool coordinate system.
- the control device 1 recognizes the position and orientation of the effector coordinate system 102 in the reference coordinate system 101 by well-known calibration or the like.
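As a hedged illustration of what such a calibration establishes, the pose of the effector coordinate system 102 in the reference coordinate system 101 can be obtained by composing homogeneous transforms; the numeric values below are invented for the example and are not from this disclosure.

```python
import numpy as np

def transform_z(angle_deg, translation):
    """4x4 homogeneous transform: rotation about Z followed by a translation."""
    t = np.radians(angle_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(t), -np.sin(t), 0.0],
                 [np.sin(t),  np.cos(t), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = translation
    return T

# Assumed example values: base -> flange from forward kinematics, and
# flange -> effector from calibration (a 100 mm tool offset along Z).
T_base_flange = transform_z(90.0, [400.0, 0.0, 300.0])
T_flange_effector = transform_z(0.0, [0.0, 0.0, 100.0])

# Effector coordinate system 102 expressed in reference coordinate system 101.
T_base_effector = T_base_flange @ T_flange_effector
print(np.round(T_base_effector[:3, 3], 3))  # effector origin in frame 101
```

With these assumed values, the effector origin lands at (400, 0, 400) in the reference frame, since the 100 mm tool offset along Z is unaffected by the rotation about Z.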
- the user can set effector constraints that constrain the relative change of the effector coordinate system 102 with respect to the reference coordinate system 101 .
- An example of setting effector constraints is shown in Fig. 5.
- a first example of effector constraints is a constraint on the position coordinates (X, Y, Z) of the effector coordinate system 102.
- a place where "0" is input as both the upper and lower limits means that no change is allowed.
- the fact that no effector constraint is set may be expressed by "-" or the like.
- the constraint on the relative change of the effector coordinate system 102 in the first example may be set based on the position and orientation of the reference coordinate system 101, the effector coordinate system 102, or another coordinate system.
- the reference coordinate system 101, the effector coordinate system 102, or another coordinate system is a predetermined coordinate system, which may be simply referred to as a coordinate system in the following description.
- the constraint on the orientation of the effector coordinate system 102 in the second example may also be set based on the position and orientation of the coordinate system.
- the constraint on the position and orientation of the effector coordinate system 102 may be set based on the position and orientation of the effector coordinate system 102 before the arm 10A starts a certain operation.
- a third example of an effector constraint is a constraint on the velocity of the effector coordinate system 102.
- the velocity is, for example, the velocity in the direction of travel of the effector coordinate system 102 in the coordinate system, or the velocities in each of the X, Y, and Z directions.
- a fourth example of an effector constraint is a constraint on the angular velocity of the effector coordinate system 102.
- the angular velocity is the angular velocity around an axis of the effector coordinate system 102 in the coordinate system, or the angular velocity around the X, Y, and Z axes.
- a fifth example of an effector constraint is a constraint on the acceleration of the effector coordinate system 102.
- the acceleration is, for example, the acceleration in the direction of travel of the effector coordinate system 102 in the coordinate system, or the acceleration in each of the X, Y, and Z directions.
- a sixth example of an effector constraint is a constraint on the angular acceleration of the effector coordinate system 102.
- the angular acceleration is the angular acceleration around an axis of the effector coordinate system 102 in the coordinate system, or the angular acceleration around the X, Y, and Z axes.
- the third to sixth examples of effector constraints are also constraints on the change in at least one of the position and orientation of the effector 30.
- the effector constraint may be a combination of any two or more of the first to sixth examples. Also, a value or formula equivalent to the amount obtained by time-differentiating the position and/or orientation three or more times may be used. Also, the effector constraint may be a constraint on the change in the position and/or orientation of the effector coordinate system 102 relative to a specified reference coordinate. Note that the change in the position and/or orientation of the effector coordinate system 102 relative to the specified reference coordinate is the change in the position and/or orientation of the effector relative to the specified reference coordinate. Also, the constraints on angular velocity, accelerations, etc. in the third to sixth examples are constraints on the change in the position and/or orientation of the effector as viewed from the specified reference coordinate.
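The velocity-type constraints of the third to sixth examples can be checked numerically. As one hedged sketch (the function name, sampling scheme, and units are assumptions), a controller might finite-difference sampled effector positions and compare the resulting speeds against a bound:

```python
import numpy as np

def within_speed_limit(positions, dt, v_max):
    """positions: (N, 3) sampled effector positions at interval dt (s, assumed);
    returns True if the speed over every interval is at most v_max."""
    p = np.asarray(positions, dtype=float)
    speeds = np.linalg.norm(np.diff(p, axis=0), axis=1) / dt  # speed per step
    return bool(np.all(speeds <= v_max))

# 10 mm steps every 0.1 s -> 100 mm/s along the path.
path = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [20.0, 0.0, 0.0]]
print(within_speed_limit(path, dt=0.1, v_max=150.0))  # True
print(within_speed_limit(path, dt=0.1, v_max=50.0))   # False
```

Angular velocity and the acceleration-type constraints follow the same pattern, applied to sampled orientations and to second differences of the samples.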
- in the operation program 23B, coordinate and posture information, the command, and effector constraints are set for each teaching point.
- effector constraints are not set for teaching point 1 (position and posture [1]) and teaching point 2 (position and posture [2]).
- effector constraints 1 and 2 described below are set for teaching point 3 (position and posture [3]) and teaching point 4 (position and posture [4]), respectively.
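The association of constraints with teaching points, as in FIG. 5, can be pictured as a simple table. The dictionary layout below is an assumption for illustration, not the actual format of the operation program 23B; it matches the example where teaching points 1 and 2 carry no constraint and points 3 and 4 carry effector constraints 1 and 2.

```python
# Hypothetical layout: each teaching point carries pose information plus an
# optional effector constraint.
teaching_points = [
    {"pose": "position and posture [1]", "constraint": None},
    {"pose": "position and posture [2]", "constraint": None},
    {"pose": "position and posture [3]", "constraint": "effector constraint 1"},
    {"pose": "position and posture [4]", "constraint": "effector constraint 2"},
]

def active_constraint(index):
    """Constraint in force at a teaching point (0-based index), or None."""
    return teaching_points[index]["constraint"]

print(active_constraint(0))  # None
print(active_constraint(2))  # effector constraint 1
```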
- the screen 200 of FIG. 6 is a screen that accepts an operation for displaying a screen related to setting effector constraints. The operation is a tap on a predetermined position of the screen 200 or on a predetermined button. The button may be provided on the input unit 26.
- when the operation is performed, an effector constraint setting screen 210 shown in FIG. 6 appears.
- An effector constraint or effector constraint set, described below, can be selected on setting screen 210.
- an effector constraint or effector constraint set is set at any teaching point, as shown in FIG. 7.
- the user can set, as effector constraints, a coordinate system and constraints on positional and pose changes of the effector coordinate system 102 relative to the reference coordinates.
- an input unit 26 that allows the user to edit such settings is provided on a portable operation panel also known as a teach pendant.
- Settings such as effector constraints are stored in the memory unit 23, or a specified memory unit such as a memory unit of a separate control device or a memory unit on the cloud.
- when effector constraints are stored in a memory unit of a separate control device or in a memory unit on the cloud, those memory units function as the memory unit of the control device 1.
- a screen related to the setting is displayed on the display device 22 of the input unit 26.
- the processor 21 of the control device 1 causes the display device 22 to display a screen 300 shown in Fig. 8.
- the screen 300 is a screen for the user to select a transition to a setting screen for the effector constraints.
- An operation unit 500 for performing the selection and the like is displayed on the display device 22.
- Directional keys, a decision key, a back key for returning to the screen before the transition or to the screen of a higher layer, and the like are displayed on the operation unit 500, and the user performs input by operating these keys.
- buttons corresponding to the functions may be provided on the input unit 26.
- the processor 21 causes the display device 22 to display a screen 301 of Fig. 9.
- the screen 301 is a screen for the user to select a transition to a reference coordinate system setting screen.
- the processor 21 causes the display device 22 to display a screen 302 of Fig. 9.
- the screen 302 is a screen for the user to select the setting of an arbitrary reference coordinate system from among a plurality of reference coordinate systems.
- when the user selects, for example, reference coordinate system 1 from among the multiple reference coordinate systems on screen 302, the processor 21 causes the display device 22 to display screen 303 of FIG. 9.
- Screen 303 is a screen for setting the reference coordinate system 1 selected by the user. As shown on screen 303, the user can set the position and orientation of reference coordinate system 1.
- the processor 21 causes the display device 22 to display a screen 303 of Fig. 10.
- the user can set the selected reference coordinate system 2.
- the coordinate systems set by the reference coordinate systems 1, 2, etc. can be used as the reference coordinate system 101.
- the user can set a plurality of reference coordinate systems using the screens 302 and 303. This configuration is useful for improving the degree of freedom in setting effector constraints, which will be described later.
- the processor 21 causes the display device 22 to display screen 304 of FIG. 11.
- Screen 304 is a screen for the user to select the setting of one of a number of effector coordinates.
- when the user selects, for example, effector coordinate 1 from among the multiple effector coordinates on screen 304, the processor 21 causes the display device 22 to display screen 305 of FIG. 11.
- Screen 305 is a screen for setting the effector coordinate 1 selected by the user. As shown on screen 305, the user can set the position and orientation of effector coordinate 1.
- the processor 21 causes the display device 22 to display a screen 305 of Fig. 12.
- the user can set the selected effector coordinate 2.
- the user can set a plurality of effector coordinates using the screens 304 and 305. This configuration is useful for improving the degree of freedom in setting effector constraints, which will be described later.
- the processor 21 causes the display device 22 to display screen 306 of FIG. 13.
- Screen 306 is a screen that allows the user to select the setting of any one of the multiple effector constraints.
- when the user selects, for example, effector constraint 1 from among the multiple effector constraints on screen 306, the processor 21 causes the display device 22 to display screen 307 of FIG. 13.
- Screen 307 is a screen for setting effector constraint 1 selected by the user, and the user can set the effector constraint using screen 307.
- the effector constraint is intended to restrict changes as viewed from a specified reference coordinate of the effector coordinate system 102 fixed to the effector 30.
- the user can set a reference coordinate system that serves as the basis for effector constraint 1.
- Effector constraint 2 can be set in a similar manner.
- when the reference coordinate system is always fixed, or when the reference coordinate system 101 is used, the setting of the reference coordinate system on screen 307 can be omitted.
- effector coordinates 1 are set for effector constraint 1.
- effector coordinates 2 are set for effector constraint 2.
- Effector constraints restrict changes in the position and/or posture of the effector 30 as viewed from the set effector coordinates (predetermined reference coordinates). For this reason, the configuration in which effector coordinates can be set or selected as described above, and the configuration in which the user can set effector coordinates for each effector constraint, each lead to an improvement in the degree of freedom of setting by the user.
- effector constraint elements described below, are set for each effector constraint.
- effector coordinate 1 is set diagonally upward relative to the effector coordinate system 102
- effector coordinate 2 is set at a different position horizontally relative to the effector coordinate system 102.
- effector constraint 1 is set at teaching point 3 (position and attitude [3]).
- the processor 21 operates the arm 10A so that the effector 30 moves based on the operation program 23B.
- at teaching point 3, the change in the position and attitude of the effector coordinate system 102 as viewed from effector coordinate 1 (predetermined reference coordinate) is constrained by the effector constraint element set in effector constraint 1.
- at teaching point 4, the change in the position and attitude of the effector coordinate system 102 as viewed from effector coordinate 2 (predetermined reference coordinate) is constrained by the effector constraint element set in effector constraint 2.
- the position of effector coordinate 1 for the effector 30 at teaching point 3 corresponds to the position of the effector 30 relative to effector coordinate 1 shown on screen 305 in FIG. 11.
- the position of effector coordinate 2 can also be set in a similar manner.
- a teaching point or a passing point between teaching points may be used as the predetermined reference coordinate.
- the change in position and attitude at each teaching point or each passing point of the effector 30 moving according to the operation program 23B is controlled so as to be within the range of the effector constraint element as viewed from the position and attitude of the teaching point or the passing point.
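When a teaching point or passing point serves as the predetermined reference coordinate, the quantity being bounded is the deviation of the current pose from that point. A minimal sketch of such a check follows; the function name, units, and example values are assumptions.

```python
def deviation_ok(current, reference, limits):
    """True if each component of (current - reference) lies within its
    (lower, upper) bound; a None bound leaves that axis unconstrained."""
    for cur, ref, lim in zip(current, reference, limits):
        if lim is not None and not (lim[0] <= cur - ref <= lim[1]):
            return False
    return True

ref_pose = (500.0, 0.0, 300.0)           # teaching-point position (mm, assumed)
limits = ((-20, 20), (-20, 20), (0, 0))  # Z must not change from the reference
print(deviation_ok((510.0, -5.0, 300.0), ref_pose, limits))  # True
print(deviation_ok((510.0, -5.0, 305.0), ref_pose, limits))  # False
```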
- a teaching point or a passing point between teaching points is used as the predetermined reference coordinate, it is not necessary to set the screen 305 in Fig. 11 and Fig. 12, and it is also not necessary to set the effector coordinates on the screen 307 in Fig. 13.
- the screen 307 in Fig. 13 may be configured to accept a setting that sets the effector coordinates 1 as the position and orientation of the teaching point or the passing point.
- the effector constraint element of the effector constraint can also be said to indicate the range within which changes in the position of the effector 30 are permitted.
- when the processor 21 operates the arm 10A in the above-described configuration, the actual position and orientation of the effector 30 (effector coordinate system 102) is kept within the range within which changes in the position of the effector 30 are permitted by the effector constraint.
- the target of effector constraint 1 may be a section of the operation program.
- an item "Applicable range of effector constraint" is displayed on screen 307, and the user inputs, to the right of "Applicable range of effector constraint", the teaching point numbers or the like that are the target of the effector constraint. If the input teaching point numbers are consecutive, that section becomes the target of effector constraint 1.
- the section subject to the effector constraint may be specified by describing the start/end of the effector constraint within the operation program 23B.
- an effector constraint that is always applied regardless of the operation program 23B may be set.
- an operation program 23B that is always applied may be set for each effector constraint.
- a space or a posture type of the arm 10A may be set as the "application range of the effector constraint" on the screen 307 shown in FIG. 13.
- the range of the dashed line 307A in FIG. 13 indicates the range in the X-Z direction, but a range of, for example, about several tens of centimeters in the Y direction may also be set within that range.
- a plurality of posture types of the arm 10A may be displayed on the screen 307, and the selected posture type may be input to the right of the "application range of the effector constraint".
- the effector constraint 1 is applied while the posture of the arm 10A corresponds to that posture type.
- a configuration in which the user can set a route to be subject to the effector constraint on the screen 307 may also be adopted.
- the control device 1 may automatically set effector constraints based on the effector constraints set for each teaching point of the operation program 23B and other set effector constraints. This automatically set effector constraint is also based on the effector constraints set by the user for each teaching point, and is therefore an effector constraint set based on user input.
- the arm 10A may be placed on a bar counter.
- the work may include a task in which the arm 10A holds an object 2, such as a cup, using the effector 30, which is a hand, and a task in which the held object 2 is delivered to a position at the counter corresponding to a customer.
- a visual sensor is provided to observe the working range of the arm 10A, and the control device 1 recognizes the position of the effector 30, the position of the target 2, the surrounding environment 4 in which there is movement within the space, approaching objects including the customer, etc., based on the output of the visual sensor.
- the control device 1 sequentially calculates the path along which the effector 30 moves for the work while recognizing the range of the surrounding environment 4 and the approaching objects.
- the processor 21 can apply the effector constraints set in the space when generating the path.
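A sequential planner of the kind described here can treat the effector constraint as a filter on candidate steps. The sketch below is a toy random planner under stated assumptions: a fixed seed for repeatability, and a "stay above Z = 100" predicate standing in for a real effector constraint; none of the names are from this disclosure.

```python
import random

def plan_step(current, goal, constraint_ok, tries=100, step_size=5.0):
    """Pick a random small step that moves closer to the goal and satisfies
    the effector constraint; return None if no admissible step is found."""
    random.seed(0)  # fixed seed so the sketch is repeatable
    for _ in range(tries):
        cand = tuple(c + random.uniform(-step_size, step_size) for c in current)
        closer = (sum((g - c) ** 2 for g, c in zip(goal, cand))
                  < sum((g - c) ** 2 for g, c in zip(goal, current)))
        if closer and constraint_ok(cand):
            return cand
    return None  # no admissible step toward the goal

# Assumed constraint: keep the effector above Z = 100 (e.g. a suction hand
# that must stay over the workpiece).
ok = lambda p: p[2] >= 100.0
nxt = plan_step((0.0, 0.0, 150.0), (50.0, 0.0, 150.0), ok)
print(nxt is not None and nxt[2] >= 100.0)  # True
```

Repeating `plan_step` from each new pose yields a constraint-respecting path; a production planner would use a real sampling or optimization scheme, but the filtering role of the constraint is the same.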
- on screen 307, the user can set the movable range of the effector 30 in the X, Y, and Z directions as effector constraint 1.
- Screen 307 allows the user to set a "reference".
- the "reference" is shown by coordinates in, for example, reference coordinate system 1, reference coordinate system 101, effector coordinate system 102, etc.
- Screen 307 allows the user to set an "upper limit" and a "lower limit".
- the "upper limit" and "lower limit" are, for example, the movable amount or movable range relative to the coordinates of the "reference".
- the movable ranges in the X, Y, and Z directions having the "reference", "upper limit", and "lower limit" are referred to as effector constraint elements.
- the user can set the rotational movable range, angular velocity, and angular acceleration of the effector 30 around the X, Y, and Z axes, and the velocity and acceleration in the X, Y, and Z directions as effector constraint 1.
- Values or formulas equivalent to the amount obtained by time-differentiating the range of rotational movement around the X, Y, and Z axes, velocity, acceleration, angular velocity, angular acceleration, position, or orientation three or more times are also called effector constraint elements.
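As an illustration, one effector constraint element described above — a "reference" value plus an "upper limit" and "lower limit" relative to it — could be represented as follows. This is a minimal sketch; the names `ConstraintElement` and `is_satisfied` are assumptions for illustration, not part of this disclosure.

```python
from dataclasses import dataclass

@dataclass
class ConstraintElement:
    name: str         # e.g. "X position" or "angular velocity around Z"
    reference: float  # coordinate of the "reference" in a chosen coordinate system
    lower: float      # "lower limit": movable amount below the reference
    upper: float      # "upper limit": movable amount above the reference

    def is_satisfied(self, value: float) -> bool:
        # the monitored value must stay within [reference + lower, reference + upper]
        return self.reference + self.lower <= value <= self.reference + self.upper

# a movable range of ±50 around a reference X coordinate of 100
x_range = ConstraintElement("X position", reference=100.0, lower=-50.0, upper=50.0)
print(x_range.is_satisfied(120.0))  # inside the movable range
print(x_range.is_satisfied(160.0))  # outside the movable range
```

The same structure applies to velocity, acceleration, and higher time derivatives, since each is also bounded by a reference and limits.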
- if effector coordinate 1, set as the effector coordinates on screen 307, is used as the "reference", or if the "reference" is automatically set by control device 1, input and display of the "reference" may be omitted. Also, it is not necessary to set all effector constraint elements; if some are fixed, they may be automatically set by control device 1, etc.
- the user can set the "reference" arbitrarily. Therefore, the user can set a position and posture different from the position and posture of the effector 30 set at each teaching point, and from the position and posture of the effector coordinate 1 set on screen 307, as the "reference".
- this configuration improves the user's freedom of setting and contributes to the accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- the user can set each "reference" around the X, Y, and Z axes as a neutral posture of the effector 30.
- improving the efficiency of the operation of the arm 10A includes improving the cycle time of the operation of the arm 10A, etc.
- when the user returns to screen 301 as shown in FIG. 15 and selects to transition to the effector constraint set setting screen, the processor 21 causes the display device 22 to display screen 308 of FIG. 15.
- Screen 308 is a screen that allows the user to select the setting of any one of the multiple effector constraint sets.
- when the user selects, for example, set 1 from among the multiple sets on screen 308, the processor 21 causes the display device 22 to display screen 309 of FIG. 15.
- Screen 309 is a screen for setting effector constraint set 1 selected by the user, and the user can set the effector constraint set using screen 309.
- An effector constraint set can associate multiple effector constraints.
- the user can incorporate any selected effector constraints 1 to 3 into effector constraint set 1, and can also set each of effector constraints 1 to 3 to be enabled or disabled.
- the user can also set the relationship between multiple effector constraints 1 to 3 as "1 ⁇ 2 ⁇ 3", where "1 ⁇ 2 ⁇ 3" means effector constraint 1, effector constraint 2, and effector constraint 3.
- "Effector Constraint Set 1" can be set in the "Effector Constraint” column on screen 200 in FIG. 7, instead of "Effector Constraint 1", etc.
- This configuration allows the user to improve the freedom of settings.
- This configuration also allows the user to organize and apply multiple effector constraints set on screen 307, which leads to accuracy, safety, efficiency, etc. of the movement of arm 10A.
- screens 306, 307, etc. can be used to set each effector constraint and each effector constraint element to be enabled or disabled. It is also possible to omit the settings on screen 309 as necessary.
- the processor 21 uses the path generation program 23D to create a path for moving the position and orientation of the effector coordinate system 102 from the previous teaching point to the target teaching point based on the operation program 23B, etc. For example, the processor 21 creates the path while performing a well-known interpolation calculation between the previous teaching point and the target teaching point.
- the processor 21 performs the path creation while also applying the effector constraint.
- in this description, path creation may also be referred to as path generation.
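The interpolation-based path creation between teaching points described above can be sketched as follows — a straight-line interpolation with a speed limit applied as an effector constraint. The function name, the linear interpolation, and the step logic are illustrative assumptions; the actual path generation program 23D is not disclosed at this level of detail.

```python
import math

def interpolate_path(p_start, p_goal, max_speed, dt=0.01):
    """Move in straight-line steps of at most max_speed*dt (a speed constraint)."""
    path = [tuple(p_start)]
    pos = list(p_start)
    step = max_speed * dt
    while math.dist(pos, p_goal) > step:
        d = math.dist(pos, p_goal)
        # advance one constrained step along the direction to the goal
        pos = [p + step * (g - p) / d for p, g in zip(pos, p_goal)]
        path.append(tuple(pos))
    path.append(tuple(p_goal))
    return path

# path from the previous teaching point to the target teaching point
path = interpolate_path((0.0, 0.0, 0.0), (0.1, 0.0, 0.0), max_speed=0.5)
```

In practice, well-known interpolation methods (linear, circular, joint-space) would be combined with all active effector constraint elements, not only a speed limit.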
- the processor 21 transmits a control command according to the created path to each servo controller 24 .
- the processor 21 performs similar processing even when the robot 10 is a collaborative robot.
- the processor 21 may generate an avoidance path for avoiding an avoidance target.
- any state may be set within the effector constraints.
- that state may be set as the neutral state.
- the path generation may result in the effector 30 remaining tilted.
- the processor 21 may, for example, bring the final attitude of the effector 30 closer to or back to 0 deg.
- the position and posture of the effector 30 at the time of setting each teaching point may be set as a neutral state.
- the user places the effector 30 in a first position and posture by hand guide operation, and performs an operation for setting a teaching point at the input unit 26, for example.
- the first position and posture are set for teaching point 1 on the screen 200, for example.
- the user can set teaching point 2 and subsequent points in the same manner.
- the user may place the effector 30 in an actual position and posture according to the intended image of the arm 10A during operation. For this reason, a configuration in which the above-mentioned first position and posture are set as a neutral state at each teaching point is useful for reducing the user's effort while achieving accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- the processor 21 controls the arm 10A to perform restoring operation control to return the position and posture of the effector 30 to a neutral state.
- the restoring operation control is performed using at least one of values calculated according to, for example, a constant velocity or angular velocity, a constant acceleration or angular acceleration, the amount of deviation from the neutral state, etc.
- a spring-like variable that acts like a spring according to the amount of deviation may be used to perform the restoring operation control.
- a damper-like variable that acts like a damper according to the rate of change or angular velocity of change of the amount of deviation may be used to perform the restoring operation control.
- An inertial variable that acts like an inertial force according to the acceleration of change or angular acceleration of change of the amount of deviation may be used to perform the restoring operation control. A combination of these variables may also be used.
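The restoring operation control above — combining spring-like, damper-like, and inertia-like variables acting on the deviation from the neutral state — can be sketched as a simple control law. The coefficient names (`k`, `c`, `m`) and their values are illustrative assumptions, not values from this disclosure.

```python
def restoring_command(deviation, deviation_rate, deviation_accel,
                      k=10.0, c=2.0, m=0.5):
    """Restoring command toward the neutral state (deviation = 0)."""
    spring = -k * deviation          # spring-like: proportional to the deviation
    damper = -c * deviation_rate     # damper-like: proportional to its rate of change
    inertia = -m * deviation_accel   # inertia-like: proportional to its acceleration
    return spring + damper + inertia

# e.g. effector tilted 0.2 rad from neutral, still tilting at 0.1 rad/s
cmd = restoring_command(0.2, 0.1, 0.0)
print(cmd)  # negative: pushes the posture back toward the neutral state
```

Any subset of the three terms, or a constant-velocity/constant-acceleration return, could be used instead, as the text notes.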
- the object 2 is carried on a simple tray-shaped effector 30. Since the effector 30 is in a tray shape, the object 2 may fall due to the effector 30 being tilted or moving at an inappropriate speed.
- the position of the effector coordinate 1 is set slightly above the center of gravity of the target 2 by the screens 305 and 307, and constraints on the attitude, angular velocity, and angular acceleration are set.
- based on this setting, the processor 21 generates a path of the effector coordinate system 102 (effector 30) from one position and posture to another. At this time, the effector 30 carrying the target 2 tends to move like a pendulum around the neutral-state position and posture set by the effector constraint. This limits large tilts and accelerations at the position of the target 2; furthermore, the centrifugal force generated by the pendulum movement presses the target 2 against the effector 30, which helps prevent the target 2 from falling.
- the user can set the effector constraint element to a certain value that corresponds to the allowable range of acceleration in the direction corresponding to the vertical direction of the effector 30 and the direction corresponding to the centrifugal force. Also, the user can set the allowable range of acceleration in other directions to a sufficiently small value, such as 1/5 or less of the above value. In this case, the effector 30 also tends to move like a pendulum.
- the posture constraint in the effector constraint is not limited to Euler angle notation, and quaternion notation, etc. may also be used. Furthermore, the constraint does not need to be a scalar value, and may be set as a function.
- the effector constraint may be set to switch depending on the position, posture, etc. of the arm 10A. The effector constraint may be set to switch depending on the state of the arm 10A (whether or not it is holding the target 2, etc.).
- a configuration is adopted in which the original teaching position or the effector constraint can be selected in the operation program 23B.
- a "constraint priority" column is added to the screen 200 in FIG. 7 for setting whether the effector constraint is to be prioritized over the teaching point designation of the operation program 23B for each teaching point or each section of the route.
- the user can easily and reliably set which of the operation program 23B and the effector constraint is to be prioritized.
- whether the position and orientation (X, Y, Z, θx, θy, θz) of the effector 30 is constrained by the teaching position and teaching orientation of the operation program 23B or by the effector constraint is not limited to the above example.
- the above configuration leads to a reduction in the number of constraints set at each teaching point. Also, the above configuration realizes the operation of the arm 10A that can keep the position and posture of the effector 30 in an appropriate state by having effector constraints, which can lead to the creation and selection of a path that can improve the cycle time.
- multiple effector constraints can be set, but a configuration in which only one effector constraint can be set may also be adopted.
- the function of the effector constraint is realized by providing one set consisting of a reference coordinate system, effector coordinates, and effector constraint elements, but it may be difficult to express various functions using one effector constraint. Therefore, as shown in screens 306 and 307 in FIG. 13, a configuration in which multiple effector constraints can be set may also be adopted. Also, a configuration in which multiple effector constraints can be set so that they can be applied to each target section, range, teaching point, etc. may also be adopted.
- an effector constraint set is set.
- the user sets effector constraint 1 as the first effector constraint using screens 305, 306, and 307.
- the user sets reference coordinate system 1 at a position that does not move in space, and sets effector coordinates 1 above the center of gravity of the effector.
- Effector constraint 1 sets constraints to allow translational and rotational motion of effector 30.
- Effector constraint 1 also sets constraints on angular velocity and angular acceleration. If the user selects the corresponding tag on screen 307, it becomes possible to set angular velocity, angular acceleration, etc.
- the user sets effector constraint 2 as the second effector constraint using screens 305, 306, and 307. At that time, the user sets the position and orientation of effector coordinate 2 as the position and orientation of reference coordinate system 2, and sets effector coordinate 2 below the center of gravity of the effector. Translation and rotation are not allowed in effector constraint 2.
- the user sets effector constraint 3 as the third effector constraint using screens 305, 306, and 307. In doing so, the user constrains the position and orientation of effector coordinates 2 with respect to reference coordinate system 1. Effector constraint 3 is set to allow translational and rotational motion. Effector constraint 3 also constrains the translational speed and acceleration.
- the following example describes another example of setting an effector constraint set.
- the user sets effector constraint 1 as the first effector constraint using screens 305, 306, and 307.
- the user sets reference coordinate system 1 at a position that does not move in space.
- the user also sets effector coordinates 1 on the rotation axis J3 of joint 3C shown in FIG. 1, and sets effector constraint 1.
- Effector constraint elements are set in effector constraint 1 to allow translational and rotational motion.
- angular velocity and angular acceleration are restricted in effector constraint 1.
- the user sets effector constraint 2 as the second constraint using screens 305, 306, and 307. At that time, the user sets effector coordinate 1 as reference coordinate system 2, and sets effector coordinate 2 below the center of gravity of the effector. In effector constraint 2, effector constraint elements are set so that translational and rotational movements are permitted.
- the user sets effector constraint 3 as the third effector constraint using screens 305, 306, and 307. In doing so, the user constrains effector coordinates 2 with respect to reference coordinate system 1.
- effector constraint elements are set so that translational motion is permitted.
- effector constraint elements are set so as to constrain the translational speed and acceleration.
- in a normal robot, when attempting to move joint 3B in Figure 1 around its rotation axis J2, joint 3C also moves symmetrically around rotation axis J3, and may move so as to maintain the posture of the wrist axis. On the other hand, when attempting to move around rotation axis J3 alone, this action often does not occur. With conventional settings, it is difficult to perform movement around rotation axis J3 while keeping the posture of the wrist and of the movable part 12 (J2 arm) between joints 3B and 3C unchanged.
- a set of the reference coordinate system, effector coordinates, and effector constraint may be referred to as one unit of effector constraint.
- an effector constraint is a collection of individual constraints such as position, speed, and acceleration, and each individual constraint is referred to as an effector constraint element.
- multiple effector constraints may be prepared, and the processor 21 calls and uses the required effector constraint from the memory unit 23.
- Multiple effector constraint sets may be prepared according to various states of the arm 10A.
- the state of the arm 10A differs depending on the type of effector 30, the type of target 2, the type of arm 10A, etc.
- An effector constraint set is a combination of multiple effector constraints.
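The hierarchy described above — a "unit" of effector constraint grouping a reference coordinate system, effector coordinates, and constraint elements, with an effector constraint set combining several such units — could be modeled as below. All class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class EffectorConstraint:
    """One unit: reference coordinate system + effector coordinates + elements."""
    reference_frame: str                          # e.g. "reference coordinate system 1"
    effector_coords: str                          # e.g. "effector coordinate 1"
    elements: dict = field(default_factory=dict)  # element name -> (lower, upper)

@dataclass
class EffectorConstraintSet:
    """Combination of multiple effector constraints with enable/disable flags."""
    constraints: dict = field(default_factory=dict)  # name -> EffectorConstraint
    enabled: dict = field(default_factory=dict)      # name -> bool

    def active(self):
        # only enabled constraints are applied during path generation
        return [c for n, c in self.constraints.items() if self.enabled.get(n)]

cs = EffectorConstraintSet()
cs.constraints["constraint 1"] = EffectorConstraint("ref frame 1", "coords 1")
cs.enabled["constraint 1"] = True
cs.constraints["constraint 2"] = EffectorConstraint("ref frame 2", "coords 2")
cs.enabled["constraint 2"] = False
print(len(cs.active()))  # only the enabled constraint remains
```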
- the user only needs to use the prepared effector constraint set. This configuration reduces the effort required for the user to make settings, and also leads to accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- by setting effector constraints, it becomes possible to generate a path that takes into account the properties of the effector 30, target 2, etc., but it is difficult to accurately reflect those properties in the effector constraints.
- the user can determine the effector constraints using calculations, etc., but differences in experience between users will result in variations in the accuracy of the effector constraints. In such situations, trial and error is required to input the effector constraints.
- there is also a possibility that settings for constraints that are actually necessary will be omitted, leading to unintended malfunctions.
- the effector constraint includes a plurality of effector constraint elements, and a priority can be set for at least one of the plurality of effector constraint elements as shown in a screen 307 in FIG. 13.
- the screen 307 has a "priority" column, and a priority can be set to correspond to each effector constraint element.
- an "absolute" priority is set for the "upper limit" and "lower limit" of the angle around the X-axis, which are effector constraint elements.
- the "absolute" priority can be said to be a must-have setting that must be used by the processor 21, for example.
- priorities are also set for the other effector constraint elements, and "absolute", "high", and "low" are set in descending order of priority.
- the robot 10 can operate under conditions where it is not necessary to observe any of the X, Y, and Z rotational position constraints among the effector constraints, for example, and the number of options for paths that the processor 21 can set increases.
- the processor 21 can select a more effective path that can improve cycle time, etc.
- the effector constraints have priorities, such as constraints that must be observed and constraints that do not necessarily have to be observed.
- the processor 21 may be configured not to observe constraints with a low priority based on preset criteria. To realize this configuration, a priority is set for each effector constraint and each effector constraint element, and the priority is stored in the memory unit 23.
- the presets can be prepared so that there are differences in the priority of effector constraint elements.
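The priority-based relaxation described above — always observing "absolute" elements while permitting low-priority elements to be dropped to widen the planner's options — can be sketched as a filter. The priority labels follow the screen in FIG. 13; the function name and ordering dictionary are assumptions.

```python
# descending order of priority, as on screen 307
PRIORITY_ORDER = {"absolute": 2, "high": 1, "low": 0}

def constraints_to_observe(elements, min_priority="low"):
    """Return the names of elements whose priority is at least min_priority."""
    threshold = PRIORITY_ORDER[min_priority]
    return [name for name, prio in elements if PRIORITY_ORDER[prio] >= threshold]

elements = [("angle around X", "absolute"),
            ("angle around Y", "high"),
            ("Z position", "low")]

print(constraints_to_observe(elements, "high"))
# relaxing "low" elements leaves only the "absolute" and "high" ones to observe
```

A planner could first attempt a path under all elements, then progressively raise `min_priority` when no feasible path exists, as long as "absolute" elements are never dropped.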
- Effector constraints include constraints that are intentionally set by the user and constraints that are not intentionally set.
- constraints that are intentionally set by the user may be called designated constraints.
- constraints that are not intentionally set and can be optimized may be called dependent constraints.
- Information indicating whether a constraint is designated or dependent may be stored in the memory unit 23 together with each effector constraint.
- the control device 1 accepts a setting for each effector constraint element as a constraint element that causes the processor 21 to use a value designated by the user, or a setting for a constraint element that allows changes by the processor 21, and the accepted setting is stored in the memory unit 23.
- the settings are indicated by "designated" and "dependent" in Figures 13, 19, and 23.
- when the user uses a preset effector constraint, it is desirable to initially set the effector constraint as a dependent constraint, since its details are not set by the user. If the user edits a preset effector constraint, the effector constraint becomes a designated constraint. The user can later change whether the effector constraint is a designated constraint or a dependent constraint.
- the distinction between designated constraints and dependent constraints may be set for each effector constraint or each effector constraint element, or may be set collectively for each effector constraint set.
- the intent of the constraints can be made clearer by providing a priority and a distinction between designated and dependent constraints.
- a preset automatic setting program 23F that automatically sets effector constraints and/or effector constraint elements is stored in the storage unit 23.
- the preset automatic setting program 23F automatically sets effector constraints and/or effector constraint elements based on information on the effector 30 and the target 2 that the user can objectively obtain, and on functions and performance (functional requirements) that the user subjectively expects.
- the functional requirements may be expressed qualitatively, such as "I don't want it to shake," "I don't want it to tip over," "I don't want it to be dropped," "I don't want it to be tilted," or "I don't want it to move from its place," for example, with respect to the object 2.
- these functional requirements can be expressed as effector constraint elements, and therefore presets of the effector constraint elements corresponding to the functional requirements are stored in the storage unit 23 in advance.
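The correspondence between qualitative functional requirements and preset effector constraint elements could look like the mapping below. The concrete limits and the mapping itself are invented placeholders for illustration; the actual presets stored in the storage unit 23 are not disclosed.

```python
# hypothetical presets: requirement -> {constraint element: (lower, upper)}
REQUIREMENT_PRESETS = {
    "don't want it to tip over": {"tilt angle": (-5.0, 5.0)},            # deg
    "don't want it to shake":    {"acceleration": (-0.5, 0.5)},          # m/s^2
    "don't want it dropped":     {"vertical acceleration": (-1.0, 1.0)}, # m/s^2
}

def constraints_for(requirements):
    """Merge the preset constraint elements for the selected requirements."""
    merged = {}
    for req in requirements:
        merged.update(REQUIREMENT_PRESETS[req])
    return merged

print(constraints_for(["don't want it to tip over", "don't want it to shake"]))
```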
- for the type of combination of the effector 30 and the target 2, the user can select from multiple types of presets.
- Presets include a type placed on a tray, a type placed in a container, a type put into a box, a type grabbed by a hand, a type held by suction, etc.
- Presets also include a type that processes the target with a welding gun, a type that processes the target with a welding torch, a type that processes the target with various tools, etc.
- This configuration does not limit the type of effector 30, and the presets are intended to assist in information input. Effectors that do not fit into the presets can also be used.
- if the shapes of the effector 30 and target 2 (e.g., 3D CAD models), together with the center of gravity and weight of target 2, the center of gravity and weight of the effector, the movable parts of effector 30, etc., are used, a more accurate physical model can be created. It is desirable to add to the physical model the parameters necessary to explain physical behavior, such as a spring constant indicating the hardness of the material, a damping coefficient that dampens vibrations, and a friction coefficient for objects rubbing against each other. With a physical model, it becomes possible to reproduce in simulation physical behavior such as grabbing with a hand or the target 2 falling.
- the physical model used in this embodiment is for carrying out a physical simulation. Since various settings of the physical model require a lot of work, it is desirable for the model to be constructed from information that is easily available to the user.
- the approximate arrangement of the effector 30 and target 2 is determined by selecting a preset of the type of combination of the effector 30 and target 2. Once the arrangement is determined, an approximate physical model can be generated simply by adding the shape, center of gravity, weight, and the like of the characteristic parts of the effector 30 and target 2.
- the control device 1 stores in the memory unit 23 information on the type, shape, etc. of the effector 30 and the target 2, information on functional requirements, and information on effector constraint elements appropriate for realizing the functional requirements, in a mutually associated state.
- the processor 21 sets effector constraint elements based on the above information, functional requirements input by the user, information on the physical model, etc., and presents them to the user.
- a screen for setting using the presets is displayed on the display device 22 of the input unit 26 .
- the processor 21 of the control device 1 causes the display device 22 to display a screen 401 shown in Fig. 16.
- the screen 401 may be displayed instead of the screen 301.
- the screen 401 is a screen for the user to select a transition to a setting screen for effector information.
- the processor 21 causes the display device 22 to display a screen 402 of Fig. 16.
- the screen 402 is a screen for the user to select any one of a plurality of effector type settings.
- the processor 21 causes the display device 22 to display screen 403 of FIG. 16.
- Screen 403 is a screen for setting effector type 1 selected by the user. As shown on screen 403, the user can set the effector type by selection.
- Screen 404 is a screen for setting the dimensions and positions, such as the center of gravity, of the selected effector type.
- screen 404 is configured so that the weight, material, etc. of the selected effector type can also be set.
- the processor 21 causes the display device 22 to display screen 405 of FIG. 17.
- Screen 405 is a screen that allows the user to select any one of multiple target type settings.
- Screen 406 is a screen for setting the target type 1 selected by the user. As shown on screen 406, the user can set the target type by selection.
- Screen 407 is a screen for setting the dimensions and position of the selected target type, such as the center of gravity.
- screen 407 is configured so that the weight, material, etc. of the selected target type can also be set.
- screen 407 may also be configured so that the position of the selected target type relative to the selected effector type can also be set.
- the processor 21 causes the display device 22 to display screen 408 of FIG. 18.
- Screen 408 is a screen for setting the positional relationship of the selected target type to the selected effector type.
- the processor 21 causes the display device 22 to display screen 409 of FIG. 18.
- Screen 409 is a screen for setting the positional relationship 1 selected by the user. As shown on screen 409, the user can set the positional relationship by inputting numerical values and moving the displayed effector diagram and/or target diagram.
- the processor 21 causes the display device 22 to display screen 410 of FIG. 19.
- Screen 410 is a screen for selecting an effector type, a target type, a target positional relationship, etc.
- the effector type information may be automatically set based on input information (input) from an external device.
- for example, when the effector 30 is connected to the control device 1, a signal may be sent from the effector 30 to the control device 1, and the processor 21 may set the effector type based on the input signal (input).
- the target type and target positional relationship may be automatically set in the same manner.
- Screen 410 is a screen for selecting a transition to a function request (request) setting screen and for displaying the set function request.
- processor 21 causes display device 22 to display screen 411 of FIG. 19.
- Screen 411 is a screen for the user to select a function request.
- Screen 411 displays "enabled" in the position corresponding to each function request, indicating that it has been set.
- a function request (request) is, for example, a user request regarding an operation to be performed by effector 30 on target 2.
- the effector constraints are set by the settings on screens 410 and 411.
- the effector constraints include, for example, the same settings as those on screen 307. Therefore, the processor 21 can control the arm 10A using the effector constraints that have been set.
- the processor 21 causes the display device 22 to display screen 412 of FIG. 19.
- Screen 412 displays the contents of the effector constraints that have been set, and accepts changes to each setting of the effector constraints.
- Screen 412 is configured to accept user input for registering the effector constraint, whose settings have been changed, as one of the presets.
- the memory unit 23 stores a plurality of effector constraints. Furthermore, the memory unit 23 stores a plurality of effector constraints so that they correspond to a plurality of combinations of the effector type, which is the type of the effector 30, and the target type, which is the type of the target 2.
- the processor 21 sets the corresponding effector constraint. This configuration reduces the effort required for the user to make settings, and also leads to accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- the effector constraint is set based only on the effector type setting.
- the effector constraint is set based only on the target type setting.
- the processor 21 sets the effector constraint based on at least one of the information on the effector type and the information on the target type, and the input for the setting by the user.
- the processor 21 sets the effector constraint based at least on the information on the effector type and the input from the external device.
- effector constraints are set based on requests input by the user. This configuration is useful for achieving a high level of both reducing the effort required for users to make settings and improving the accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- the effector constraints are set according to the input values of the user, and preset effector constraints are set based on the functional requirements input by the user.
- the set effector constraints do not necessarily function normally as expected by the user. There is a possibility that important settings may be omitted, unnecessary settings may be present, fine adjustments of effector constraint elements may be insufficient, and the path may not be as expected by the user.
- the most reliable method is to check the movement path using the actual robot 10. However, if there are any imperfections in the settings, the act of checking itself poses a risk. Therefore, checking whether the effector constraints are appropriate through simulation reduces the risk.
- to perform a simulation, the user must input the movement pattern (movement program 23B) of the arm 10A.
- a set of various movement patterns of the arm 10A is preferably prepared in advance as presets. The user normally selects any of the movement patterns from the presets, and in exceptional individual cases, the user creates or modifies a supplementary movement pattern by inputting it.
- 3D models of the surrounding environment 4, approaching objects including people and items carried by people, robot 10, effector 30, target 2, etc. are reproduced on the simulator, and operations such as automatic driving, jogging, and hand guide operation are simulated.
- the simulation is preferably a physical simulation that can reproduce the falling of target 2, etc. For example, already created physical models of effector 30 and target 2 are utilized.
- the simulation can calculate the acceleration of the effector 30 and the target 2, which cannot normally be monitored in reality.
- Simulation tolerances are set as permissible thresholds for the position, attitude, speed, acceleration, angular speed, angular acceleration, etc. of the effector 30 and the target 2.
- the simulation can check whether the operation of the effector 30 falls within the simulation tolerances. If a simulation tolerance corresponding to the functional requirements has been prepared in advance, that tolerance may be used. Alternatively, values, settings, etc. used as the simulation tolerances may be selected from the effector constraint set.
- the simulation can determine whether or not the functional requirements are met under any conditions assumed by the user. It is preferable that the processor 21 displays the state of operation in the simulation on the display device 22, etc.
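The tolerance check described above — verifying over the whole simulated motion that each monitored quantity of the effector 30 and target 2 stays within its permissible threshold — can be sketched as follows. The sample format, quantity names, and numbers are illustrative assumptions.

```python
def within_tolerances(trajectory, tolerances):
    """trajectory: list of {quantity: value} samples; tolerances: quantity -> limit.
    Returns (sample index, quantity, value) for every tolerance violation."""
    violations = []
    for i, sample in enumerate(trajectory):
        for quantity, limit in tolerances.items():
            if abs(sample.get(quantity, 0.0)) > limit:
                violations.append((i, quantity, sample[quantity]))
    return violations

# two simulated samples; the second exceeds the acceleration tolerance
trajectory = [{"acceleration": 0.3, "tilt": 2.0},
              {"acceleration": 0.9, "tilt": 1.0}]
tolerances = {"acceleration": 0.5, "tilt": 5.0}
print(within_tolerances(trajectory, tolerances))
```

An empty result would indicate that the operation satisfies the simulation tolerances and, by extension, the functional requirements they encode.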
- the processor 21 may modify, improve, or optimize the effector constraints as described below, based on the constraint modification program 23G. This configuration is useful for reducing the user's workload while also achieving accuracy, safety, efficiency, etc. of the operation of the arm 10A.
- for example, the fine-tuning of effector constraint elements by the user described above is trial-and-error based, which places a large burden on the user. If priority, importance, etc. are set when setting effector constraint elements, the elements with low importance and low priority are the ones likely to be changed; these become the effector constraint elements to be adjusted.
- a constraint modification program 23G that modifies the effector constraint set based on the results of the simulation is stored in the memory unit 23.
- the requirement for the simulation tolerance may be used as an indicator for judging whether the effector constraint set is good or bad.
- the cycle time may also be an indicator for judging whether the effector constraint set is good or bad.
- the above-mentioned indicators for judging whether the effector constraint set is good or bad are merely examples and are not limited to these.
- the effector constraint set index may be set as an indicator for judging whether the effector constraint set is good or bad.
- the effector constraint set with the maximum (or minimum) effector constraint set index is the best effector constraint set.
- as a method for modifying an effector constraint set using simulation, the following method is considered. First, a general genetic algorithm can be applied. After performing a simulation, an effector constraint set index is calculated. Based on the result of the simulation, an alternative plan for the effector constraint elements to be adjusted is created. A number of alternative plans may be created at once.
- a simulation is performed again using the effector constraint elements of the alternative, and an effector constraint set index is calculated. Further alternatives are generated based on the effector constraint set with an improved effector constraint set index. The number of alternatives generated can be changed depending on the degree of improvement in the effector constraint set index.
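The modification loop above — evaluate an effector constraint set index, generate alternatives by perturbing the adjustable elements, and keep alternatives with an improved index — can be sketched as a highly simplified skeleton of the genetic-algorithm idea. The evaluation function, parameter names, and numbers are invented placeholders; this is not the actual constraint modification program 23G.

```python
import random

random.seed(0)  # deterministic for the sketch

def index_of(constraint_set):
    # placeholder for the effector constraint set index computed from a
    # simulation run; here we pretend the ideal value of the single
    # adjustable element is 1.0 (an assumption for illustration)
    return -abs(constraint_set["adjustable element"] - 1.0)

def improve(constraint_set, generations=30, n_alternatives=5):
    best, best_score = dict(constraint_set), index_of(constraint_set)
    for _ in range(generations):
        for _ in range(n_alternatives):
            alt = dict(best)
            # perturb an adjustable (low-priority) element to form an alternative
            alt["adjustable element"] += random.uniform(-0.2, 0.2)
            score = index_of(alt)      # re-run the "simulation" on the alternative
            if score > best_score:     # keep sets with an improved index
                best, best_score = alt, score
    return best

result = improve({"adjustable element": 0.0})
# the adjustable element moves toward the value with the best index
```

A real implementation would also use crossover between alternatives and vary the number of alternatives with the degree of improvement, as the text notes.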
- the above simulation and the improvement or optimization of the effector constraints based on the results of the simulation may be performed by the processor 21 of the control device 1 or by another computer.
- the other computer has a processor, display device, memory unit, input unit, etc. similar to those of the control device 1.
- the memory unit of the other computer stores programs, data, information, etc. similar to those of the memory unit 23.
- the memory unit of the other computer also stores a simulation program and models of the surrounding environment 4, robot 10, effector 30, target 2, etc.
- Effector constraints improved or optimized by another computer may be input to the control device 1, and when the input is received, the processor 21 of the control device 1 may set the input effector constraints in the operation program 23B, etc. In this case, based on the input from the computer as an external device, the processor 21 causes the arm 10A to perform an operation constrained by the effector constraints.
- For example, a screen for simulating effector constraints is displayed on the display device 22 of the input unit 26.
- When the user selects a transition to the effector constraint simulation screen on screen 401 shown in Fig. 20, the processor 21 causes the display device 22 to display screen 421 of Fig. 21.
- Screen 421 is a screen for the user to select an arbitrary simulation condition setting from among a plurality of simulation condition settings.
- When the user selects the setting of simulation condition 1 on screen 421, the processor 21 causes the display device 22 to display screen 422 of Fig. 21. Screen 422 is a screen for making various settings for the simulation. When the user selects the simulation setting on screen 422, the processor 21 causes the display device 22 to display screen 423 of Fig. 21. Screen 423 is a screen for setting the evaluation items to be evaluated in the simulation and the conditions for each evaluation item, including the setting of the simulation tolerances.
- After configuring the settings on screens 422 and 423, the user performs an operation on screen 421 to execute a simulation. The processor 21 then displays the simulation execution screen 424 of Fig. 22 and also displays the results of the configured evaluation items on screens 425 and 426 of Fig. 22.
- the processor 21 may also evaluate whether the operation of the effector 30 is within the simulation tolerance. When the operation of the effector 30 is not within the simulation tolerance, the processor 21 may display screen 427 of FIG. 23. When the operation of the effector 30 is not within the simulation tolerance, the processor 21 may determine or estimate the effector constraint element that is causing this, and display the effector constraint element to the user, as in screen 427. In screen 427, the color of the effector constraint element determined to be the cause is changed.
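The tolerance check behind screens 425 to 427 could look like the following sketch. The evaluation item names and tolerance values are hypothetical, and the real processor 21 presumably evaluates full simulated trajectories rather than scalar summaries.

```python
# Hypothetical sketch: compare the simulated result for each evaluation item
# against its simulation tolerance and collect the items judged to be the
# cause, as screen 427 highlights the offending effector constraint elements.

def out_of_tolerance(results, tolerances):
    """Return evaluation items whose simulated value exceeds its tolerance."""
    return [item for item, value in results.items()
            if abs(value) > tolerances.get(item, float("inf"))]

simulated = {"pos_error_mm": 0.4, "tilt_deg": 3.2, "accel_mps2": 1.1}
tolerance = {"pos_error_mm": 0.5, "tilt_deg": 2.0, "accel_mps2": 1.5}

causes = out_of_tolerance(simulated, tolerance)
print(causes)  # → ['tilt_deg']: the item to highlight for the user
```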
- Based on the constraint modification program 23G, the processor 21 can improve or optimize the effector constraints using the results of the simulation. For example, when "optimize settings" is selected on screen 401, the effector constraints are improved or optimized.
- For example, assume that each effector constraint element on screen 307 of Fig. 13 is set to "designated", that is, each is a designated constraint (user-ordered).
- Also assume that some of the effector constraint elements on the acceleration/angular acceleration tabs of screen 307 in Fig. 13 are the cause, as shown in Fig. 23, and that "designated" is not set for them, that is, they are dependent constraints (optimizable).
- For example, the processor 21 performs the improvement or optimization by changing the effector constraint elements that are determined to be the cause and for which "designated" is not set.
- the user can instruct the processor 21 to perform the improvement or optimization while recognizing the constraint elements that are not automatically changed.
- This configuration leads to easier setting by the user, and also leads to accuracy, safety, efficiency, etc. of the operation of the arm 10A.
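The distinction above between designated and dependent constraints can be sketched as a simple filter; this is an assumption about how an optimizer might select changeable elements, and the element names are hypothetical.

```python
# Illustrative sketch (assumption): the improvement step may only modify
# effector constraint elements that are dependent constraints (optimizable),
# leaving user-designated (user-ordered) elements untouched.

def adjustable_elements(constraint_set):
    """Return names of elements the optimizer is allowed to change:
    those not marked as designated (user-specified) constraints."""
    return [name for name, elem in constraint_set.items()
            if not elem.get("designated", False)]

constraint_set = {
    "angle_x":   {"value": 10.0, "designated": True},   # user-specified: fixed
    "accel_z":   {"value": 2.0,  "designated": False},  # dependent: optimizable
    "ang_accel": {"value": 0.5},                        # unmarked: optimizable
}

print(adjustable_elements(constraint_set))  # → ['accel_z', 'ang_accel']
```

Presenting this split to the user, as screen 307 does, lets them see in advance which elements automatic optimization may touch.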
- the processor 21 controls the arm 10A so that it is within the range of the effector constraint even during a jog operation, and the arm 10A operates in this manner. This reduces or prevents the effector 30 from being placed in an unintended position due to an operational error, etc. In addition, since the operation of the effector 30 is restricted by the effector constraint, the number of steps required for confirmation by the user during a jog operation is reduced. Furthermore, when the effector constraint is set to the neutral state described above, the processor 21 operates the arm 10A during a jog operation so as to bring the posture of the effector 30 closer to the neutral posture. With this configuration, the effector 30 is maintained in a state close to a desired posture without the user having to perform any special operation on the directional keys, joystick, etc.
- There is also a hand guide operation, in which the user holds the tip of the arm 10A and applies an external force to the tip to move the arm 10A.
- the direction and magnitude of the external force are detected by a sensor, and the processor 21 moves the arm 10A in the direction of the external force according to the detection result of the sensor.
- In the hand guide operation, if the user applies an external force in the wrong direction, this can likewise result in an operation error.
- a 3D model of the robot 10, a 3D model of the effector 30, and a 3D model of the target 2, which is a workpiece, are stored in the storage unit 23.
- In article handling and the like, the target 2 is not always grasped; it may be integrated with the surrounding environment 4, and in particular it may be moving on a conveyor or grasped by another robot system. For this reason, it is desirable to distinguish between the state in which the target 2 moves together with the effector 30 (a target on the effector side) and the state in which it moves together with the surrounding environment 4 (a target on the surrounding environment 4 side).
- a 3D model corresponding to the surrounding environment 4 is also stored in the memory unit 23, and this 3D model is also used in the interference calculation.
- For example, the processor 21 calculates the distances between the models of the robot 10, effector 30, target 2, and surrounding environment 4 based on the interference calculation program 23H and on robot control commands such as those of the operation program 23B. In principle, the processor 21 safely stops the arm 10A if the result of the interference calculation falls below the allowable approach distance.
- the processor 21 may be able to avoid interference by operating the arm 10A within the range of the effector constraints. For example, during jog or hand guide operations, interference may be likely to occur between the 3D model of the effector 30 and a 3D model corresponding to the surrounding environment 4. At this time, the processor 21 can move the arm 10A within the range of the effector constraints to avoid contact with the surrounding environment 4. If there were no effector constraints, the processor 21 would in principle stop the arm 10A, but being able to move as described above means that the jog or hand guide operation by the user is not interrupted. This configuration enables efficient and flexible jog and hand guide operations.
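The decision logic described above can be sketched as follows. This is a simplified sketch under stated assumptions: the models are approximated as spheres, and the clearance threshold and geometry values are placeholders, not values from the publication.

```python
import math

def clearance(center_a, radius_a, center_b, radius_b):
    """Distance between two sphere-approximated models (negative = overlap)."""
    return math.dist(center_a, center_b) - (radius_a + radius_b)

def interference_action(clear, allowable=0.05, can_avoid_within_constraints=False):
    """In principle the arm is stopped below the allowable approach distance;
    if a motion within the effector constraints can avoid contact, the jog or
    hand guide operation continues instead of being interrupted."""
    if clear >= allowable:
        return "continue"
    return "avoid" if can_avoid_within_constraints else "stop"

far = clearance((0.0, 0.0, 0.0), 0.1, (0.30, 0.0, 0.0), 0.1)
near = clearance((0.0, 0.0, 0.0), 0.1, (0.21, 0.0, 0.0), 0.1)
print(interference_action(far))                                      # continue
print(interference_action(near))                                     # stop
print(interference_action(near, can_avoid_within_constraints=True))  # avoid
```

The third call illustrates the point of the passage: with effector constraints available, a near-contact state can yield avoidance motion rather than an unconditional stop.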
- the memory unit 23 stores effector constraints, which are constraints on changes in the position and posture of the effector 30 as viewed from a specified reference coordinate.
- the processor 21 also causes the robot 10 to perform operations constrained by the effector constraints set based on input from the user or an external device. This leads to accuracy, safety, efficiency, etc. of the robot 10's operations. For example, it becomes easier or more certain to set (avoid) postures that should be avoided depending on the type of effector 30 or target 2. It may also be possible to reduce or facilitate the effort required for the teaching or setting work described above. It may also lead to the creation or selection of an avoidance path that can improve cycle time while realizing operations of the arm 10A that can maintain the position and posture of the effector 30 in an appropriate state.
- the control device 1 also includes an input unit 26 that allows the user to input effector constraint elements of the effector constraint. This configuration is useful for setting appropriate effector constraints for a wide variety of effectors 30 and a wide variety of tasks.
- The effector constraint can include, as effector constraint elements, at least one of a velocity constraint, an acceleration constraint, an angular velocity constraint, and an angular acceleration constraint, each as viewed from the predetermined reference coordinate of the effector 30.
- This configuration is useful for setting appropriate effector constraints for a wide variety of effectors 30 and a wide variety of tasks.
- by setting these effector constraint elements it may be possible to facilitate the setting of the operation settings or operation constraints of the arm 10A, for example, when setting a large number of teaching points for a complex task on the arm 10A.
- [Appendix 1] A control device comprising: a processor; and a storage unit that stores an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the processor causes the robot to perform an operation constrained by the effector constraint set based on input from a user or an external device.
- [Appendix 2] The control device according to appendix 1, wherein the storage unit is capable of storing a plurality of the effector constraints.
- [Appendix 3] The control device according to appendix 1 or 2, wherein the storage unit is capable of storing an effector constraint set formed by combining a plurality of the effector constraints.
- [Appendix 4] The control device according to appendix 3, wherein the storage unit stores a plurality of operation programs for operating the robot and a plurality of the effector constraint sets respectively corresponding to the plurality of operation programs.
- [Appendix 5] A control device comprising: a processor; a storage unit; and a display device that displays a setting screen for an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the setting screen is for setting the effector constraint based at least on input from a user.
- [Appendix 6] The control device according to any one of appendices 1 to 5, further comprising an input unit capable of inputting the effector constraint.
- [Appendix 7] The control device according to any one of appendices 1 to 6, wherein the storage unit stores a plurality of effector constraints, the plurality of effector constraints respectively correspond to at least one of a type of the effector and a type of a target of work by the effector, and the processor sets the effector constraint based at least on at least one of information on the type of the effector and information on the type of the target, and on input from the user.
- [Appendix 8] The control device according to appendix 7, wherein the input from the user sets a requirement the user demands regarding the work by the effector.
- [Appendix 9] The control device according to any one of appendices 1 to 6, wherein the effector constraint includes a plurality of effector constraint elements, a priority can be set for at least one of the plurality of effector constraint elements, and the processor operates the robot using at least the effector constraint including the priority.
- [Appendix 10] The control device according to any one of appendices 1 to 6, wherein the effector constraint includes a plurality of effector constraint elements, and the control device is configured to accept, for each of the plurality of effector constraint elements, a setting as a designated constraint that causes the processor to use a value designated by the user, or a setting as a dependent constraint that permits change of the value by the processor.
- [Appendix 11] The control device according to any one of appendices 1 to 10, wherein the processor performs a simulation that causes a model of the robot to perform the operation using at least the effector constraint, and determines whether the operation satisfies a criterion.
- [Appendix 12] The control device according to appendix 11, wherein, when the operation does not satisfy the criterion, the processor modifies the effector constraint so that the criterion is satisfied.
- [Appendix 13] The control device according to any one of appendices 1 to 12, wherein the effector constraint can set at least one of a constraint on the velocity of the effector as viewed from the predetermined reference coordinate, a constraint on the acceleration of the effector as viewed from the predetermined reference coordinate, a constraint on the angular velocity of the effector as viewed from the predetermined reference coordinate, a constraint on the angular acceleration of the effector as viewed from the predetermined reference coordinate, and a constraint on a value or an expression corresponding to a quantity obtained by differentiating the position or the posture with respect to time three or more times.
- [Appendix 14] A computer comprising: a processor; a storage unit; and a display device that displays a setting screen for an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the setting screen is for setting the effector constraint based at least on input from a user, and the processor performs a simulation that causes a model of the robot to perform an operation using at least the effector constraint and determines whether the operation satisfies a criterion.
- [Appendix 15] The computer according to appendix 14, wherein, when the operation does not satisfy the criterion, the processor modifies the effector constraint so that the criterion is satisfied.
- 1 Control device 2 Target 10 Robot 10A Arm 11 Servo motor 11A Encoder 12 Movable part 21 Processor 22 Display device 23 Storage unit 23A System program 23B Operation program 23C Control program 23D Path generation program 23F Preset automatic setting program 23G Constraint modification program 23H Interference calculation program 24 Servo controller 25 Servo controller 26 Input unit 200 Screen (operation program) 300 to 309 Screens 401 to 412 Screens 421 to 427 Screens 500 Operation section
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Numerical Control (AREA)
Abstract
Description
A technique is also known for industrial robots that generates a path so that a protruding portion of the effector does not point toward a person or the like. See, for example, Patent Document 2.
The robot 10 is not limited to a specific type, but the robot 10 of this embodiment is an articulated robot having six axes. The robot 10 may instead be an articulated robot having five or fewer axes or seven or more axes, a horizontal articulated robot, a multi-link robot, or the like. The robot 10 or its arm 10A may also be supported by a traveling device such as a linear guide, an AGV (Automatic Guided Vehicle), a vehicle, a walking robot, or the like.
The robot 10 may also be a collaborative robot that can avoid contact with or excessive proximity to surrounding people, objects, and the like using known sensors such as a visual sensor or a force sensor.
The above-described effector 30 may have only a limited range of postures suitable for functioning as an effector. As shown in Fig. 4, for example, when the effector 30 is a hand using suction cups, magnets, or electromagnets, the holding of the target 2 may become unreliable unless the target 2 can be attracted from a predetermined direction such as from above. When the target 2 is placed on an effector 30 that is, for example, a tray, the user naturally needs to take care that the target 2 does not fall off.
The control device 1 generates a path based on the operation program 23B using the path generation program 23D, generates control commands using the control program 23C so that the arm can move along the path, and controls the arm 10A.
In this embodiment, the coordinate system set at the tip of the arm 10A is also regarded as the effector coordinate system 102, and the coordinate system set on the above-mentioned flange surface is also treated as the effector coordinate system 102.
Fig. 5 shows examples of effector constraint settings. As shown in Fig. 5, a first example of an effector constraint is a constraint on the position coordinates (X, Y, Z) of the effector coordinate system 102. A second example of an effector constraint is a constraint on the posture of the effector coordinate system 102 (about the X axis = θx, about the Y axis = θy, about the Z axis = θz). In the example of Fig. 5, an entry in which "0" is input for both the upper and lower limits means that no change is permitted. The absence of an effector constraint may be expressed by "-" or the like.
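A Fig. 5 style constraint table can be represented compactly in software. The following is an illustrative sketch only (the class and field names are not from the publication): upper = lower = 0 means "no change allowed", and None stands for the "-" entry meaning unconstrained.

```python
# Illustrative sketch of one axis of the Fig. 5 constraint table.
class AxisConstraint:
    def __init__(self, lower=None, upper=None):
        self.lower = lower  # None = unconstrained ("-")
        self.upper = upper

    def allows(self, delta):
        """Return True if a change 'delta' from the reference is permitted."""
        if self.lower is None and self.upper is None:
            return True  # no constraint set
        return self.lower <= delta <= self.upper

# Example: position X may change by +/-5 mm, posture about X is locked (0, 0),
# posture about Y is unconstrained ("-").
effector_constraint = {
    "X": AxisConstraint(-5.0, 5.0),
    "theta_x": AxisConstraint(0.0, 0.0),  # "0"/"0": no change permitted
    "theta_y": AxisConstraint(),          # "-": unconstrained
}

print(effector_constraint["X"].allows(3.0))        # True: within +/-5 mm
print(effector_constraint["theta_x"].allows(0.1))  # False: locked axis
print(effector_constraint["theta_y"].allows(42.0)) # True: unconstrained
```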
An operation section 500 for making the above selections and the like is displayed on the display device 22. The operation section 500 displays direction keys, an enter key, a back key for returning to the previous screen or a higher-layer screen, and the like, and the user provides input using these key operations. Buttons corresponding to these functions may instead be provided on the input unit 26.
When the user selects a transition to the reference coordinate system setting screen on screen 301, the processor 21 causes the display device 22 to display screen 302 of Fig. 9. Screen 302 is a screen for the user to select the setting of an arbitrary reference coordinate system from among a plurality of reference coordinate systems.
In this embodiment, the user can set a plurality of reference coordinate systems using screens 302 and 303. This configuration is useful for increasing the degree of freedom in setting the effector constraints described later.
In this embodiment, the user can set a plurality of effector coordinates using screens 304 and 305. This configuration is useful for increasing the degree of freedom in setting the effector constraints described later.
When a teaching point or a pass-through point between teaching points is used as the predetermined reference coordinate, the settings on screen 305 of Figs. 11 and 12 become unnecessary, and the effector coordinate settings on screen 307 of Fig. 13 also become unnecessary. Screen 307 of Fig. 13 may be configured to accept a setting in which effector coordinate 1 is the position and posture of a teaching point or a pass-through point.
The section subject to an effector constraint may also be designated by writing, within the operation program 23B, where the effector constraint starts and ends.
An effector constraint that is always applied, independent of the operation program 23B, may also be set.
For each effector constraint, an operation program 23B to which the constraint is always applied may also be set.
The control device 1 can also set effector constraints automatically based on the effector constraints set at the teaching points of the operation program 23B and on other configured effector constraints. Since such automatically set effector constraints are also based on the effector constraints the user set for the teaching points, they too are effector constraints set based on the user's input.
In this case, for example, a visual sensor that observes the working range of the arm 10A is provided, and the control device 1 recognizes, based on the output of the visual sensor, the position of the effector 30, the position of the target 2, moving parts of the surrounding environment 4 within the space, approaching objects including the aforementioned customer, and the like. While recognizing the existence ranges of the surrounding environment 4 and the approaching objects, the control device 1 successively computes the path along which the effector 30 moves for the work. Even in this case, the processor 21 can apply the effector constraints set for the space when generating the path.
In this embodiment, improving the efficiency of the operation of the arm 10A includes improving the cycle time of the operation of the arm 10A, and the like.
The processor 21 then transmits control commands corresponding to the created path to each servo controller 24.
The processor 21 performs similar processing even when the robot 10 is the aforementioned collaborative robot. In the case of a collaborative robot, the processor 21 may also generate an avoidance path for avoiding an object to be avoided.
For example, the position of effector coordinate 1 is set slightly above the center of gravity of the target 2 via screens 305 and 307, and constraints on posture, angular velocity, and angular acceleration are set.
In another example, when setting the effector constraint elements, the user can set a certain value only for the allowable range of acceleration in the direction corresponding to the vertical direction of the effector 30 and in the direction corresponding to the aforementioned centrifugal force. The user can also set the allowable ranges of acceleration in the other directions to sufficiently small values, such as 1/5 or less of that value. In this case as well, the effector 30 tends to move like a pendulum.
The above configuration leads to a reduction in the constraint settings required at each teaching point. Because of the effector constraints, it also realizes operations of the arm 10A that keep the position and posture of the effector 30 in an appropriate state, which can in turn lead to the creation and selection of paths that improve cycle time.
In this embodiment, an effector constraint includes a plurality of effector constraint elements, and as shown on screen 307 of Fig. 13, a priority can be set for at least one of the plurality of effector constraint elements. For example, screen 307 has a "priority" column, and a priority can be set for each effector constraint element. On screen 307, a priority of "absolute" is set for the "upper limit" and "lower limit" of the angle about the X axis, which are effector constraint elements. A priority of "absolute" can be described as a must-apply setting that the processor 21 is required to use. Priorities are also set for the other effector constraint elements: in descending order, "absolute", "high", and "low".
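One plausible use of the "absolute"/"high"/"low" priorities on screen 307 is to decide which elements may be relaxed when not all constraints can be satisfied at once. The sketch below is an assumption about such a policy, not the publication's algorithm, and the element names are hypothetical.

```python
# Illustrative sketch (assumption): when a feasible path cannot satisfy every
# effector constraint element, elements could be relaxed in ascending priority
# order; "absolute" elements are must-apply and are never relaxed.

PRIORITY_ORDER = {"low": 0, "high": 1, "absolute": 2}

def relaxation_candidates(elements):
    """Return constraint elements that may be relaxed, lowest priority first."""
    relaxable = [e for e in elements if e["priority"] != "absolute"]
    return sorted(relaxable, key=lambda e: PRIORITY_ORDER[e["priority"]])

elements = [
    {"name": "angle_x_upper", "priority": "absolute"},
    {"name": "accel_y", "priority": "low"},
    {"name": "angular_vel_z", "priority": "high"},
]

# The low-priority accel_y is offered for relaxation before the high-priority
# angular_vel_z; the absolute angle_x_upper never appears.
print([e["name"] for e in relaxation_candidates(elements)])
```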
In cases with a plurality of effector constraint sets, distinguishing between priorities and between designated and dependent constraints makes the intent of each constraint easier to understand.
In this embodiment, preferably, a preset automatic setting program 23F that automatically sets effector constraints and/or effector constraint elements is stored in the storage unit 23. The preset automatic setting program 23F automatically sets effector constraints and/or effector constraint elements based on information about the effector 30 and the target 2 that the user can obtain objectively, and on the functions and performance the user subjectively expects (functional requirements).
These functional requirements can be expressed as effector constraint elements. For this purpose, presets of effector constraint elements corresponding to the functional requirements are stored in advance in the storage unit 23.
For typical effectors 30 and targets 2, selecting a preset for the type of combination of the effector 30 and the target 2 determines their approximate arrangement. Once the arrangement is determined, an approximate physical model is generated simply by adding the shapes of characteristic portions, centers of gravity, weights, and the like of the effector 30 and the target 2.
For example, a screen for configuration using the presets is displayed on the display device 22 of the input unit 26.
First, the processor 21 of the control device 1 causes the display device 22 to display screen 401 shown in Fig. 16. Screen 401 may be displayed instead of screen 301. Screen 401 is a screen for the user to select a transition to the effector information setting screen.
When the user selects a transition to the effector information setting screen on screen 401, the processor 21 causes the display device 22 to display screen 402 of Fig. 16. Screen 402 is a screen for the user to select an arbitrary effector type setting from among a plurality of effector type settings.
In this embodiment, as described above, effector constraints are set from values input by the user, and preset effector constraints are set based on functional requirements input by the user. However, even with presets, the configured effector constraints do not necessarily function normally as the user expects. Important settings may be missing, unnecessary settings may be present, or the fine-tuning of effector constraint elements may be insufficient, and the resulting path may not be the one the user envisioned.
A jog operation is one in which the user moves the arm 10A directly using the direction keys, a joystick, or the like of the input unit 26. For this reason, without a sufficient understanding of the characteristics of the arm 10A and sufficient observation of the surrounding environment 4, operation errors easily occur that cause contact between the arm 10A or effector 30 and the surrounding environment 4, displacement of the effector 30 into an undesirable posture, and the like. In this embodiment, the effector constraints can be configured to apply during jog operations as well.
When executing a jog or hand guide operation, it is important that the arm 10A and effector 30 of the robot 10 do not contact the surrounding environment 4. When it is not evident that no contact will occur, it is desirable to perform an interference calculation. In this case, the processor 21 of the control device 1 performs the interference calculation based on the interference calculation program 23H stored in the storage unit 23.
Claims (15)
- A control device comprising: a processor; and a storage unit that stores an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the processor causes the robot to perform an operation constrained by the effector constraint set based on input from a user or an external device.
- The control device according to claim 1, wherein the storage unit is capable of storing a plurality of the effector constraints.
- The control device according to claim 1 or 2, wherein the storage unit is capable of storing an effector constraint set formed by combining a plurality of the effector constraints.
- The control device according to claim 3, wherein the storage unit stores a plurality of operation programs for operating the robot and a plurality of the effector constraint sets respectively corresponding to the plurality of operation programs.
- A control device comprising: a processor; a storage unit; and a display device that displays a setting screen for an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the setting screen is for setting the effector constraint based at least on input from a user.
- The control device according to any one of claims 1 to 5, further comprising an input unit capable of inputting the effector constraint.
- The control device according to any one of claims 1 to 6, wherein the storage unit stores a plurality of effector constraints, the plurality of effector constraints respectively correspond to at least one of a type of the effector and a type of a target of work by the effector, and the processor sets the effector constraint based at least on at least one of information on the type of the effector and information on the type of the target, and on input from the user.
- The control device according to claim 7, wherein the input from the user sets a requirement the user demands regarding the work by the effector.
- The control device according to any one of claims 1 to 6, wherein the effector constraint includes a plurality of effector constraint elements, a priority can be set for at least one of the plurality of effector constraint elements, and the processor operates the robot using at least the effector constraint including the priority.
- The control device according to any one of claims 1 to 6, wherein the effector constraint includes a plurality of effector constraint elements, and the control device is configured to accept, for each of the plurality of effector constraint elements, a setting as a designated constraint that causes the processor to use a value designated by the user, or a setting as a dependent constraint that permits change of the value by the processor.
- The control device according to any one of claims 1 to 10, wherein the processor performs a simulation that causes a model of the robot to perform the operation using at least the effector constraint, and determines whether the operation satisfies a criterion.
- The control device according to claim 11, wherein, when the operation does not satisfy the criterion, the processor modifies the effector constraint so that the criterion is satisfied.
- The control device according to any one of claims 1 to 12, wherein the effector constraint can set at least one of a constraint on the velocity of the effector as viewed from the predetermined reference coordinate, a constraint on the acceleration of the effector as viewed from the predetermined reference coordinate, a constraint on the angular velocity of the effector as viewed from the predetermined reference coordinate, a constraint on the angular acceleration of the effector as viewed from the predetermined reference coordinate, and a constraint on a value or an expression corresponding to a quantity obtained by differentiating the position or the posture with respect to time three or more times.
- A computer comprising: a processor; a storage unit; and a display device that displays a setting screen for an effector constraint, which is a constraint on a change, as viewed from a predetermined reference coordinate, in at least one of a position and a posture of an effector of a robot, wherein the setting screen is for setting the effector constraint based at least on input from a user, and the processor performs a simulation that causes a model of the robot to perform an operation using at least the effector constraint and determines whether the operation satisfies a criterion.
- The computer according to claim 14, wherein, when the operation does not satisfy the criterion, the processor modifies the effector constraint so that the criterion is satisfied.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE112022007752.2T DE112022007752T5 (de) | 2022-11-15 | 2022-11-15 | Steuereinheit und computer |
| JP2024558536A JPWO2024105777A1 (ja) | 2022-11-15 | 2022-11-15 | |
| PCT/JP2022/042393 WO2024105777A1 (ja) | 2022-11-15 | 2022-11-15 | 制御装置およびコンピュータ |
| CN202280101737.5A CN120187560A (zh) | 2022-11-15 | 2022-11-15 | 控制装置以及计算机 |
| TW112142122A TW202421392A (zh) | 2022-11-15 | 2023-11-01 | 控制裝置以及電腦 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/042393 WO2024105777A1 (ja) | 2022-11-15 | 2022-11-15 | 制御装置およびコンピュータ |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024105777A1 true WO2024105777A1 (ja) | 2024-05-23 |
Family
ID=91084011
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/042393 Ceased WO2024105777A1 (ja) | 2022-11-15 | 2022-11-15 | 制御装置およびコンピュータ |
Country Status (5)
| Country | Link |
|---|---|
| JP (1) | JPWO2024105777A1 (ja) |
| CN (1) | CN120187560A (ja) |
| DE (1) | DE112022007752T5 (ja) |
| TW (1) | TW202421392A (ja) |
| WO (1) | WO2024105777A1 (ja) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001071285A (ja) * | 1999-09-01 | 2001-03-21 | Minolta Co Ltd | 作業ロボット |
| WO2018092860A1 (ja) * | 2016-11-16 | 2018-05-24 | 三菱電機株式会社 | 干渉回避装置 |
| CN111399514A (zh) * | 2020-03-30 | 2020-07-10 | 浙江钱江机器人有限公司 | 一种机器人时间最优轨迹规划方法 |
-
2022
- 2022-11-15 DE DE112022007752.2T patent/DE112022007752T5/de active Pending
- 2022-11-15 JP JP2024558536A patent/JPWO2024105777A1/ja active Pending
- 2022-11-15 CN CN202280101737.5A patent/CN120187560A/zh active Pending
- 2022-11-15 WO PCT/JP2022/042393 patent/WO2024105777A1/ja not_active Ceased
-
2023
- 2023-11-01 TW TW112142122A patent/TW202421392A/zh unknown
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001071285A (ja) * | 1999-09-01 | 2001-03-21 | Minolta Co Ltd | 作業ロボット |
| WO2018092860A1 (ja) * | 2016-11-16 | 2018-05-24 | 三菱電機株式会社 | 干渉回避装置 |
| CN111399514A (zh) * | 2020-03-30 | 2020-07-10 | 浙江钱江机器人有限公司 | 一种机器人时间最优轨迹规划方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120187560A (zh) | 2025-06-20 |
| DE112022007752T5 (de) | 2025-07-03 |
| TW202421392A (zh) | 2024-06-01 |
| JPWO2024105777A1 (ja) | 2024-05-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11117254B2 (en) | Robotic navigation system and method | |
| Borst et al. | DLR hand II: Experiments and experience with an anthropomorphic hand | |
| US12296485B2 (en) | Robot arm with adaptive three-dimensional boundary in free-drive | |
| US10737396B2 (en) | Method and apparatus for robot path teaching | |
| US9387589B2 (en) | Visual debugging of robotic tasks | |
| US20210001484A1 (en) | Collaborative Robot System Incorporating Enhanced Human Interface | |
| US20220388156A1 (en) | Maintaining free-drive mode of robot arm for period of time | |
| JP6314134B2 (ja) | ロボット訓練のためのユーザインターフェース | |
| US20150127151A1 (en) | Method For Programming Movement Sequences Of A Redundant Industrial Robot And Industrial Robot | |
| US9919424B1 (en) | Analog control switch for end-effector | |
| JP4976883B2 (ja) | マニピュレータシステム | |
| KR20130122970A (ko) | 다관절형 로봇의 제어 장치, 제어 방법 및 제어 프로그램을 기록한 컴퓨터 판독가능한 기록 매체 | |
| US20220379463A1 (en) | Safe activation of free-drive mode of robot arm | |
| JP2019018272A (ja) | モーション生成方法、モーション生成装置、システム及びコンピュータプログラム | |
| US20240416504A1 (en) | Method for Precise, Intuitive Positioning of Robotic Welding Machine | |
| US20190311079A1 (en) | Simulation apparatus, robot, simulation method, and program therefor | |
| CN107336228B (zh) | 显示包含附加轴的状态的动作程序的机器人的控制装置 | |
| KR102219543B1 (ko) | 다관절 로봇 및 다관절 로봇 시스템 | |
| JP6904759B2 (ja) | ロボットの移動速度制御装置及び方法 | |
| Kuan et al. | VR-based teleoperation for robot compliance control | |
| WO2024105777A1 (ja) | 制御装置およびコンピュータ | |
| Allspaw et al. | Implementing virtual reality for teleoperation of a humanoid robot | |
| WO2024105779A1 (ja) | 制御装置およびコンピュータ | |
| JP7583490B1 (ja) | 制御装置、制御方法、及びプログラム | |
| JP2023140592A (ja) | 教示装置および教示プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22965753 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 112022007752 Country of ref document: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024558536 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202280101737.5 Country of ref document: CN |
|
| WWP | Wipo information: published in national office |
Ref document number: 202280101737.5 Country of ref document: CN |
|
| WWP | Wipo information: published in national office |
Ref document number: 112022007752 Country of ref document: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 22965753 Country of ref document: EP Kind code of ref document: A1 |