
WO2008072756A1 - Reaction force presentation method and force presentation system - Google Patents


Info

Publication number
WO2008072756A1
WO2008072756A1 · PCT/JP2007/074188
Authority
WO
WIPO (PCT)
Prior art keywords
tool
sensor
data
reaction force
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2007/074188
Other languages
English (en)
Japanese (ja)
Inventor
Kazuyuki Nagata
Mitsunori Tada
Hidetake Iwasaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Institute of Advanced Industrial Science and Technology AIST
BL Autotec Ltd
Original Assignee
National Institute of Advanced Industrial Science and Technology AIST
BL Autotec Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Institute of Advanced Industrial Science and Technology AIST, BL Autotec Ltd filed Critical National Institute of Advanced Industrial Science and Technology AIST
Priority to JP2008549389A (granted as JP5177560B2)
Publication of WO2008072756A1
Anticipated expiration
Legal status: Ceased


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Definitions

  • The present invention relates to a haptic presentation system capable of recording and reproducing force senses, mainly for use in teaching and training.
  • Haptic data here are a combination of force/tactile sensor information and motion information.
  • Specifically, it is a force presentation system that can measure and reproduce operation data (operating force and motion) generated when a person actually manipulates an object.
  • It is thus a system that can record and play back the operation data of tools handled by humans.
  • Patent Document 1 (Japanese Patent No. 3802483): calligraphy learning support system
  • Patent Document 2: Japanese Patent Laid-Open No. 2005-287656, acupuncture training system using force and tactile senses
  • Patent Document 3: Japanese Patent Laid-Open No. 2005-189297
  • Patent Document 4: Japanese Patent Laid-Open No. 2000-259074
  • Patent Document 5: Japanese Patent Laid-Open No. 10-91327, drawing device
  • Patent Document 6: Japanese Patent Laid-Open No. 2004-213351, force presentation device and force presentation system
  • Patent Document 7: Japanese Patent Laid-Open No. 2002-132138, puncture simulator
  • Patent Document 7 relates to a conventional physical training model.
  • Patent Documents 1 to 6 disclose training methods in which an object is manipulated in a computer virtual space and a reaction force is presented in that virtual space, but they have no function for recording actual operation data when a real object is manipulated.
  • The inventors of the present invention previously invented an operation tool equipped with a finger-mounted force sensor that improves the input accuracy of operation data and presents a force sense.
  • Patent Document 8: Japanese Patent No. 3261653
  • Patent Document 9: Japanese Patent No. 3409160
  • Patent Document 10: Japanese Patent No. 3624374
  • Patent Document 10 provides a function for presenting a reaction force when an object is manipulated in virtual space, and a function for recording actual operation data (the motion and force when a real object is manipulated). However, it can only record and play back operation data when an object is manipulated with the fingers; it has no function to record and play back tool operation data.
  • With that system, the operating force acting on a tool can be measured as the resultant force of the fingertip sensors, and finger motion can be measured, but the motion of the tool itself cannot.
  • That is, the motion of a tool cannot be inferred from the motion of the fingers. For example, when a person holds a pen, the grasp constraint along the pen's central axis is indeterminate, so motion of the pen along that axis cannot be measured.
  • The specific force presentation mechanism disclosed in Patent Document 10 is as follows.
  • The invention disclosed in Patent Document 10 is a force presentation device comprising a finger-mounted force sensor, a manipulator, a base, and a control device; the base is fixed to an external fixture.
  • The manipulator has active joints composed of an actuator and an angle detection sensor, passive joints composed of only an angle detection sensor, and links.
  • The actuators and angle detection sensors are selectively attached to the pivots between links, and between link and base, that constitute the link mechanism, and the motion of the link mechanism is measured by the angle detection sensors.
  • It is a force presentation device in which the actuators are controlled by the control device so that a force sense is presented to the fingertip.
  • The actuator may be a DC motor or an AC motor.
  • This force presentation device is composed of a force sensor 1 that fits on a human finger, a small manipulator 2, and a base 3.
  • FIG. 21 shows the configuration of the force sensor 1 used in the force presentation device described above.
  • The force sensor 1 includes a finger cot 20, an elastic structure 21, a finger cover 22, and a link coupling portion 23.
  • The reference numerals in FIGS. 20 to 21 are those used in Patent Document 10 and are unrelated to the reference numerals of the present invention.
  • The elastic structure 21 has a structure that is easily strained by specific force components.
  • Figure 22 shows an example of the elastic structure 21.
  • A base 30 and a peripheral ring 31 are connected via three beams 32, and strain gauges 33 are attached to each surface of the beams 32.
  • When an external force is applied to the elastic structure 21, the beams 32 are strained according to the force components. By converting this strain into electrical signals with the strain gauges 33, the force components can be extracted as electrical signals of the strain gauges 33.
  • The strain–stiffness matrix representing the relationship between the six-axis force (forces and moments in three directions) acting on the elastic structure 21 and the strain-gauge outputs of each beam is obtained in advance by calibration.
  • The strain–stiffness matrix converts the strain-gauge output of each beam into force. Using it, the six-axis force acting on the elastic structure 21 can be calculated from the strain-gauge output signals.
  • The finger cover 22 is the part that actually touches the object, and is connected to the peripheral ring 31 of the elastic structure 21 via a mounting block 24.
  • The link coupling portion 23 is connected to the finger cover 22 and couples to the manipulator 2 of the force presentation device.
  • The number of active joints must be equal to or greater than the number of force components to be presented. Therefore, at least 3 active joints are required when only 3-axis forces are presented to the fingertip, and at least 6 when all 6-axis forces are presented.
  • The degrees of freedom of the active joints are arranged so that the tip of the small manipulator 2 can generate motion in all directions of the presented force components.
  • Passive joints are required when the tip of the small manipulator 2 cannot follow the human finger with the degrees of freedom of the active joints alone.
  • The number of passive joints is the difference between the number of degrees of freedom of the human finger and the number of active joints.
  • The basic principle of this force presentation device is as follows.
  • The small manipulator 2 can apply a force to the human fingertip 12 with as many components as there are active joints.
  • The force applied to the fingertip 12 by the small manipulator 2 is detected by the six-axis force sensor 1 attached to the fingertip.
  • By controlling the manipulator 2 while feeding back the output of the six-axis force sensor 1 so that the force applied to the fingertip 12 matches the target force given by the computer, a force sense can be presented to the human fingertip 12.
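The feedback described above, driving the manipulator so the sensed fingertip force converges to the computer's target force, can be sketched as a simple discrete control loop. The first-order actuator model, gains, and time constant below are assumptions for illustration, not values from the patent:

```python
# Minimal sketch of the force-feedback loop: compare the target force with
# the 6-axis sensor reading and update the actuator command so the
# difference goes to zero (PI control on the force error).
def force_control(target, steps=500, dt=0.001, kp=5.0, ki=200.0):
    sensed, integ = 0.0, 0.0
    for _ in range(steps):
        error = target - sensed              # fed-back force error
        integ += error * dt
        command = kp * error + ki * integ    # command to the servo driver
        # crude first-order actuator/sensor dynamics (tau = 10 ms)
        sensed += (command - sensed) * dt / 0.01
    return sensed

print(round(force_control(1.0), 3))   # converges to the 1.0 N target
```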
  • Inside the computer, there are a virtual object 10 and an idealized finger 9 (hereinafter called the virtual finger) that operates the virtual object 10.
  • The virtual finger 9 can move freely in the virtual world inside the computer according to the movement of the fingertip wearing the force presentation device.
  • A human inserts a finger into the force sensor 1 of the force presentation device and moves the finger.
  • The fingertip position and velocity are calculated from the angle data of the angle detection sensors 5 of the manipulator 2.
  • The virtual finger 9 moves accordingly.
  • That is, the virtual finger 9 is manipulated by the human finger 12 wearing the force presentation device.
  • The computer calculates the force acting on the virtual finger 9 when it moves through the virtual world or operates the virtual object 10. This calculated force is given as the target value for force control of the manipulator 2.
  • The target force calculated by the computer is compared with the output of the six-axis force sensor 1 of the force presentation device, and the command value to be given to the actuators 4 of the manipulator 2 is calculated so that the difference becomes zero.
  • The calculated command value is output from the computer to the servo drivers of the actuators 4, and the actuators 4 are driven so that the output of the six-axis force sensor 1 at the fingertip converges to the target force.
  • In this way, the virtual finger 9 inside the computer is manipulated by the motion of the finger, the computer calculates the force acting on the virtual finger 9 according to its motion, and the manipulator 2 is force-controlled with that force as the target.
  • This makes the operator feel at the fingertip as if he or she were actually touching and operating the virtual object.
  • A conventional force presentation system consists of a haptic interface, an object model (object model library), an environment model (environment model library), a virtual space, and a display device.
  • The haptic interface is often a small robot arm composed of multiple links and joints, and the motion of the haptic interface is linked to the motion of objects in the virtual space.
  • A person can operate an object in the virtual space by holding the tip of the haptic interface in his or her hand.
  • The system has a means to place pre-built object models and environment models in the virtual space.
  • The operator manipulates objects in the virtual space using the haptic interface while looking at them on the display device. The system detects contact between the virtual object and the virtual environment, calculates the contact force, and controls the torque generated by the actuators of the haptic interface, thereby presenting a force sense to the person.
  • Patent Document 1: Japanese Patent No. 3802483
  • Patent Document 4: Japanese Patent Laid-Open No. 2000-259074
  • Patent Document 8: Japanese Patent No. 3261653
  • Patent Document 9: Japanese Patent No. 3409160
  • Patent Document 10: Japanese Patent No. 3624374

Disclosure of the Invention
  • An object of the present invention is to provide a force presentation system that can record and reproduce force senses. Since the operation data of tools handled by humans can be recorded and reproduced, the actual environment used for skill training (including objects and training models) can be exploited; by presenting this to the trainee, the system enables more practical training and skill acquisition.
  • In the method, when the environmental factors or tool factors differ from those under which the virtual model was constructed, the reaction force resulting from operating the haptic interface is corrected accordingly.
  • It is a reaction force presentation method characterized in that this corrected reaction force is output and presented.
  • The invention is a force presentation system comprising a haptic interface equipped with a tool-type force sensor and a control device.
  • The haptic interface with the tool-type sensor is composed of a tool-type force sensor and a manipulator connected to the tool-type sensor.
  • The tool-type sensor has a tool function part and a grip part, with the force sensor part between the tool function part and the grip part.
  • The manipulator can measure the three-dimensional motion of the tool-type sensor and has the function of presenting a force sense to the operator who holds the tool-type force sensor.
  • The control device consists of an actual-measurement operation unit, a data storage unit, a model estimator, an object model database, an environment model database, a virtual-space operation unit (comprising the object model, the environment model, and an interaction operation unit), a superposition processing unit, and a display unit.
  • The control device has a function of manipulating the virtual model in the virtual space based on measured values obtained from manipulation of the haptic interface, and a function of calculating a corrected force sense for that manipulation.
  • The actual operation amount obtained by operating the haptic interface is directly input as an operation amount to the virtual model that has already been constructed, and the interaction between virtual objects in the virtual space, or between a virtual object and the virtual environment, is calculated to obtain the reaction force; this reaction force data and the measured data obtained by operating the tool-type sensor are superimposed by the superposition processing unit.
  • The corrected reaction force created in this way is combined, through the haptic interface, with the reaction force received from the real environment, so that a force sense reflecting both is presented; alternatively, when the environment or the tool is changed, a corrected force sense that takes the interaction correction amount of the change factor into account is presented.
  • That is, the difference between the reaction force the virtual model receives from the operation amount and the reaction force received from the real environment, together with the correction reaction force that accounts for change factors, is calculated.
  • It is a force presentation system that uses these to present a force sense to the operator.
  • The system may also be configured so that, in the tool-type sensor equipped with a six-axis force sensor, the tool function part or the grip part can be attached to and detached from the six-axis sensor body (3).
  • The present invention makes it possible to present a reaction force using an actual tool, by actually operating a tool-type sensor in which the sensor is incorporated into a real tool.
  • Reproducibility in training and similar applications is thereby improved.
  • According to the present invention, haptic augmented reality can be presented by superimposing measured data and model data.
  • For example, the writing feel of a pen or the sharpness of a scalpel can be changed during actual operation; soft objects can be presented as hard, and hard objects as soft.
  • According to the present invention, the essence of a master's skill can be extracted, for example by collecting data on a calligrapher's brush strokes or a famous doctor's scalpel handling, and extracting and abstracting the key points of tool use. In the future, this can be applied to skill training and skill transfer, such as in surgery, and to the evaluation of tools handled by hand.
  • The force presentation system of the present invention can measure the mechanical interaction between the tool and the environment.
  • FIG. 5: Three-dimensional display of handwriting and pen pressure (3-axis force) using a pen-type 6-axis force sensor
  • FIG. 11: Coupling of the writing-instrument-type sensor body and the grip part
  • FIG. 12: A drawing showing an example of a writing-instrument tip
  • FIG. 13: Attachment of the tool-type force sensor to the small arm
  • The force presentation device reproduces X, Y, Z position information.
  • The force sensor measures the trainee's force and displays the difference from the teacher's model.
  • FIG. 14: An example of three-dimensional display of handwriting and pen pressure (3-axis force) recorded using a pen-type 6-axis force sensor, shown from different angles. The image is rotated clockwise about the vertical bar of the katakana character "i" as the axis, with n1 as 0°.
  • n2: display rotated by 36°
  • n3: display rotated by 72°
  • FIG. 15: Another example of three-dimensional display of handwriting and pen pressure (3-axis force) recorded using a pen-type 6-axis force sensor, shown from a different angle; the image is rotated clockwise about the katakana character "i" as the axis, with w1 as 0°.
  • FIG. 17: Virtual space model used for recording and playing back button-press data (virtual environment model)
  • FIG. 19 (a) Shows the data at the time of button data playback, and represents the displacement of the virtual button when the user operates the haptic interface and presses the virtual button.
  • FIG. 19 (b) This shows the data at the time of button data playback, and represents the force presented to the user by the haptics interface.
  • FIG. 19 (c): Shows data during button data playback, showing the relationship between the virtual button operation displacement and the presentation force.
  • The finger-mounted 6-axis force sensor disclosed in Patent Document 10
  • Example of the elastic structure of the force sensor of Patent Document 10

Best Mode for Carrying Out the Invention
  • In the present invention, a virtual model can be constructed by inputting an expert's operation data through a haptic interface equipped with a force sensor.
  • The corrected reaction force is composed of the difference between the actual measurement data obtained when a trainee uses the tool-type sensor and the data obtained by manipulating the virtual model in the virtual space, and is output to the haptic interface.
  • The corrected reaction force created by the haptic interface is combined with the reaction force from the real environment and presented to the operator as a force sense.
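A minimal numeric sketch of this superposition (the arithmetic is the author's reading of the text, and the force values are invented): the correction is the difference between the force predicted by the expert-built virtual model and the force measured from the real environment, and the interface adds it to the real reaction so the trainee feels the model's force:

```python
# model_force: reaction predicted by the virtual model for the operation
# measured_force: reaction actually measured from the real environment
def corrected_reaction(model_force, measured_force):
    return model_force - measured_force      # correction sent to the interface

def presented_force(model_force, measured_force):
    # real reaction + correction = force sense felt by the operator
    return measured_force + corrected_reaction(model_force, measured_force)

print(presented_force(2.0, 1.5))   # operator feels the model's 2.0 N
```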
  • For the force sensor, either three or six axes can be used; six axes give better reproduction accuracy. The following explanation focuses on the six-axis case.
  • The present invention further applies this force sense when a tool-type sensor is used on a real object (for example, a human-body training model for medical training).
  • The reaction force is configured to reflect the difference from the actual object and is output to the haptic interface, where it is presented to the operator as a force sense. Furthermore, by inputting a factor different from the situation under which the virtual model was constructed, training on an object in another situation can be presented to the operator while using the same model.
  • The force presentation system of the present invention comprises a haptic interface including a tool-type six-axis force sensor and a control device; the operator operates the tool-type sensor.
  • Three-dimensional motion data of the tool-type sensor, and data on the force applied to it from the real environment, can be input to the control device, while a reaction force can be presented from the control device through the tool-type sensor.
  • The control device can operate the model in the virtual space based on the input data from the haptic interface, and performs interaction calculation and superposition processing.
  • The haptic interface consists of a small arm and a tool-type sensor. The operator manipulates the tool-type sensor of the haptic interface by hand.
  • The tool-type sensor is constructed using the actual tool, so as to measure the mechanical interaction between the tool and the environment.
  • Figure 2 shows an example of a tool-type force sensor.
  • The pen-type 6-axis force sensor in Figure 2(a) uses a commercially available ballpoint pen tip (in this example, a knock-type ballpoint pen tip was used). By attaching a pencil, marker, or other ballpoint pen tip to the tip adapter, a 6-axis force sensor can be configured for various writing instruments.
  • The scalpel-type 6-axis force sensor shown in Figure 2(b) uses a commercially available scalpel handle and blade (this example uses a No. 21 blade and a No. 4 handle from Matsukichi Medical Instruments), attached to the sensor frame with an adapter.
  • Six-axis force sensors for various medical instruments can be configured in the same way.
  • Various tools can be attached to the tool-type 6-axis sensor using such adapters.
  • The small arm was a modified version of the PHANToM manufactured by SensAble Technologies.
  • Figure 2 shows the external appearance of the tool-type sensor equipped with a 6-axis force sensor.
  • The tool-type force sensor proposed in Japanese Patent Application No. 2006-205780 can be used.
  • It is a gripping tool for a six-axis force sensor composed of a sensor body A, a grip part B, and a function part C, used to obtain operation information with a tool.
  • The grip part B and the function part C have coupling structures that allow attachment to and detachment from the sensor body A, and connectors are attached to the grip part B and/or the function part C; alternatively,
  • additional connectors D and E and grip parts B and/or function parts C are provided in plural according to the type of tool. See Figures 9 to 13.
  • the sensor used in the present invention is a six-axis force sensor.
  • The force sensor used in the present invention is preferably thin.
  • The housing is about 10 mm thick, and its outer diameter can be made less than 20 mm.
  • the following is an example of the sensing part of a 6-axis force sensor using a strain gauge.
  • The sensing part of the 6-axis force sensor is an elastic structure that is easily strained by specific force components (forces and moments). A base and flanges are connected via three beams, and strain gauges are attached to each side of the beams. When an external force is applied to this elastic structure, the beams are strained; the strain is converted into electrical signals by the strain gauges, whereby the force components can be extracted as electrical signals of the strain gauges.
  • An optical sensor unit can be used instead of the strain gauge as an element for detecting the strain of the 6-axis force sensor elastic structure.
  • This elastic structure can be made less than 2 mm thick.
  • The strain–stiffness matrix representing the relationship between the six-axis force (forces and moments in the three axis directions) acting on the elastic structure and the strain-gauge output of each beam is obtained in advance by calibration.
  • The strain–stiffness matrix converts the strain-gauge outputs of each beam into force. Using it, the six-axis force acting on the elastic structure can be calculated from the strain-gauge output signals.
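The patent says only that the matrix is obtained in advance by calibration. One common way to do this (an assumption, not the patent's stated procedure) is to apply known six-axis loads, record the gauge outputs, and solve for the matrix by least squares:

```python
import numpy as np

# Synthetic calibration data: apply known 6-axis loads, record gauge outputs.
rng = np.random.default_rng(0)
C_true = rng.normal(size=(6, 6))     # "true" sensor matrix (synthetic)
S = rng.normal(size=(6, 100))        # gauge outputs for 100 known load cases
F = C_true @ S                       # the corresponding applied 6-axis loads

# Least squares: find C minimizing ||F - C @ S||; with noiseless data this
# recovers the calibration matrix exactly.
C_est = np.linalg.lstsq(S.T, F.T, rcond=None)[0].T
print(np.allclose(C_est, C_true))
```

With real, noisy calibration data the recovery is approximate, and more load cases improve the estimate.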
  • The elastic structure is housed in a casing; a projection for connecting to the link mechanism is provided to form a link coupling, and a signal extraction cable is attached to complete the 6-axis force sensor body.
  • the cable is connected to an external amplifier or the like.
  • The casing has a substantially cylindrical shape, so that a handle for the gripping operation can be attached to one face, and the functional part of the tool, such as the blade of a scalpel, to the other.
  • The mounting surface should be flat with screw holes, so that it can be screwed to a mating flat flange. Mounting features such as protruding ribs and threaded portions can be chosen as appropriate. Unevenness can also be formed on the mounting surface for alignment with the mating part and to prevent rotation.
  • the sensitive surface of the elastic structure is the surface to which the tool is attached.
  • The other surface is the fixed side, that is, the force input surface.
  • The sensor can collect 3D spatial information, and position information can be presented to the trainee.
  • The grip part is the part that is actually held, corresponding to the handle of the tool. It can be made by machining the handle of an actually used tool, or shaped to resemble the handle of actual equipment.
  • Writing instruments such as pencils, ballpoint pens, and fountain pens come in many varieties, but their handles can be covered by standard grips. Of course, grips can also be manufactured individually.
  • the gripping part and the sensor body can be structured such that a standard gripping part can be directly coupled to the mounting surface of the sensor body part.
  • A grip produced by machining an actual handle can be accommodated by using a connector to couple the tip of the handle to the sensor body.
  • The length of the grip part is set within a range that preserves the length of the actual tool; the handle is cut down by the length taken up by the connector region and the sensor body.
  • The function part, the working end of the tool such as the pen tip of a writing instrument, has a unique shape for each tool, so forming a coupling structure directly on it is impractical; it is preferable to connect it to the sensor body through a connector.
  • The actually functioning part is preserved, and the rear end is cut and machined into a shape that fits the connector, within a range that keeps the length close to that of the actual tool.
  • Surgical scalpels vary, but even if the bladed part is 40 to 50 mm long, the part actually used is only about 20 mm from the tip. It is sufficient to preserve that length, so an adequate length can be secured even including the part covered by the connector.
  • The connector should have a shape that fits the mounting surface of the sensor body; for example, a flat surface is mated to a flat surface.
  • A coupling structure in which the surfaces of flat flanges are brought into contact and screwed together prevents the grip part and the tip side from wobbling, increases fixing rigidity, and makes it easy to improve measurement accuracy.
  • the flange portion can be made thinner and the restriction on the position where the finger is actually pressed can be reduced.
  • For the function part, a coupling structure matched to the shape of each part can be devised. For parts with similar shapes, such as scalpel blades, the rear part can be machined to form a standard series.
  • For round bars, flat bars, and the like that can be used in common, dedicated parts having a coupling structure with the sensor body can be provided.
  • For example, a grip part can be a round bar with a flange fixed to its tip. When handles of various types are needed, such as round bars, hexagonal bars, and flat bars of different sizes, it is preferable to interpose a connector, as on the function part side. In this case too, the handle can be machined into a shape suited to the coupling structure.
  • As the structure connecting the handle and the function part, the base may be inserted between split dies and pressed against recesses in the dies and fixed, or inserted into the hollow of a cylindrical split die; fixing by tightening, screwing, and the like can be used as appropriate.
  • The operating force data are sent via the cable to the amplifier and CPU and stored. Since the grip part and the function part can be exchanged, complete data can be collected by preparing parts for the various tools required for one task.
  • All the data required for one task can thus be collected, providing data for simulator development. The sensor can also be installed in a haptic interface as a telemedicine tool.
  • FIGS. 9, 10, 11, and 12 show examples of these tool-type force sensors.
  • Figure 9 uses a surgical scalpel as the tool.
  • Figures 10 to 12 show examples of a pen-type force sensor. The reference numerals are used as they are.
  • Fig. 13 shows an example in which these tool-type force sensors are attached to the small arm. The same arm as shown in Patent Document 10 can be used.
  • Fig. 4 shows the basic system configuration of the force presentation system.
  • The basic system comprises the input/output device described above (the haptic interface), an actual-measurement calculation unit, a data recording unit, a model estimator, model databases (an environment model database and an object model database), a virtual-space calculation unit, an actual-data/model-data calculation unit, and a display unit.
  • By measuring and accumulating operation data through operation of the tool-type sensor, the data can be displayed in various formats. For example, current operation data can be compared with previously recorded data by displaying them in an overlapping manner. Visualizing operation data is indispensable for extracting the essentials of a skill.
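As a sketch of such a comparison display (the data are synthetic, and the RMS metric is the author's example, not specified in the patent), a trainee's recorded force trace can be overlaid on a stored reference trace and their difference summarized:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 101)                      # time axis [s]
reference = np.sin(2 * np.pi * t)                   # stored expert trace [N]
trainee = reference + 0.1 * np.cos(2 * np.pi * t)   # trainee's measured trace

# RMS difference between the overlaid traces as a simple skill metric
rms_diff = np.sqrt(np.mean((trainee - reference) ** 2))
print(round(float(rms_diff), 3))   # prints 0.071
```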
  • Figure 5 shows an example of a three-dimensional display of fog and pen pressure (three-axis force) taken using a pen-type six-axis force sensor.
  • 1-2-2 Objects based on real data. Model building
  • The haptic presentation system of the present invention is positioned as a "next generation force presentation device" that records and reproduces the mechanical interaction among the body, the tool, and the environment; its configuration is described below.
  • The feature of the haptic presentation system of the present invention is that the mechanical interaction between the tool and the environment can be measured. This makes possible 1) visualization of technique, 2) construction of object/environment models based on actual measurement, and 3) presentation of augmented haptic reality.
  • Fig. 6 is a system diagram of virtual model construction based on actual measurement. A real object is manipulated using a tool-type sensor.
  • For example, a pen-type sensor attached to the tip of the haptic interface can be used to write characters on paper, a knife-type sensor attached to the tip can be used to cut an object, or a button or switch can be pressed with the tip link of the haptic interface.
  • The actual measurement calculation unit of the system measures the external force applied to the pen, scalpel, or tip link and the movement of the pen, scalpel, or tip link during manipulation of the object, and records them in the data storage device.
  • The reaction force that the object in the virtual space receives from the environment is calculated by the interaction calculation,
  • and this reaction force is presented to the operator by the haptic interface.
  • In this way, the physical properties of the object in the real space can be reflected on the spot in the model in the virtual space and presented to the operator.
  • Suppose the operation is now transferred to an object different from the object previously used.
  • The physical characteristics of this object are different from the physical characteristics of the previously registered object model.
  • As an example, consider a button whose spring coefficient is non-linear, a function of the button push amount Δx.
  • Let the spring coefficient of the button registered in the past be km(Δx),
  • and let the spring coefficient of the actual button be kr(Δx).
  • The reaction force fr felt when a person presses the real button is given by equation (1): fr = kr(Δx)·Δx.
  • Here fr is the reaction force felt when a person presses the button,
  • Δx is the push amount of the button,
  • and Δx is measured by the haptic interface.
  • The control unit generates a force fh corresponding to the push amount Δx of the button by equation (2), fh = (km(Δx) − kr(Δx))·Δx, and presents it to the operator.
  • The force fm felt by the operator is the sum of the reaction force fh generated by the haptic interface and the reaction force fr actually received from the button, fm = fh + fr = km(Δx)·Δx. Despite pressing the actual button with spring coefficient kr(Δx), the operator feels as if pressing the previously registered button with spring coefficient km(Δx).
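The relation fm = fh + fr = km(Δx)·Δx above can be sketched as follows. The concrete coefficient functions `km` and `kr` and the push amount are illustrative assumptions, not values from the patent:

```python
# Sketch of the superposition in equations (1)-(2): pressing a real button
# with spring coefficient kr while the haptic interface adds the difference
# force so that the operator feels the registered coefficient km instead.

def km(dx):
    # Registered (virtual) spring coefficient, nonlinear in dx (assumed form).
    return 2.0 + 0.5 * dx

def kr(dx):
    # Spring coefficient of the real button being pressed (assumed form).
    return 1.2 + 0.1 * dx

def real_reaction(dx):
    """Equation (1): reaction force fr received from the real button."""
    return kr(dx) * dx

def correction_force(dx):
    """Equation (2): force fh generated by the haptic interface."""
    return (km(dx) - kr(dx)) * dx

dx = 3.0                      # button push amount [mm], assumed
fr = real_reaction(dx)
fh = correction_force(dx)
fm = fh + fr                  # total force felt by the operator
assert abs(fm - km(dx) * dx) < 1e-9   # operator feels the km button
```

The point of the sketch is that the correction term cancels kr exactly, so the felt stiffness depends only on the registered model.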
  • Not only the spring characteristics of nearby buttons and switches can be modified in this way.
  • Superimposing measured data and virtual data can also, for example, modify the frictional characteristics between pen and paper when writing characters to improve the writing comfort, or modify the impedance characteristics when cutting an object with a scalpel to create a different cutting sensation; for example, the impedance characteristics when human skin is cut with a scalpel can be recorded in advance.
  • FIG. 8 is a system configuration diagram that realizes superimposition of measured data and virtual data.
  • A real object is manipulated using the haptic interface.
  • For example, a pen-type sensor attached to the tip of the haptic interface can be used to write characters on paper, a knife-type sensor attached to the tip can be used to cut an object, or a button or switch can be pressed with the tip link of the haptic interface.
  • The actual measurement calculation unit of the control unit measures the external force applied to the pen, scalpel, or tip link and the movement of the pen, scalpel, or tip link during manipulation of the object, and the data are recorded in the data storage device.
  • The movement of the tool-type sensor, that is, the operation of the haptic interface, is also used to operate the object model in the virtual space, and the interaction calculation between the object and the tool in the virtual space is performed.
  • There are the following two methods for superimposing measured data and virtual data.
  • The first computes the modified reaction force by directly comparing, in the superposition process,
  • the reaction force calculated from the model with the measured reaction force.
  • Taking the button press as an example, this is a method that does not estimate the physical parameters of the actual object.
  • The button push amount Δx is input from the haptic interface to the virtual space, and the model reaction force is calculated by equation (4) in the virtual space interaction calculation and input to the superposition processing unit.
  • The reaction force fr measured by the control unit when the actual button is pressed is also input to the superposition processing unit (Fig. 8(a)).
  • In the superposition processing unit, the reaction force fh is calculated by equation (5) and input to the haptic interface as an interaction correction amount.
  • The haptic interface presents the reaction force fh of equation (5) to the operator.
  • The operator thus feels, as the reaction force, the sum of the reaction force fh generated by the haptic interface and the reaction force fr actually received from the button.
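One control cycle of this first method can be sketched as below. The model reaction force, its assumed coefficient, and the sensor readings are illustrative assumptions:

```python
# Sketch of the first superposition method: the measured reaction force fr
# is compared directly with the model reaction force; no physical parameter
# of the real button is estimated.

def model_reaction(dx):
    """Model reaction force computed in the virtual space (equation (4))."""
    km = 2.0 + 0.5 * dx   # registered nonlinear spring coefficient (assumed)
    return km * dx

def corrected_force(dx, fr_measured):
    """Equation (5): interaction correction fh sent to the haptic interface."""
    return model_reaction(dx) - fr_measured

# One control cycle: push amount from the interface, force from the sensor.
dx, fr = 2.0, 2.6          # assumed sensor readings
fh = corrected_force(dx, fr)
# The operator feels fh + fr, i.e. exactly the model reaction force.
assert abs((fh + fr) - model_reaction(dx)) < 1e-9
```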
  • The second method compares, in the superposition process, the physical parameters of the object calculated from the model
  • with the physical parameters calculated in real time by the model estimator from the measured data,
  • and computes the modified force from the comparison.
  • The push amount Δx when the button is actually pressed is measured by the haptic interface.
  • The measured push amount Δx is input to the virtual space, the spring coefficient km(Δx) of the model button is calculated by the virtual space interaction calculation, and input to the superposition process.
  • The spring coefficient kr(Δx) of the real button is calculated by the model estimator from the measured data of the haptic interface and input to the superposition process (Fig. 8(b)).
  • The button push amount Δx is also input to the superposition process (Fig. 8(a)).
  • In the superposition processing unit, the reaction force fh is calculated by equation (6) and input to the haptic interface as an interaction correction amount.
  • By having the haptic interface present the reaction force fh of equation (6) to the operator, the operator feels as the reaction force the sum of the reaction force fh generated by the haptic interface and the reaction force fr actually received from the button.
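The second method can be sketched as below. The least-squares estimator and all sample values are assumptions; the patent does not specify the estimation algorithm used by the model estimator:

```python
# Sketch of the second superposition method: a model estimator fits the real
# button's spring coefficient kr from measured (dx, fr) pairs, and the
# correction force follows the form of equation (6).

def estimate_kr(samples):
    """Least-squares fit of fr = kr * dx over measured (dx, fr) pairs."""
    num = sum(dx * fr for dx, fr in samples)
    den = sum(dx * dx for dx, _ in samples)
    return num / den

def km(dx):
    # Registered spring coefficient of the model button (assumed constant).
    return 2.0

samples = [(1.0, 1.5), (2.0, 3.0), (3.0, 4.5)]   # assumed measurements
kr_est = estimate_kr(samples)                     # fits kr = 1.5 here

dx = 2.5
fh = (km(dx) - kr_est) * dx     # equation (6): interaction correction
fm = fh + kr_est * dx           # force the operator feels
assert abs(fm - km(dx) * dx) < 1e-9
```

Compared with the first method, only the parameter estimate (not the raw force signal) crosses into the superposition process, which makes the correction less sensitive to sensor noise at the cost of an estimation step.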
  • FIG. 14 shows examples of handwriting and pen pressure (triaxial force) displayed in 3D.
  • The data of subject n are shown in Fig. 14 with the viewpoint changed from n1 to n5,
  • and the data of subject w are shown in Figure 15 with the viewpoint changed from w1 to w5.
  • The perspectives of both figures are the same; the display is rotated clockwise about the vertical axis of the katakana character "i", with n1 as 0°.
  • (1) n1 is displayed from the front with respect to the paper surface, (2) n2 at 36°, (3) n3 at 72°, (4) n4 at 108°, and (5) n5 at 144°. The same applies to w1 to w5.
  • Fig. 16 shows an example of the display at viewpoint n2.
  • The strength and direction of the writing pressure are displayed as three-dimensional vectors drawn as hairlines, and the speed at which the brush is carried is represented by the spacing between the hairlines: the shorter the hairline spacing, the slower the brush stroke.
  • The handwriting of subject n shown in Fig. 14 shows slow strokes with strong writing pressure,
  • while the handwriting of subject w shown in Fig. 15 shows low writing pressure and fast strokes.
  • The direction in which the pen pressure is applied is also indicated by the hairlines.
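The hairline representation just described reduces to simple geometry, sketched below without a plotting library. The sample trajectory, forces, and scale factor are assumptions:

```python
# Sketch of the hairline display: each sample of the pen trajectory gets a
# 3-D segment whose direction and length encode the triaxial pen pressure,
# and stroke speed is read from the spacing of consecutive samples.

SCALE = 0.5   # hairline length per unit force (assumed display constant)

def hairlines(positions, forces, scale=SCALE):
    """Return (start, end) endpoint pairs for the force-vector hairlines."""
    segs = []
    for p, f in zip(positions, forces):
        end = tuple(pi + scale * fi for pi, fi in zip(p, f))
        segs.append((p, end))
    return segs

def spacings(positions):
    """Distance between consecutive samples: small spacing = slow stroke."""
    out = []
    for a, b in zip(positions, positions[1:]):
        out.append(sum((bi - ai) ** 2 for ai, bi in zip(a, b)) ** 0.5)
    return out

pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
frc = [(0.0, 0.0, -2.0), (0.0, 0.0, -1.0), (0.0, 0.0, -0.5)]
segs = hairlines(pos, frc)
assert segs[0][1] == (0.0, 0.0, -1.0)   # strong pressure -> long hairline
assert spacings(pos) == [1.0, 2.0]      # wider spacing -> faster stroke
```

The endpoint pairs would then be handed to any 3-D line renderer; the encoding itself is independent of the toolkit.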
  • Such actual handwriting can be displayed with the viewpoint changed three-dimensionally, and different recordings can be displayed for comparison.
  • If the handwriting of subject n is used as a model, a comparative sensation can be presented to subject w.
  • Paper under different conditions can also be set, model handwriting for subject n built under those conditions, and the corresponding sense of movement presented to subject w.
  • According to the present invention, it is possible to acquire operation data using a hand tool very close to the actual one, to operate against a model close to the actual experience, and to train with the presented force compared against the model. Alternatively, proficiency can be improved by recording one's own operation data.
  • The haptic presentation system proposed in the present invention is composed of a haptic interface combining a tool-type force sensor with a small arm (see FIG. 3). This makes it possible not only to present haptic data but also to measure the mechanical interaction between the tool and the environment.
  • Below, an example is shown in which the haptic data when a real button is pressed are measured, reflected on the spot in the virtual space, and presented to the operator.
  • Fig. 17 shows the virtual space model used in the push button data recording and playback experiment.
  • The push button is represented by a cylinder described by radius r [mm] and height h [mm], and has a stroke s [mm] as an attribute.
  • Fig. 17 also shows the proxy, defined as the action point of the tool-type sensor, which moves in the virtual space in conjunction with the movement of the haptic interface. When the operator manipulates the tool-type sensor and presses the virtual button with the proxy, the button is depressed within the range of its stroke.
  • Figure 18 shows data recording the displacement and reaction force of the button when it is pressed through the haptic interface.
  • Figure 18 is the Force-Displacement curve obtained when a key of a keyboard (ELECOM TK-U12FYALBK) is pressed.
  • The upper curve in Fig. 18 is the data when the key is pressed, and the lower curve is the data when the key is released. The Force-Displacement curve thus differs between pressing and releasing, drawing a hysteresis loop.
  • The Force-Displacement curve was constructed by discretizing the displacement of the button at regular intervals and calculating the reaction force for each displacement by interpolating between the sampled data points.
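The curve construction just described can be sketched as follows, assuming piecewise-linear interpolation between sampled points (the exact interpolation is not stated here) and illustrative sample data:

```python
# Sketch of the Force-Displacement lookup: recorded (displacement, force)
# samples for the press branch and the release branch are evaluated by
# linear interpolation, and playback picks a branch from the direction of
# motion, reproducing the hysteresis seen in Fig. 18.

def interp(x, pts):
    """Piecewise-linear interpolation over sorted (x, y) sample points."""
    if x <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return pts[-1][1]

# Assumed recorded samples: the press branch lies above the release branch.
press   = [(0.0, 0.0), (1.0, 0.6), (2.0, 0.5), (3.0, 1.2)]
release = [(0.0, 0.0), (1.0, 0.3), (2.0, 0.25), (3.0, 1.2)]
stroke  = press[-1][0]   # maximum displacement used as the button stroke

def playback_force(x, velocity):
    branch = press if velocity >= 0 else release   # pressing vs releasing
    return interp(min(x, stroke), branch)

assert playback_force(0.5, +1.0) == 0.3    # on the press branch
assert playback_force(0.5, -1.0) == 0.15   # release branch is softer
```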
  • The maximum displacement of the Force-Displacement curve was used as the stroke of the virtual button.
  • The reaction force presented from the recorded button data was divided into the following five conditions (expressed in equations (7) to (11)) according to the contact state between the proxy and the virtual button.
  • In equations (7) to (11), f is the force that the haptic interface presents to the user, and f_press and f_release are respectively
  • the forces at the time of pressing and releasing taken from the Force-Displacement curve (Figure 18).
  • Kp and Kv are servo gains,
  • the remaining variable is the displacement after reaching the bottom of the button,
  • Vc is the speed in the direction of pressing the button,
  • and f_release_max is the maximum value of f_release.
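Since equations (7) to (11) themselves are not legible in this text, the following is only one plausible reading of the five-condition playback law, reconstructed from the variables listed above; the branch structure, gains, and bottom-wall model are assumptions:

```python
# Hypothetical playback law: within the stroke, replay the press or release
# branch of the Force-Displacement curve depending on motion direction; at
# the bottom, a spring-damper servo (gains Kp, Kv) on the over-travel makes
# the bottom feel like a stiff wall; off the button, present no force.

def presented_force(x, stroke, vc, f_press, f_release, f_release_max,
                    Kp=5.0, Kv=0.1):
    """Force f presented by the haptic interface during button playback.

    x: button displacement, vc: speed in the pressing direction,
    f_press / f_release: branch lookups from the recorded curve.
    """
    if x <= 0.0:
        return 0.0                               # proxy not touching button
    if x < stroke:
        return f_press(x) if vc > 0 else f_release(x)
    over = x - stroke                            # displacement past bottom
    return f_release_max + Kp * over + Kv * vc   # stiff-wall servo term
```

With simple linear branches, pressing at mid-stroke returns the press-branch force, releasing returns the softer release-branch force, and over-travel adds the servo term on top of f_release_max.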
  • Figure 19 shows the data during button data playback.
  • Fig. 19(a) shows the displacement of the virtual button when the user manipulates the haptic interface and presses the virtual button,
  • Fig. 19(b) shows the force presented to the user by the haptic interface,
  • and Fig. 19(c) shows the relationship between the virtual button operation displacement and the presented force.
  • The peak in Fig. 19(b) occurs when the proxy reaches the bottom of the button. The feeling of the data reproduction was satisfactory while the proxy was within the stroke range of the button.
  • Here, button press data were used as an example of "virtual model construction based on actual measurement".
  • The playback results were all produced online based on the recorded data:
  • the button data measured immediately beforehand were reflected in the virtual space and presented to the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A reaction force presentation method that compares a virtual model, constructed in a computer, with an actually measured output value obtained from a haptic interface having a tool-type force sensor, so as to obtain a difference, and that presents the difference to a real-article operating tool. The method calculates the difference between the reaction force applied to the virtual model according to an operation amount and the reaction force applied to the real operating tool, and presents the difference as a reaction force. The invention also concerns a force presentation system capable of recording and reproduction via the haptic interface.
PCT/JP2007/074188 2006-12-13 2007-12-11 Reaction force presentation method and force presentation system Ceased WO2008072756A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008549389A JP5177560B2 (ja) 2006-12-13 2007-12-11 Reaction force presentation method and force sense presentation system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-336138 2006-12-13
JP2006336138 2006-12-13

Publications (1)

Publication Number Publication Date
WO2008072756A1 true WO2008072756A1 (fr) 2008-06-19

Family

ID=39511767

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/074188 Ceased WO2008072756A1 (fr) 2006-12-13 2007-12-11 Reaction force presentation method and force presentation system

Country Status (2)

Country Link
JP (1) JP5177560B2 (fr)
WO (1) WO2008072756A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009187550A (ja) * 2008-02-04 2009-08-20 Gwangju Inst Of Science & Technology Haptic interaction method in augmented reality and system therefor
JP2011238068A (ja) * 2010-05-11 2011-11-24 Nippon Hoso Kyokai <Nhk> Virtual force sense presentation device and virtual force sense presentation program
JP2014109818A (ja) * 2012-11-30 2014-06-12 Hiroshima Univ Force sense presentation system
JP2015127965A (ja) * 2015-01-06 2015-07-09 Mitsubishi Precision Co., Ltd. Method for generating a surgical simulation model, surgical simulation method, and surgical simulator
JP2015146194A (ja) * 2009-06-25 2015-08-13 Samsung Electronics Co., Ltd. Virtual world processing apparatus and method
CN115795731A (zh) * 2022-11-30 2023-03-14 Nanjing University of Posts and Telecommunications Hysteresis modeling method for a pneumatic soft robotic arm based on the P-I model

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6010729B2 (ja) * 2013-07-31 2016-10-19 Kurimoto, Ltd. Force sense presentation system
KR20200070607A (ko) * 2018-12-10 2020-06-18 (주)리얼감 Force feedback method and system using intensity, machine-readable storage medium
US11396103B2 (en) * 2020-03-10 2022-07-26 Samsung Electronics Co., Ltd. Method and apparatus for manipulating a tool to control in-grasp sliding of an object held by the tool

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0181547U (fr) * 1987-11-20 1989-05-31
JPH10177387A (ja) * 1996-10-18 1998-06-30 Yamaha Corp Force sense drive device, force sense imparting method, and recording medium
JP2000259074A (ja) * 1999-03-11 2000-09-22 Minolta Co Ltd Tool-mediated force sense presentation system
JP3624374B2 (ja) * 2000-12-12 2005-03-02 National Institute of Advanced Industrial Science and Technology Force sense presentation device
JP3802483B2 (ja) * 2002-12-26 2006-07-26 Dai Nippon Printing Co., Ltd. Calligraphy learning support system, computer, program, and recording medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4273464B2 (ja) * 2006-07-28 2009-06-03 National Institute of Advanced Industrial Science and Technology Gripping tool for a six-axis force sensor


Also Published As

Publication number Publication date
JPWO2008072756A1 (ja) 2010-04-02
JP5177560B2 (ja) 2013-04-03

Similar Documents

Publication Publication Date Title
JP6049788B2 (ja) Virtual tool operation system
WO2008072756A1 (fr) Reaction force presentation method and force presentation system
Dipietro et al. A survey of glove-based systems and their applications
CN109979600A (zh) Virtual-reality-based orbital surgery training method, system, and storage medium
JP3624374B2 (ja) Force sense presentation device
Daniulaitis et al. Medical palpation of deformable tissue using physics-based model for haptic interface robot (HIRO)
CN106156398A (zh) Apparatus and method for computer-assisted simulation of surgical operations
Alhalabi et al. Medical training simulation for palpation of subsurface tumor using HIRO
KR101092372B1 (ko) 하이브리드 의료 시뮬레이션 시스템 및 방법
Wei et al. Augmented optometry training simulator with multi-point haptics
Ershad et al. Adaptive surgical robotic training using real-time stylistic behavior feedback through haptic cues
Suzuki et al. Simulator for virtual surgery using deformable organ models and force feedback system
CN106575486B (zh) Medical surgery simulator
Suzuki et al. Virtual surgery system using deformable organ models and force feedback system with three fingers
Burdea Virtual reality and robotics in medicine
Pihuit et al. Hands on virtual clay
JP7467293B2 (ja) Simulation system and simulation method
CN2733316Y (zh) Force acquisition device for a scalpel
Liu et al. Force modeling for tooth preparation in a dental training system
Yi et al. The implementation of haptic interaction in virtual surgery
Wang et al. Virtueledent: A compact xr tooth-cutting training system using a physical emr-based dental handpiece and teeth model
Wagner Force feedback in surgery: physical constraints and haptic information
Poon et al. A novel user-specific wearable controller for surgical robots
CN1677064A (zh) Force acquisition device for a scalpel
Patel Enhancing Virtual Reality Training with Haptic Feedback: A Taxonomy of Devices and Development of a Tactile Interface for Vr-Based Manufacturing Training

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07850677

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008549389

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07850677

Country of ref document: EP

Kind code of ref document: A1