
WO2007019546A2 - System, device and methods for simulating surgical wound debridement - Google Patents


Info

Publication number
WO2007019546A2
Authority
WO
WIPO (PCT)
Prior art keywords
wound
simulated
human body
debridement
simulator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2006/031063
Other languages
English (en)
Other versions
WO2007019546A3 (fr)
Inventor
Lee A. Belfore II
Jenifer Seevinck
Frederick D. McKenzie
Mark W. Scerbo
Hector Garcia
Sylvia Girtelschmid
Emre Baydogan
Wesley Adam Taggart
R. Bowen Loftin
Jessica R. Crouch
Yuzhong Shen
Leonard J. Weireter
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Old Dominion University
Original Assignee
Old Dominion University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Old Dominion University filed Critical Old Dominion University
Publication of WO2007019546A2 publication Critical patent/WO2007019546A2/fr
Anticipated expiration legal-status Critical
Publication of WO2007019546A3 publication Critical patent/WO2007019546A3/fr
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Definitions

  • the present invention is related to the field of computer-based simulation and, more particularly, to the simulation of surgical procedures.
  • Wound debridement refers generally to procedures for removing necrotic, devitalized, or contaminated tissue, and/or removing foreign objects from a patient's wound. Successful wound debridement promotes healing of the wound.
  • Surgical wound debridement is typically the fastest method of performing debridement. It tends to be the most selective method in the sense that a surgeon has virtually complete control over which tissue is removed and which is left intact. Surgical wound debridement is typically the best method of debridement for wounds afflicted with a large amount of necrotic tissue. It similarly is often the preferred method when infected tissue must also be removed.
  • the present invention provides a system, device, and related methods for simulating surgical wound debridement.
  • the invention can be used for training medical and non-medical personnel, such as emergency and combat personnel, who are called upon to perform wound debridement under various conditions.
  • the invention provides realistic models, visual graphics, and haptic sensations that result in an effective learning experience.
  • the invention simulates various aspects of wound debridement, including wound cleaning, tissue deformation, and foreign-body extractions.
  • the simulative experience afforded by the invention can further include sequencing guidance, performance evaluation and feedback during training sessions, and overall performance assessments designed to test the competency of personnel in performing wound debridements.
  • One embodiment of the invention is a virtual reality simulator that incorporates three-dimensional modeling of portions of the human body that exhibit realistic responses to surgical procedures performed on the body.

BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic diagram of a wound debridement simulator, according to one embodiment of the invention.
  • FIG. 2 is a flowchart illustrative of a method of simulating wound debridement, according to another embodiment of the invention.
  • FIG. 3 is a schematic side view of a haptic display device, according to yet another embodiment of the invention.
  • FIG. 4 is a schematic top view of the haptic display device in FIG. 3.
  • FIG. 5 is a schematic front view of the haptic display device in FIG. 3.
  • the invention is a simulator for training medical and non-medical personnel to successfully perform various procedures associated with the debridement of a wound under a variety of circumstances.
  • the simulator, more particularly, provides a realistic virtual-reality environment that creates for the user a simulated image of a portion of a human body that has suffered a wound. While viewing the wound, the user can operate the simulator to simulate cleaning the wound, deforming and removing tissue, and extracting foreign objects such as shrapnel or shards of glass from the wound.
  • the simulator simulates the look and feel of an actual performance of these procedures.
  • the user employs one or more simulative instruments, such as a brush, scalpel, forceps, scissors, and/or irrigator, for performing the procedures. If two or more simulative instruments are used, the user can employ them concurrently with one another. Additionally, the simulator can simulate reactions of the body to the simulative performance of the various procedures. The simulator, moreover, can be programmed to generate different bodily reactions in relation to the procedures.
  • the bodily reactions can include a change with respect to the geometry of the body as well as with respect to the image on the surface.
  • the geometry corresponds to a three-dimensional (3D) model of the body.
  • the 3D model deforms in response to and/or is modified by the simulated surgical processes, such as a deformation that arises from a simulated cutting.
  • An image of the wound, as well as changes thereto (e.g., bleeding), is projected onto, or "painted" on, the surface.
  • the surface, more particularly, is the surface of the 3D model. Orientation of the imaged body or body part is the result of a mathematical transformation that is built into a graphics API.
  • the simulative performance of the various procedures provides a user with an effective learning experience, albeit one that neither requires the supervision of an experienced practitioner nor entails risks to an actual wound victim.
  • the simulator can provide to the user procedural sequencing, performance feedback during the simulative performance of the procedures, and post- performance evaluation of the user's performance.
  • the simulator 100 for simulating procedures relating to surgical debridement of a wound, according to one embodiment, is schematically illustrated in FIG. 1.
  • the simulator 100 illustratively includes a visual display 102, a modeling system 104 in communication with the visual display, and a haptic device 106 in communication with the modeling system 104. Additionally, the simulator 100 includes a training module 108 in communication with both the modeling system 104 and the haptic device 106.
  • the simulator 100 can, according to another embodiment, also include a recordation module 112 in communication with both the modeling system 104 and the haptic device 106, as well as an evaluation module 114 in communication with the recordation module.
  • the visual display 102 can comprise, for example, a liquid crystal display (LCD), cathode ray tube (CRT) monitor, or similar type of computer-based imaging screen for generating a visual image.
  • LCD - liquid crystal display
  • CRT - cathode ray tube
  • the user can wear stereo-optic glasses, such as the CrystalEyes®3 glasses made by StereoGraphics Corporation of San Rafael, California, to view the visual image as a 3-D image.
  • the visual display 102 displays a simulated rendering of a portion of a human body upon which a wound has been inflicted.
  • the simulated rendering can be changed based upon user input, the user input causing a change from an image of one particular portion of a human body to another.
  • the visual display 102 can be used to render images of different wounded body portions.
  • the visual display 102 also can be changed according to user input to render different types of wounds, including, for example, a gunshot wound, shrapnel wound, or other type of wound.
  • the modeling system 104 in communication with the visual display 102 causes the animated rendering to change in response to a simulated touching of the portion of the human body.
  • the modeling system 104, which generates the actual simulation of procedures, is realized by integrating multiple, distinct modules.
  • One integrated module is a module for modeling the tissue.
  • the model, for example, can be a physics-based tissue model. Accordingly, the modeling module can implement at least two different physics models for modeling tissue. The first is a mass-spring model (MSM). The second is a finite-element model (FEM).
  • MSM - mass-spring model
  • FEM - finite-element model
  • the tissue modeling system also includes a collision detection module. Collision detection is implemented to generate responses to simulated surgical procedures. For example, a scalpel "collides" with tissue when the scalpel intersects the "skin" of the imaged model. Collisions can similarly result from a simulated cutting, probe, or other procedure. A result of a collision can be blood flowing on the surface, or one glass shard being pushed against another.
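  • By way of illustration only, the short Python sketch below shows one common way such a collision test can be realized: a penetration check between an instrument tip point and a single triangle of the skin mesh. The function name, the NumPy dependency, and the simplified point-versus-triangle test are assumptions made for this sketch rather than details taken from the patent; in a full simulator the same test would be run against every candidate triangle near the tip after a broad-phase culling step.

```python
import numpy as np

def tip_penetrates_triangle(tip, a, b, c):
    """Check whether an instrument tip has passed below a skin triangle (a, b, c).

    Returns (hit, depth): hit is True when the tip lies under the triangle's
    outward-facing plane and projects inside the triangle; depth is how far
    the tip has penetrated along the surface normal.
    """
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)          # outward unit normal of the skin patch
    depth = float(np.dot(a - tip, n))  # positive once the tip is below the surface
    if depth <= 0.0:
        return False, 0.0
    p = tip + depth * n                # project the tip back onto the triangle plane
    # Barycentric containment test for the projected point
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    inside = v >= 0.0 and w >= 0.0 and (v + w) <= 1.0
    return inside, depth if inside else 0.0
```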
  • collision detection is a distinct process separate from modeling in the sense that collision detection can provide potential input to the model, the modeling system 104 providing the integration that generates the resulting response to the input. Moreover, this embodiment of the system 104 utilizes an architecture configured to support real-time updates, as well as modular software for effecting component integration to produce the desired results.
  • the modeling system 104 significantly extends conventional tissue models so as to facilitate user interaction with the model.
  • the simulation generates changes in appearance in the simulated tissue to correspond, for example, to wound cleaning, bleeding, rinsing, and treating a wound, thereby generating a realistic rendering of wound debridement.
  • the modeling system 104 thus can comprise machine-readable code for rendering portions of the human body in a manner that displays the elastic characteristics of skin, tissue, muscle, and similar such bodily components.
  • the modeling system 104 can be manifest as a physics-based model, as described above, or other model that provides a visual fidelity suitable for training.
  • the mass-spring system for animating elastic characteristics of tissue can comprise creating a three-dimensional (3-D) mesh of discrete points of the object whose elastic characteristics are to be modeled. Point masses are associated with each node of the 3-D mesh, and damped springs are associated with the mesh edges.
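  • The fragment below is a minimal sketch, in Python with NumPy, of the damped mass-spring formulation just described: point masses at the mesh nodes, damped springs on the mesh edges, advanced with a semi-implicit Euler step. The data layout, parameter names, and integrator choice are illustrative assumptions, not specifics of the patent.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step_mass_spring(x, v, masses, springs, dt):
    """Advance a damped mass-spring mesh by one time step.

    x, v    : (N, 3) node positions and velocities
    masses  : (N,) point mass associated with each mesh node
    springs : iterable of (i, j, rest_length, stiffness, damping),
              one entry per mesh edge
    """
    f = masses[:, None] * GRAVITY                 # external force on every point mass
    for i, j, rest, k, c in springs:
        d = x[j] - x[i]
        length = np.linalg.norm(d)
        direction = d / length
        rel_vel = np.dot(v[j] - v[i], direction)  # relative speed along the spring axis
        fs = (k * (length - rest) + c * rel_vel) * direction
        f[i] += fs                                # equal and opposite spring forces
        f[j] -= fs
    v_new = v + dt * f / masses[:, None]          # semi-implicit (symplectic) Euler
    x_new = x + dt * v_new
    return x_new, v_new
```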
  • the finite element model, as will be readily understood by one of ordinary skill in the art, animates such elastic characteristics by creating a three-dimensional mesh of discrete elements whose characteristics can be modeled using a tensor. The tensor characteristics are derived from the desired bulk tissue characteristics.
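  • For comparison, the sketch below computes the elastic nodal forces of a single linear tetrahedral element using a small-strain stress tensor, which is one concrete way a tensor-based finite-element description can be realized; the Lame parameters mu and lam stand in for the desired bulk tissue characteristics. All names and the small-strain simplification are assumptions made for this sketch, not the patent's formulation.

```python
import numpy as np

def tet_element_forces(x, X, mu, lam):
    """Nodal elastic forces for one linear tetrahedral finite element.

    x, X    : (4, 3) deformed and rest positions of the element's vertices
    mu, lam : Lame parameters derived from the desired bulk tissue stiffness
    Returns a (4, 3) array of forces, one row per vertex.
    """
    Dm = np.column_stack([X[i] - X[3] for i in range(3)])  # rest shape matrix
    Ds = np.column_stack([x[i] - x[3] for i in range(3)])  # deformed shape matrix
    Bm = np.linalg.inv(Dm)
    vol = abs(np.linalg.det(Dm)) / 6.0                     # rest volume of the tetrahedron
    F = Ds @ Bm                                            # deformation gradient
    strain = 0.5 * (F + F.T) - np.eye(3)                   # small-strain tensor
    stress = 2.0 * mu * strain + lam * np.trace(strain) * np.eye(3)
    H = -vol * stress @ Bm.T                               # columns: forces on vertices 0..2
    return np.vstack([H.T, -H.sum(axis=1)])                # vertex 3 balances the others
```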
  • the modeling system 104 can be made portable across diverse platforms using various known computer graphics-based libraries and toolkits.
  • the available libraries and toolkits include, for example, the OpenGL environment and the GLUT toolkit.
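  • A bare-bones render loop of the kind such libraries provide is sketched below using the PyOpenGL bindings for OpenGL and GLUT; it only opens a double-buffered window and clears it each frame, with the simulator's own drawing code merely indicated by a comment. The window size, title, and colors are arbitrary choices for the sketch and are not taken from the patent.

```python
import sys
from OpenGL.GL import glClear, glClearColor, GL_COLOR_BUFFER_BIT, GL_DEPTH_BUFFER_BIT
from OpenGL.GLUT import (GLUT_DEPTH, GLUT_DOUBLE, GLUT_RGB, glutCreateWindow,
                         glutDisplayFunc, glutInit, glutInitDisplayMode,
                         glutInitWindowSize, glutMainLoop, glutSwapBuffers)

def display():
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    # ... draw the deformable wound model here ...
    glutSwapBuffers()

def main():
    glutInit(sys.argv)
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH)
    glutInitWindowSize(800, 600)
    glutCreateWindow(b"wound debridement simulator (sketch)")
    glClearColor(0.1, 0.1, 0.1, 1.0)
    glutDisplayFunc(display)
    glutMainLoop()

if __name__ == "__main__":
    main()
```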
  • the modeling system 104 comprises a stored set of machine- readable code for rendering portions of the human body in accordance with the elastic characteristics of the various portions so rendered.
  • the machine-readable code can be stored in a memory (not shown) and executed using one or more processors (also not shown) connected with the memory, the execution generating the desired image on the visual display 102.
  • the modeling system 104 can be implemented in one or more dedicated hardwired circuits or through an integration of several distinct computing devices connected to the visual display 102.
  • the hardwired circuitry and/or integrated computing devices can be configured to generate visual renderings of portions of the human body in a manner that displays their elastic characteristics using a three-dimensional mesh of discrete points along with point masses at each node and damped springs at edges of the mesh as already described.
  • the modeling system 104 can be implemented as a combination of hardwired circuitry, computing devices, and/or machine- readable code.
  • the haptic device 106 in communication with the tissue modeling system 104 generates force feedback (i.e., a "haptic" or tactile sensation) felt by the user in response to simulated touching of the human body.
  • the simulated touching can comprise, for example, a simulated application of a scalpel, forceps, brush, scissors, or fluid from an irrigator.
  • the haptic device 106 can include a mock instrument (not shown).
  • the haptic device 106 includes a plurality of interchangeable mock instruments configured to give the user the feel of different instruments used for performing a wound debridement, including a scalpel, forceps, brush, scissors, and irrigator.
  • the haptic device 106 causes the visual display 102 to render an image of the particular instrument in juxtaposition to the portion of the human body rendered in the same image. As the user moves the mock instrument, the visual image of the instrument moves relative to the image of the human body portion also comprising part of the visual image.
  • the degrees of freedom in movement afforded to the user are determined by a mechanical interface (not shown) that is also a component of the haptic device 106.
  • the mechanical interface, more particularly, provides interfaces for input and output between the user as the mock instrument is manipulated and one or more processors (also not shown) that cause the visual image to change in response thereto.
  • the resulting position and/or orientation of the mock instrument is translated by the mechanical interface into a form suitable for interpretation by sensors of the mechanical interface.
  • the haptic device includes electronic circuits and integrated sensors that track positions and use the electronic signals generated by the system to produce force feedback.
  • the sensors track the movements of the mock instrument and provide suitable electronic signals to the one or more processors, which, in turn, process the position and/or orientation information and cause the image rendered by the visual display 102 to change accordingly.
  • the processors generate electronic signals corresponding to force feedback information, the signals being supplied to actuators coupled to the mechanical interface.
  • the actuators generate forces on members of the mechanical apparatus to provide corresponding forces on the mock instrument. The user, accordingly, experiences the forces so generated as realistic simulations of the tactile sensations experienced in performing the particular wound debridement procedure.
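  • A schematic version of this force computation appears below: a penalty ("virtual wall") force that pushes the tracked instrument tip back out of the tissue along the surface normal, with a damping term to keep contact stable. The stiffness and damping values, and the simplified single-contact model, are assumptions for the sketch and are not parameters taken from the patent.

```python
import numpy as np

def haptic_feedback_force(tip_pos, tip_vel, contact_point, surface_normal,
                          stiffness=800.0, damping=2.0):
    """Penalty force for one haptic servo tick (such loops typically run near 1 kHz).

    tip_pos, tip_vel : tracked position and velocity of the mock instrument tip
    contact_point    : point on the tissue surface nearest the tip
    surface_normal   : outward unit normal of the surface at that point
    """
    penetration = float(np.dot(contact_point - tip_pos, surface_normal))
    if penetration <= 0.0:
        return np.zeros(3)                       # no contact, no force
    normal_speed = float(np.dot(tip_vel, surface_normal))
    # Spring term pushes the tip out; damping opposes motion into the surface
    return (stiffness * penetration - damping * normal_speed) * surface_normal
```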
  • the training module 108 in communication with the tissue modeling system 104 and the haptic device 106 causes the system and device to operate in a predefined manner in response to at least one user-supplied input.
  • the training module 108 receives user input in the form of machine-readable data, entered for example via a keyboard or other input/output (I/O) device.
  • the training module 108 causes a particular portion of the human body to be rendered by the visual display 102 and to exhibit a particular type of wound.
  • the data also can cause the visual image, as well as the tactile responses associated therewith, to change. Accordingly, this induces not only a change in the image rendered by the visual display but also a change in the tactile sensations generated with the haptic device 106.
  • the training module 108 can cause the visual display 102 to render an image of a wound such as a thigh wound.
  • the training module 108 can cause the visual display 102 to render a specific type of wound, such as a bullet wound or shrapnel wound.
  • the training module 108 can cause the visual display to render particular, predefined characteristics, such as certain types of infection or excessive bleeding.
  • the different renderings can be used to simulate various conditions associated with particular types of wounds, thereby providing a more realistic as well as more varied learning experience for the user.
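  • One plausible way to parameterize such predefined scenarios is sketched below as a small Python data structure passed from the training module to the rendering and haptic components; the field names, values, and the returned settings dictionary are purely illustrative assumptions about a possible interface, not the patent's data format.

```python
from dataclasses import dataclass

@dataclass
class WoundScenario:
    body_part: str                 # e.g. "thigh", "forearm"
    wound_type: str                # e.g. "gunshot", "shrapnel"
    infected: bool = False
    excessive_bleeding: bool = False
    foreign_bodies: int = 0        # embedded shards or fragments to extract

def configure_session(scenario: WoundScenario) -> dict:
    """Translate a training scenario into settings for the renderer and haptics."""
    return {
        "model": f"{scenario.body_part}_{scenario.wound_type}",
        "bleed_rate": 2.0 if scenario.excessive_bleeding else 0.5,
        "infected_tissue": scenario.infected,
        "foreign_bodies": scenario.foreign_bodies,
    }

# Example: a shrapnel wound to the thigh with two embedded fragments.
settings = configure_session(WoundScenario("thigh", "shrapnel", foreign_bodies=2))
```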
  • the training module 108 is implemented in machine-readable code.
  • the training module can be configured to run on a general-purpose or application-specific computing device 110 having one or more processors and memory elements, as will be readily understood by one of ordinary skill in the art.
  • the training module 108 can be implemented in one or more dedicated hardwired circuits, or as a combination of hardwired circuitry and machine-readable code.
  • the simulator 100 additionally includes a recordation module 112 in communication with both the tissue modeling system 104 and the haptic device 106.
  • the recordation module 112 records simulated responses of the portion of the human body to the simulated touching.
  • the recordation effected with the recordation module 112 provides a record of the responses induced by the particular manner in which the user performs one or more procedures for accomplishing wound debridement. If the particular wound debridement procedure or procedures are performed well, the responses generated are accordingly positive in nature. Conversely, if one or more of the procedures are not performed satisfactorily, the record will reflect the substandard performance.
  • the simulator additionally includes an evaluation module 114.
  • the evaluation module 114 is illustratively in communication with the recordation module 112.
  • the evaluation module 114 generates a performance evaluation based upon the simulated responses recorded by the recordation module 112.
  • the evaluation module 114 can be used to identify techniques of the particular user in performing the simulated wound debridement. In particular, the evaluation module 114 can identify particular problems the user has with respect to performing one or more of the procedures related to wound debridement.
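  • The sketch below indicates, in simplified form, how a recordation log and an evaluation pass over it might fit together: each simulated response is time-stamped as it occurs, and a toy scoring rule afterwards checks step ordering and penalizes removal of healthy tissue. The event names, required sequence, and scoring thresholds are invented for the sketch and are not drawn from the patent.

```python
import time

class RecordationLog:
    """Time-stamped record of simulated responses during a training session."""

    def __init__(self):
        self.events = []

    def record(self, event, **details):
        self.events.append({"t": time.monotonic(), "event": event, **details})

def evaluate(log, required=("clean", "excise_necrotic", "extract_foreign_body", "irrigate")):
    """Toy performance evaluation over a recordation log."""
    performed = [e["event"] for e in log.events if e["event"] in required]
    sequence_ok = performed == list(required)
    healthy_removed = sum(e.get("healthy_tissue_removed", 0.0) for e in log.events)
    score = (50 if sequence_ok else 0) + max(0, 50 - int(100 * healthy_removed))
    return {"sequence_ok": sequence_ok,
            "healthy_tissue_removed": healthy_removed,
            "score": score}
```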
  • the simulator 100 additionally, or alternatively, includes a wound debridement procedures module 116.
  • the wound debridement procedure module 116 can be communicatively linked to the visual display 102 and/or an audio rendering device incorporated in the simulator 100.
  • the wound debridement procedure module 116 generates user guidance, in the form of visual and/or audio output, for performing a predefined wound debridement procedure. Accordingly, the wound debridement procedure module 116 can substitute for or provide a supplement to wound debridement training by a medical professional.
  • an inexperienced user can begin immediate training by "working through" a simulated procedure under the guidance of the visual and/or audio output provided by the procedure module 116.
  • the procedure module 116 provides a particularly effective teaching mechanism. Medical and non-medical personnel alike who are inexperienced in performing procedures related to wound debridement can, as already pointed out, begin immediately working through procedures in a virtual-reality environment provided by the simulator 100. Thus, the user gains hands-on experience at the outset while receiving direct instruction from the procedure module 116. The performance of the user can be evaluated as he or she carries out a procedure.
  • performance parameters can be recorded by the recordation module 112 during the simulated performance of the wound debridement, and then evaluated by the evaluation module 114 at the conclusion of a performance of a procedure.
  • the simulator 100 can provide a mechanism for reaching a larger and more diverse group of individuals, non-medical personnel included, who have a need to be trained in the technique of wound debridement. Without the need for a patient on which to practice the procedures, inexperienced personnel can learn each of the various procedures more efficiently and more rapidly, while also avoiding risks to real-world patients.
  • the method 200 includes, at step 202, rendering a virtual-reality image of a portion of a human body upon which a wound has been inflicted.
  • the method continues at step 204 whereby the virtual-reality image is caused to change in response to a simulated touching of the portion of the human body, the touching corresponding to at least one procedure for performing a wound debridement.
  • the method 200 further includes at step 206 simulating force feedback based upon the simulated touching.
  • the step of rendering a virtual-reality image can further include causing the visual display to render an image of a particular one of a plurality of predefined portions of the human body.
  • the step of rendering a virtual-reality image further can include causing the visual display to render an image of a particular one of a plurality of predefined types of wounds.
  • the method 200 further includes recording at step 208 simulated responses of the portion of the human body to the simulated touching.
  • the method also can include, according to still another embodiment, generating at step 210 a performance evaluation based upon the simulated responses recorded. Additionally, or in lieu of steps 208 and 210, the method according to yet another embodiment can include generating user guidance for performing a predefined wound debridement procedure at step 212. The method illustratively concludes at step 214.
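  • Read as a whole, steps 202 through 214 amount to the loop sketched below; every component interface shown (renderer, model, haptics, recorder, guide) is a placeholder assumed for illustration rather than an API disclosed in the application, and the post-session evaluation of step 210 would run after the loop exits.

```python
def run_simulation(renderer, model, haptics, recorder=None, guide=None, dt=1e-3):
    """Skeleton of the method of FIG. 2 using assumed placeholder components."""
    renderer.draw(model)                               # step 202: render wounded body
    while not renderer.closed():
        tip = haptics.read_instrument_pose()           # user moves the mock instrument
        contacts = model.collide(tip)                  # simulated touching
        responses = model.update(contacts, dt)         # step 204: deform, bleed, ...
        haptics.apply_force(model.contact_force(tip))  # step 206: force feedback
        if recorder is not None:
            recorder.record_all(responses)             # step 208: record responses
        if guide is not None:
            guide.prompt_next_step(responses)          # step 212: user guidance
        renderer.draw(model)                           # refresh the virtual-reality image
```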
  • Yet another embodiment of the invention is a portable haptic display device.
  • the haptic display device can be used to display an image on a reflective surface. The image can respond interactively to manipulations using a haptic device.
  • the device can be used as an augmented/mixed-reality display device useful with a variety of haptic applications.
  • the haptic display device can be portably configured to allow easy transport in various environments.
  • the haptic display device illustratively includes a stand comprising a base portion, an extension extending from the base portion, and a holding unit connected to the extension in which a laptop computer can be positioned. Connected to the stand, as further shown, is an adjustable display surface onto which an image generated by the laptop computer can be displayed.
  • the haptic display can include a tilt adjustment for adjustably aligning the adjustable display in relation to a viewer.
  • the holding unit is adjustably connected to the base portion, and the device further includes a tilt adjustment for adjustably aligning the holding unit relative to a viewer.
  • the display surface of the device preferably is a translucent surface whose translucency can be modified to accommodate a plurality of distinct applications.
  • the holding unit is preferably configured to provide access to peripheral connections of the laptop computer when the laptop computer is positioned within the holding unit.
  • the haptic display device can optionally operate with one or more haptic systems. More particularly, the haptic display device can operate with twin or dual haptic systems in order to provide for two-handed interactions. Additional representations of an embodiment of the haptic display device are provided in the APPENDIX.
  • the present invention can be realized in hardware, software, or a combination of hardware and software.
  • the present invention can be realized in a centralized fashion in one computer system, or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
  • a typical combination of hardware and software can be a general purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
  • the present invention also can be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods.
  • Computer program in the present context means any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Instructional Devices (AREA)

Abstract

A wound debridement simulator for simulating operations associated with the surgical debridement of a wound. The simulator includes a visual display for displaying a simulated rendering of a portion of the human body upon which a wound has been inflicted. The simulator also includes a modeling system in communication with the visual display that causes the animated rendering to change in response to a simulated touching of the relevant portion of the human body. The simulator further includes a haptic device in communication with the tissue modeling system that simulates force feedback based upon the simulated touching. The simulator additionally has a training module in communication with both the tissue modeling system and the haptic device that causes the system and device to operate in a predefined manner in response to one or more user-supplied inputs.
PCT/US2006/031063 2005-08-08 2006-08-08 System, device and methods for simulating surgical wound debridement Ceased WO2007019546A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US70641405P 2005-08-08 2005-08-08
US60/706,414 2005-08-08

Publications (2)

Publication Number Publication Date
WO2007019546A2 true WO2007019546A2 (fr) 2007-02-15
WO2007019546A3 WO2007019546A3 (fr) 2009-04-02

Family

ID=37728021

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/031063 Ceased WO2007019546A2 (fr) 2005-08-08 2006-08-08 System, device and methods for simulating surgical wound debridement

Country Status (1)

Country Link
WO (1) WO2007019546A2 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010083272A1 (fr) * 2009-01-15 2010-07-22 Simquest Llc Interactive simulation of biological tissue
WO2010148078A3 (fr) * 2009-06-16 2011-07-07 Simquest Llc Hemorrhage control simulator
CN101441205B (zh) * 2008-11-17 2013-04-24 Jiangsu University of Science and Technology Test system for force-feedback haptic modeling of biological soft tissue
CN109389590A (zh) * 2017-09-28 2019-02-26 Shanghai United Imaging Healthcare Co., Ltd. Colon image data processing system and method
US10729650B2 (en) 2017-01-23 2020-08-04 United States Of America As Represented By The Secretary Of The Air Force Skin punch biopsy and wound-debridgement training model
EP4070725A1 (fr) * 2021-04-07 2022-10-12 Koninklijke Philips N.V. Prediction of ballistic forces on a subject
EP4145468A1 (fr) * 2021-09-07 2023-03-08 Toyota Jidosha Kabushiki Kaisha Injury estimation system, method, and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7202851B2 (en) * 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
US20050142525A1 (en) * 2003-03-10 2005-06-30 Stephane Cotin Surgical training system for laparoscopic procedures

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441205B (zh) * 2008-11-17 2013-04-24 Jiangsu University of Science and Technology Test system for force-feedback haptic modeling of biological soft tissue
WO2010083272A1 (fr) * 2009-01-15 2010-07-22 Simquest Llc Interactive simulation of biological tissue
WO2010148078A3 (fr) * 2009-06-16 2011-07-07 Simquest Llc Hemorrhage control simulator
US9142144B2 (en) 2009-06-16 2015-09-22 Simquest Llc Hemorrhage control simulator
US10729650B2 (en) 2017-01-23 2020-08-04 United States Of America As Represented By The Secretary Of The Air Force Skin punch biopsy and wound-debridgement training model
CN109389590A (zh) * 2017-09-28 2019-02-26 Shanghai United Imaging Healthcare Co., Ltd. Colon image data processing system and method
WO2019061202A1 (fr) * 2017-09-28 2019-04-04 Shenzhen United Imaging Healthcare Co., Ltd. System and method for processing colon image data
US11216948B2 (en) 2017-09-28 2022-01-04 Shanghai United Imaging Healthcare Co., Ltd. System and method for processing colon image data
CN109389590B (zh) * 2017-09-28 2022-02-08 Shanghai United Imaging Healthcare Co., Ltd. Colon image data processing system and method
EP4070725A1 (fr) * 2021-04-07 2022-10-12 Koninklijke Philips N.V. Prediction of ballistic forces on a subject
WO2022214408A1 (fr) * 2021-04-07 2022-10-13 Koninklijke Philips N.V. Ballistic forces on a subject
EP4145468A1 (fr) * 2021-09-07 2023-03-08 Toyota Jidosha Kabushiki Kaisha Injury estimation system, method, and program

Also Published As

Publication number Publication date
WO2007019546A3 (fr) 2009-04-02

Similar Documents

Publication Publication Date Title
Liu et al. A survey of surgical simulation: applications, technology, and education
Escobar-Castillejos et al. A review of simulators with haptic devices for medical training
Schendel et al. A surgical simulator for planning and performing repair of cleft lips
Kühnapfel et al. Endoscopic surgery training using virtual reality and deformable tissue simulation
Meier et al. Virtual reality: surgical application—challenge for the new millennium
Fried et al. The role of virtual reality in surgical training in otorhinolaryngology
Li et al. Evaluation of haptic virtual reality user interfaces for medical marking on 3D models
He et al. Robotic simulators for tissue examination training with multimodal sensory feedback
Srikong et al. Immersive technology for medical education: Technology enhance immersive learning experiences
Müller et al. The virtual reality arthroscopy training simulator
Behringer et al. Some usability issues of augmented and mixed reality for e-health applications in the medical domain
KR20050047548A (ko) Apparatus and method for creating a virtual anatomy environment
Okamura et al. Haptics in medicine and clinical skill acquisition [special section intro.]
KR100551201B1 (ko) Dental treatment training and evaluation system using a volume-model-based haptic interface
WO2007019546A2 (fr) System, device and methods for simulating surgical wound debridement
Nakao et al. Transferring bioelasticity knowledge through haptic interaction
Rosen et al. 14 Virtual Reality and Surgery
Coles et al. Haptic palpation for the femoral pulse in virtual interventional radiology
Frisoli et al. Simulation of real-time deformable soft tissues for computer assisted surgery
Perez et al. Cataract surgery simulator for medical education & finite element/3D human eye model
Tai et al. Real-time visuo-haptic surgical simulator for medical education–a review
Seevinck et al. A simulation-based training system for surgical wound debridement
Dumay Medicine in virtual environments
Gutiérrez-Fernández et al. An immersive haptic-enabled training simulation for paramedics
Dequidt et al. Vascular neurosurgery simulation with bimanual haptic feedback

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06801052

Country of ref document: EP

Kind code of ref document: A2