WO2009049282A2 - Mixed simulator and uses thereof - Google Patents
Mixed simulator and uses thereof
- Publication number
- WO2009049282A2 (PCT/US2008/079687)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- simulator
- physical
- physical object
- virtual
- virtual model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/30—Anatomical models
Definitions
- a simulation provides representations of certain key characteristics or behaviors of a selected physical or abstract system. Simulations can be used to show the effects of particular courses of action.
- a physical simulation is a simulation in which physical objects are substituted for a real thing or entity. Physical simulations are often used in interactive simulations involving a human operator for educational and/or training purposes. For example, mannequin patient simulators are used in the healthcare field, flight simulators and driving simulators are used in various industries, and tank simulators may be used in military training.
- Physical simulations or objects provide a real tactile and haptic feedback for a human operator and a 3-dimensional (3-D) interaction perspective suited for learning psycho-motor and spatial skills.
- MPS mannequin patient simulator
- These MPSs have been widely adopted and consist of an instrumented mannequin that can sense certain interventions and, via mathematical models of physiology and pharmacology, the mannequin reacts appropriately in real time. For example, upon sensing an intervention such as administration of a drug, the mannequin can react by producing an increased palpable pulse at the radial and carotid arteries and displaying an increased heart rate on a physiological monitor.
- real medical instruments and devices can be used with the life-size MPSs and proper technique and mechanics can be learned.
- Physical simulations or objects are limited by the viewpoint of the user. In particular, physical objects such as anesthesia machines (in a medical simulation) and car engines (in a vehicle simulation) and physical simulators such as MPSs (in a medical simulation) remain a black box to learners in the sense that the internal structure, functions and processes that connect the input (cause) to the output (effect) are not made explicit.
- the user's point of reference in, for example, a mannequin patient simulator is from the outside looking in any direction at any object, but not from within the object.
- many visual cues such as a patient's skin turning cyanotic (blue) from lack of oxygen are difficult to simulate.
- Virtual simulations have also been used for education and training.
- the simulation model is instantiated via a display such as computer, PDA or cell phone screens, or stereoscopic, 3-D, holographic or panoramic displays.
- An intermediary device, often a mouse, joystick, or Wii™, is needed to interact with the simulation.
- Virtual abstract simulations such as transparent reality simulations of anesthesia machines and medical equipment or drug dissemination during spinal anesthesia, emphasize internal structure, functions and processes of a simulated system. Gases, fluids and substances that are usually invisible or hidden can be made visible or even color-coded and their flow and propagation can be visualized within the system.
- the simulator cannot be directly touched like a physical simulation.
- CAVE™ Cave Automatic Virtual Environment
- monitor-based 2-dimensional (2-D) or 3-D graphics or video based simulations, while easy to distribute, may lack in-context integration.
- the graphics or video based simulations can provide good abstract knowledge, but research has shown that they may be limited in their ability to connect the abstract to the physical.
- the subject invention provides mixed simulator systems that combine the advantages of both physical objects/simulations and virtual representations.
- Two modes of mixed simulation are provided.
- a virtual representation is combined with a physical simulation or object by using a tracked display capable of displaying an appropriate dynamic virtual representation as a user moves around the physical simulation or object.
- the tracked display can display an appropriate virtual representation as a user moves or uses the object or instrument.
- a virtual representation is combined with a physical simulation or object by projecting the virtual representation directly onto the physical object or simulation
- the projection includes correcting for unevenness of the surface onto which the virtual representation is projected.
- the user's perspective can be tracked and used in adjusting the projection.
- Embodiments of the mixed simulator system are inexpensive and highly portable.
- the subject mixed simulator incorporates existing mannequin patient simulators as the physical simulation component to leverage off the large and continuously growing number of mannequin patient simulators in use in healthcare simulation centers worldwide.
- the physical simulation is no longer a black box.
- a virtual representation is interposed or overlaid over a corresponding part of a physical simulation or object. Using such virtual representations, virtual simulations that focus on the internal processes or structure not visible with a physical simulation can be observed in real time while interacting with the physical simulation.
- instruments, diagnostic tools, devices, accessories and disposables commonly used with the physical object or simulator can also be tracked.
- Such other instruments and devices can be a scrub applicator, a laryngoscope, syringes, an endotracheal tube, airway devices, and other healthcare devices in the case of a mannequin patient simulator. In the case of an anesthesia machine, the instruments can include, for example, a low pressure leak test suction bulb, a breathing circuit, a spare compressed gas cylinder, a cylinder wrench or gas pipeline hose.
- the instruments can include, for example, a wrench or any other automotive accessories, devices, tools or parts.
- the resulting mixed simulation can thus take into account and react accordingly to a larger set of interactions between the user, physical object and instrument.
- This can include visualizing, through the virtual representation, effects that are not visible, explicit or obvious, such as the efficacy of skin sterilization and infectious organism count when the user is applying a tracked scrub applicator (that applies skin sterilizing agent) over a surface of the mannequin patient simulator.
- the virtual representations can be abstract or concrete. Abstract representations include, but are not limited to, inner workings of a selected object or region and can include multiple levels of detail such as surface, sub-system, organ, functional blocks, structural blocks or groups, cellular, molecular, atomic, and sub-atomic representations of the object or region.
- Concrete representations reflect typical clues and physical manifestations of an object, such as, for example, a representation of vomit or blue lips on a mannequin.
- a mixed simulation for healthcare training is particularly useful for training healthcare professionals including physicians and nurses. It will be clear, however, from the descriptions set forth herein that the mixed simulation of the subject invention finds application in a wide variety of healthcare, education, military, and industry settings including, but not limited to, simulation centers, educational institutions, vocational and trade schools, museums, and scientific meetings and trade shows.
- FIGS. 1A and 1B show a mixed simulation system incorporating a hand-held display and a physical mannequin patient simulator according to one embodiment of the present invention.
- Figure 2 shows a mixed simulation system incorporating a projector and a physical mannequin patient simulator according to one embodiment of the present invention.
- Figure 3 shows a diagram of a tracking system according to an embodiment of the present invention.
- Figures 4A-4C show a mixed simulation system incorporating a hand-held display and an anesthesia machine according to an embodiment of the present invention.
- Figure 5 shows a real-world view of a user touching an incompetent inspiratory valve and a view of an incompetent inspiratory valve in the virtual display according to an embodiment of the present invention.
- Figure 6 shows a window providing a look-at indicator available during after-action review in accordance with an embodiment of the present invention.
- Figure 7 shows a window for past interaction boxes available during after-action review in accordance with an embodiment of the present invention.
- FIG. 8 shows a block diagram of a specific implementation of the tracking system in accordance with an embodiment of the present invention.
- Figure 9 shows a functional block diagram of a mixed simulator system according to an embodiment of the present invention.
- Figure 10 illustrates transforming a 2-D VAM component to contextualized 3-D in accordance with an embodiment of the present invention.
- Figure 11 shows three states of the mechanical ventilator controls for the VAM.
- Figures 12A and 12B show pipes between components where the pipes represent the diagrammatic graph arcs.
- Figure 12A shows the VAM arcs as simple 2-D paths; and
- Figure 12B shows the arcs transformed to 3-D.
- the subject invention pertains to mixed simulations incorporating virtual representations with physical simulations or objects.
- the subject invention can be used in, for example, healthcare, industry, and military applications to provide educational and test scenarios.
- the current invention provides an in-context integration of an abstract representation (e.g., transparent reality whereby internal, hidden, or invisible structure, function and processes are made visible, explicit and/or adjustable) with the physical simulation or object.
- the subject invention provides an abstract representation over a corresponding area of the physical simulation or object.
- concrete virtual representations can be provided with respect to the physical simulation or object.
- representations can be provided for the interactions, relationships and links between objects, where the objects can be, but are not limited to, simulators, equipment, machinery, instruments and any physical entity.
- the subject mixed simulator combines advantages of both physical simulations or objects and virtual representations such that a user has the benefit of real tactile and haptic feedback with a 3-D perspective, and the flexibility of virtual images for concrete and abstract representations.
- a "concrete representation" is a true or nearly accurate representation of an object.
- An "abstract representation" is a simplified or extended representation of an object and can include features on a cut-out, cross-sectional, simplified, schematic, iconic, exaggerated, surface, sub-system, organ, functional blocks, structural blocks or groups, cellular, molecular, atomic, and subatomic level.
- the abstract representation can also include images typically achieved through medical or other imaging techniques, such as MRI scans, CAT scans, echography scans, ultrasound scans, and X-ray.
- a virtual representation and a physical simulation or object can be combined using implementations of one of two modes of mixed simulation of the present invention.
- a virtual representation is combined with a physical simulation or object by using a tracked display capable of displaying an appropriate dynamic virtual representation as a user/viewer moves around the physical simulation or object.
- the tracked display can be interposed between the viewer(s) and the physical object or simulation whereby the display displays the appropriate dynamic virtual representation as viewers move around the physical object or simulation or move the display.
- the tracked display can provide a virtual representation for the physical object or simulation within the same visual space of a user.
- the display can be handheld or otherwise supported, for example, with a tracked boom arm.
- the virtual representations provided in the display can be abstract or concrete.
- tracking can be performed with respect to the display, a user, and the physical object or simulation. Tracking can also be performed with respect to associated instruments, devices, tools, peripherals, parts and components. In one embodiment where multiple users are performing the mixed simulation at the same time, the tracking can be simultaneously performed with respect to the multiple users and/or multiple displays. Registration of the virtual objects provided in the display with the real objects (e.g., the physical object or simulator) can be performed using a tracking system. By accurately aligning virtual objects within the display with the real objects (e.g., the physical object or simulator), they can appear to exist in the same space. In an embodiment, any suitable tracking system can be used to track the user, the display and/or the physical object or simulator.
- Examples include tracking fiducial markers, using stereo images to track retro-reflective IR markers, or using a markerless system.
- a second vision-based tracker can be incorporated into embodiments of the present invention.
- an imaging device can be mounted on the display device to capture images of the physical objects. The captured images can be used with existing marker or markerless tracking algorithms to more finely register the virtual objects atop the physical objects. This can enhance the overall visual quality and improve the accuracy and scope of the interaction.
- a markerless system can use the image formed by a physical object captured by an imaging device (such as a video camera or charge coupled device) attached to the display and a model of the physical object to determine the position and orientation of the display.
- For example, if the physical object is a cylinder and the captured image shows a circle, the imaging device, and thus the display, is directly over the cylinder (plan view). If the circle appears small, the display is further away from the physical object compared to a larger appearing circle.
- Viewed from the side, the imaging device would produce a rectangle whose size would again vary with the distance of the imaging device, and thus the display, from the physical object.
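By way of illustration only (this sketch is not part of the patent text), the apparent-size reasoning above reduces to the pinhole camera relation; the function name and values below are assumptions for demonstration:

```python
# Minimal sketch: estimate display-to-object distance from apparent size,
# assuming a pinhole camera model. All names and values are illustrative.

def distance_from_apparent_size(real_size_m, apparent_size_px, focal_length_px):
    """Pinhole model: apparent_size = focal_length * real_size / distance,
    hence distance = focal_length * real_size / apparent_size."""
    return focal_length_px * real_size_m / apparent_size_px

# A 0.10 m diameter cylinder seen as an 80 px circle by a camera with a
# 900 px focal length puts the display about 1.125 m from the object:
print(distance_from_apparent_size(0.10, 80, 900))  # -> 1.125
```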
- users can view a first-person perspective of an abstract or concrete representation with a photorealistic or 3-D model of the real object or simulation.
- the photorealistic or 3-D model appears on the lens in the same position and orientation as the real physical object or simulation, as if the display were a transparent window (or a magnifying glass) and the user were looking through it.
- This concept can include implementations of a "magic lens," which shows underlying data in a different context or representation.
- Magic lenses were originally created as 2-D interfaces that are movable, semi-transparent "regions of interest" that show the user a different representation of the information underneath the lens. Magic lenses can be used for such operations as magnification, blur, and previewing various image effects. Often, each lens represented a specific effect. If the user wanted to combine effects, two lenses could be dragged over the same area, producing a combined effect in the overlapping areas of the lenses. According to embodiments of the present invention, the capabilities of multiple magic lenses can be integrated into a single system. That is, the display can provide the effects of multiple magic lenses.
- embodiments of the present invention can integrate the diagram-based, dynamic, transparent reality model into the context of the real object or physical simulator using a "see-through" magic lens (i.e., a display window).
- the display window displays a scaled high-resolution 3-D model of the object or physical simulator that is registered to the object or real simulator.
- the see-through functionality is implemented using a 3-D model of the real object or physical simulator.
- another technique may utilize a video see-through technique where abstract components are superimposed over a live photorealistic video stream.
- Figures 1A and 1B show concept pictures of a mixed simulator system incorporating a hand-held display 100 and physical mannequin patient simulator 101 according to an embodiment of the present invention.
- an abstract representation of the distribution over time of an anesthetic (shown as a darkened region)
- a user can more clearly understand the processes, functions, and structures, including those that are transparent, internal, hidden, invisible, conceptual or not easily distinguished, of the physical object or simulation.
- the models driving the distribution of anesthetic in the abstract, virtual representation determine if the anesthetic migrates to the cervical area of the spine that controls breathing and causes loss of spontaneous breathing.
- the virtual simulation communicates that information back to the MPS (mannequin patient simulator) so that the physically simulated spontaneous breathing in the MPS becomes blunted.
- FIG. 2 shows a concept view of a mixed simulator system incorporating a projector 200 and physical mannequin patient simulator 201 according to an embodiment of the present invention.
- the projector 200 can be ceiling mounted.
- An abstract representation of the distribution over time of an anesthetic (shown as a darkened region) in the spinal cavity is projected directly onto the MPS 201, which is a physical object and also a physical simulator.
- the virtual representation can be projected correcting for unevenness, or non-flatness, of the surface onto which the virtual representation is projected.
- the user's perspective can be tracked and used in adjusting the projection.
- the virtual representation can be abstract or concrete.
- a checkerboard pattern (or some other pattern) can be initially projected onto the non-flat surface and the image purposely distorted via a software-implemented filter until the projection on the non-flat surface results in the original checkerboard pattern (or other original pattern).
- the virtual representation can then be passed through the filter so that the non-flat surface does not end up distorting the desired abstract or concrete virtual representation.
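As a hedged illustration of the pattern-based pre-distortion filter described above (not part of the patent text), the following Python/OpenCV sketch treats the simplest case, where the detected corner correspondences can be modeled by a single homography; a genuinely non-flat surface would need a dense per-pixel remap built from the same correspondences:

```python
# Minimal sketch of the checkerboard pre-distortion idea, assuming a camera
# has already reported where four projected corners landed on the surface.
# All corner coordinates here are illustrative.
import cv2
import numpy as np

ideal = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])       # intended corner positions
observed = np.float32([[12, 8], [630, 22], [618, 470], [4, 462]])  # where they actually landed

# H maps observed -> ideal, so warping the outgoing image by H pre-distorts it
# such that the surface's distortion restores the intended pattern.
H, _ = cv2.findHomography(observed, ideal)

virtual = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in virtual representation
cv2.putText(virtual, "spinal block", (40, 240),
            cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 2)
to_project = cv2.warpPerspective(virtual, H, (640, 480))  # image sent to the projector
```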
- the location of the physical simulation or object can be tracked if the physical simulation or object moves, and the location of the physical simulation or object can be registered within the 3-D space if the physical simulation or object remains stationary.
- the display interposes the appropriate representation (abstract or concrete) over the corresponding area of the physical simulation or object.
- the orientation and position of the display can be tracked in a 3-D space containing a physical simulation, such as an MPS, or an object, such as an anesthesia machine or car engine.
- When placing the display between the viewer(s) and the mannequin head, as used in the first mode described above, a virtual representation that looks like the head of the mannequin (i.e., a concrete virtual representation) will be interposed over the head of the physical mannequin, and the lips in the virtual representation may turn blue (to simulate cyanosis) or vomit may spew out to simulate vomiting virtually.
- blue lip color can be projected onto the lips of the MPS to indicate the patient's skin turning cyanotic.
- these approaches provide a means to do the "messy stuff" virtually with a minimum of spills and cleanup.
- identifiable features related to different attributes and conditions such as age, gender, stages of pregnancy, and ethnic group can be readily represented in the virtual representation and overlaid over a "hard-coded" physical simulation, such as a mannequin patient simulator.
- the mixed simulator can provide the viewing of multiple virtual versions that are registered with the physical object. This is done as a means to overcome the limitations of easily and quickly modifying the physical object.
- the multiple virtual versions that are mapped to the physical object allow for training and education of many complex concepts not afforded with existing methods.
- the virtual human model registered with the physical human patient simulator can represent different gender, different size, and different ethnic patients.
- the user sees the dynamic virtual patient while interacting with the human patient simulator as inputs to the simulation.
- the underlying model to the physical simulation is also modified by the choice of virtual human, e.g. gender or weight specific physiological changes.
- An abstract representation might instead interpose a representation of the brain over the head of the MPS with the ability to zoom in to the blood brain barrier and to cellular, molecular, atomic and sub-atomic abstract representations.
- Another abstract virtual representation would be to interpose abstract representations of an anesthesia machine over an actual anesthesia machine.
- a tracking system similar to what would be used to track a display used with a physical MPS in a 3-D space is implemented for tracking a display used with a real anesthesia machine and interposing abstract representations of the internal structure and processes in an anesthesia machine.
- An example of this system is conceptually shown in Figures 3 and 8.
- the tracking system can use a computer vision technique called outside-looking-in tracking (see e.g., "Optical Tracking and Calibration of Tangible Interaction Devices," by van Rhijn et al. (Proceedings of the Immersive Projection Technology and Virtual Environments Workshop 2005), which is hereby incorporated by reference in its entirety).
- the outside-looking-in tracking technique uses multiple stationary cameras that observe special markers attached to the objects being tracked (in this case the object being tracked is the display 303).
- the images captured by the cameras can be used to calculate positions and orientations of the tracked objects.
- the cameras are first calibrated by having them all view an object of predefined dimensions. Then the relative position and orientation of each camera can be calculated. After calibration, each camera searches each frame's images for the markers attached to the lens; then the marker position information from multiple cameras is combined to create a 3-D position and orientation of the tracked object.
- embodiments of the subject system use cameras with infrared lenses and retro-reflective markers (balls) that reflect infrared light.
- This particular system uses IR cameras 301 to track fiducial markers (reflective balls 302a, 302b, and 302c) on the display 303 which could be hand-held or otherwise supported.
- the cameras 301 see only the reflective balls (302a, 302b, and 302c) in the image plane.
- Each ball has a predefined relative position to the other two balls. Triangulating and matching the balls from at least two camera views can facilitate calculation of the 3-D position and orientation of the balls. Then this position and orientation can be used for the position and orientation of the magic lens provided by the display 303.
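For illustration (not part of the patent text), the triangulation step can be sketched with OpenCV, assuming the two cameras have already been calibrated so their 3x4 projection matrices are known; all matrices and image points below are made up:

```python
# Minimal sketch of two-camera triangulation of one retro-reflective marker.
import cv2
import numpy as np

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera 1 at the origin
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])  # camera 2 offset 0.5 m

# image coordinates (normalized) of the same reflective ball in each view
x1 = np.array([[0.10], [0.20]])
x2 = np.array([[-0.15], [0.20]])

Xh = cv2.triangulatePoints(P1, P2, x1, x2)  # homogeneous 4-vector
X = (Xh[:3] / Xh[3]).ravel()                # 3-D ball position, here (0.2, 0.4, 2.0)
print(X)

# Repeating this for the three balls (302a-c), whose relative positions are
# predefined, yields the full position and orientation of the display.
```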
- the position and orientation of the display can be used to render the 3-D model of the machine from the user's current perspective. Although tracking the lens alone does not result in rendering the exact perspective of the user, it gives an acceptable approximation as long as users know where to hold the lens in relation to their head. To accurately render the 3-D machine from the user's perspective independent of where the user holds the lens in relation to the head, both the user's head position and the display can be tracked.
- the display and/or user can be tracked using an acoustic or ultrasonic method and inertial tracking, such as the InterSense IS-900.
- Another example is to track a user with a magnetic method using ferrous materials, such as the Polhemus Fastrak™.
- a third example is to use optical tracking, such as the 3rdTech HiBall™.
- a fourth example is to use mechanical tracking, such as boom interfaces for the display and/or user. FakeSpace Boom 3C (discontinued) and WindowVR by Virtual Research Systems, Inc. are two examples of boom interfaces.
- Data related to the physiological and pharmacological status of the MPS can be relayed in real time to the display so that the required changes in the abstract or concrete overlaid/interposed representations are appropriate.
- models running on the virtual simulation can send data back to the MPS and affect how the MPS runs.
- Embodiments of the subject invention can utilize wired or wireless communication elements to relay the information.
- the display providing the virtual simulation can include a tangible user interface (TUI). That is, similarly to a Graphical User Interface (GUI) in which a user clicks on buttons and slider bars to control the simulation (interactive control), the TUI can be used for control of the simulation. However, in contrast to a typical GUI, the TUI is also an integral part of that simulation - often a part of the phenomenon being simulated. According to an embodiment, a TUI provides a simulation control and represents a virtual object that is part of the simulation. In this way, interacting with the real object (e.g., the physical object or simulator or instrument) facilitates interaction with both the real world and the virtual world at the same time while helping with suspension of disbelief and providing a natural intuitive user interface.
- effects that are not visible, explicit or obvious, such as the efficacy of skin sterilization and infectious organism count can be visualized through the virtual representation when the user is applying a tracked scrub applicator over a surface of a mannequin patient simulator.
- the virtual representations can reflect the efficacy of such actions through, for example, a color map illustrating how thorough the sterilization is, and images or icons representing organisms illustrating how pervasive the infectious organism count is over time.
- a user can interact with a virtual simulation by interacting with the real object or physical simulator.
- the real object or physical simulator becomes a TUI.
- the interface and the virtual simulation are synchronized.
- the model of the gas flowmeters (specifically the graphical representation of the gas particles' flow rate and the flowmeter bobbin icon position) is synchronized with the real anesthesia machine such that changes in the rate of the simulated gas flow correspond with changes in the physical gas flow in the real anesthesia machine.
- the real machine is an interface to control the simulation of the machine and the transparent reality model visualization (e.g., visible gas flow and machine state) is synchronized with the real machine.
- users can observe how their interactions with the real machine affect the virtual model in context with the real machine.
- the overlaid diagram-based dynamic model enables users to visualize how the real components of the machine are functionally and spatially related, thereby demonstrating how the real machine works internally.
- By using the real machine controls as the user interface to the model, interaction with a pointing device can be minimized. In further embodiments, interaction with the pointing device can be eliminated, providing for a more real-world and intuitive user interaction. Additionally, users get to experience the real location, tactile feel, and resistance of the machine controls.
- the O2 flowmeter knob is fluted while the N2O flowmeter knob is knurled to provide tactile differentiation.
- the synchronization can be accomplished using a detection system to track the setting or changes to the real object or simulator (e.g., the physical flowmeters of the real anesthesia machine which correspond to the real gas flow rates of the machine).
- the detection system can include motion detection via computer vision techniques.
- the information obtained through the computer vision techniques can be transmitted to the simulation to affect the corresponding state in the simulation.
- the gas flow rates (as set by the user on the real flowmeters) are transmitted to the simulation in order to set the flow rate of the corresponding gas in the transparent reality simulation.
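A minimal sketch of this synchronization loop follows (illustrative only; `read_o2_flow_lpm` and the `Simulation` class are hypothetical stand-ins, not an API from the patent):

```python
# Hedged sketch: push the detected O2 flowmeter setting of the real machine
# into the corresponding state of the virtual simulation at ~20 Hz.
import time

class Simulation:
    """Stand-in for the transparent reality model's state (hypothetical)."""
    def __init__(self):
        self.o2_flow_lpm = 0.0

    def set_o2_flow(self, lpm):
        # A real model would re-route its animated gas particles here.
        self.o2_flow_lpm = lpm

def read_o2_flow_lpm():
    # Would come from the vision-based detector or the machine's own
    # data port (USB, serial, wireless); constant here for illustration.
    return 2.0

sim = Simulation()
for _ in range(200):                     # bounded loop for the sketch
    sim.set_o2_flow(read_o2_flow_lpm())  # keep virtual flow matched to the knob
    time.sleep(0.05)
```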
- the real object or physical simulator can provide signals (e.g. through a USB, serial port, wireless transmitter port, etc.) indicative of, or that can be queried about, the state or settings of the real object or physical simulator.
- signals e.g. through a USB, serial port, wireless transmitter port, etc.
- a 2-D optical tracking system can be employed to detect the states of the anesthesia machine.
- Table 1 describes an example set-up.
- Figure 8 shows a block diagram of a specific implementation of the tracking system.
- state changes of the input devices can be detectable as changes in 2-D position or visible marker area by the cameras.
- retro-reflective markers can be attached and webcams can be used to detect the visible area of the markers.
- the visible area of the tracking marker increases or decreases depending on the direction the knob is turned (e.g., the O2 knob protrudes out further from the front panel when the user increases the flow of O2, thereby increasing the visible area of the tracked marker).
- the pressure gauge and bag tracking system can use color based tracking (e.g., the 2-D position of the bright red pressure gauge needle).
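The visible-area idea can be sketched as follows (illustrative, not from the patent; the threshold and the area-to-setting mapping would be calibrated per installation):

```python
# Illustrative sketch: threshold the bright retro-reflective blob in a webcam
# frame and use its area as a proxy for how far the knob protrudes.
import cv2
import numpy as np

def marker_area(frame_gray, thresh=220):
    _, mask = cv2.threshold(frame_gray, thresh, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max((cv2.contourArea(c) for c in contours), default=0.0)

frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (320, 240), 30, 255, -1)  # synthetic bright marker
print(marker_area(frame))                   # roughly pi * 30**2
```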
- Embodiments of the present invention can provide a 'magic lens' for mixed or augmented reality, combining a concrete or abstract virtual display with a physical object or simulation.
- the magic lens can be incorporated in a TUI and a display device instead of a 2-D GUI.
- the user can look through the 'lens' and see the real world augmented with virtual information within the 'region of interest' of the lens.
- the region of interest can be defined by a pattern marker or an LCD screen or touchscreen, for example, of a tablet personal computer.
- the 'lens' can act as a filter or a window for the real world and is shown in perspective with the user's first-person perspective of the real world.
- the augmented-reality lens is implemented as a hand-held display tangible user interface instead of a 2-D GUI.
- the hand-held display can allow for six degrees of freedom.
- the hand-held display can be the main visual display implemented as a tracked 6DOF (six degrees of freedom) tablet personal computer.
- as shown in FIGS. 4B and 4C, to visualize the superimposed gas flowmeters, users look through a tracked 6DOF magic lens.
- the 'lens' allows users to move it or themselves freely around the machine and view the simulation from a first-person perspective, thereby augmenting their visual perception of the real machine with the overlaid VAM model graphics (e.g., portion 410).
- the relationship between the user's head and the lens is analogous to an OpenGL camera metaphor.
- the camera is positioned at the user's eye, and the projection plane is the lens; the 'lens' 400 renders the VAM simulation directly (shown as element 410) over the machine from the perspective of the user.
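For readers who want the camera metaphor made concrete, a standard generalized ("off-axis") perspective projection does exactly this: the eye is the camera position and the tracked lens quad is the projection plane. The sketch below is illustrative; the corner positions and eye position are assumptions:

```python
# Off-axis frustum from a tracked eye and tracked lens quad (illustrative).
import numpy as np

def off_axis_frustum(eye, pa, pb, pc, near=0.1, far=10.0):
    """pa, pb, pc: lower-left, lower-right, upper-left corners of the lens."""
    vr = pb - pa; vr /= np.linalg.norm(vr)         # right axis of the lens plane
    vu = pc - pa; vu /= np.linalg.norm(vu)         # up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal toward the eye
    va, vb, vc = pa - eye, pb - eye, pc - eye      # eye-to-corner vectors
    d = -np.dot(va, vn)                            # eye-to-plane distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t  # feed to glFrustum(l, r, b, t, near, far)

eye = np.array([0.0, 0.0, 0.5])    # tracked head position
pa = np.array([-0.15, -0.10, 0.0])  # tracked lens corners
pb = np.array([ 0.15, -0.10, 0.0])
pc = np.array([-0.15,  0.10, 0.0])
print(off_axis_frustum(eye, pa, pb, pc))
```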
- a first-person perspective of the VAM model (e.g., portions 409 and 410) in context with a photorealistic or 3-D model 411 of the real machine (see Figure 4B, reference 405).
- the photorealistic or 3-D machine model appears on the display providing the 'lens' in the same position and orientation as the real machine, as if the lens were a transparent window (or a magnifying glass or a lens with magic properties such as rendering hidden, internal, abstract or invisible processes visible and explicit) and the user were looking through it.
- the virtual simulation can allow users to "reset" the model dynamics to a predefined start state. All of the interactive components are then set to predefined start states.
- This instant reset capability is not possible for certain real objects. For example, for a real anesthesia machine it is not possible to reset the gas particle states (i.e. removing all the particles from the pipes) due to physical constraints on gas flows.
- the input device can serve as an interface to change the simulation visualization for actions that may not have a corresponding interaction with a physical component or may be too onerous, time-consuming, physically strenuous, impractical or dangerous to perform physically.
- With the subject tracking system, it is possible to automatically capture and record where a trainee is looking and for how long, as well as whether certain specially marked and indexed objects and instruments in the tracked simulation environment (beyond the physical simulation or object), such as airway devices, sterile solution applicators, or resuscitation bags, are picked up and used. This facilitates assessment and appropriate simulated response and greatly enhances the capabilities of an MPS or physical simulator.
- AAR after-action review
- students need repetition, feedback, and directed instruction to achieve an acceptable level of competency
- educators need assessment tools to identify trends in class performance.
- current video-based after-action review systems offer educators and students the ability to playback (i.e., play, fast-forward, rewind, pause) training sessions repeatedly and at their own pace.
- Most current after-action review systems consist of reviewing videos of a student's training experience. This allows students and educators to playback, critique, and assess performance.
- video-based after-action review systems allow educators to manually annotate the video timeline - to highlight important moments in the video (e.g. when a mistake was made and what kind of mistake). This type of annotation helps to direct student instruction and educator assessment.
- Video-based after-action review is widely used in training because it meets many of the educators' and students' educational needs.
- video-based review typically consists of fixed viewpoints and primarily real-world information (i.e. the video is minimally augmented with virtual information).
- students and educators do not enjoy the cognitive, interactive, and visual advantages of collocating real and virtual information in mixed reality.
- collocated after-action review using embodiments of the subject mixed simulator system can be provided to: (1) effectively direct student attention and interaction during after-action review and (2) provide novel visualizations of aggregate student (trainee) performance and insight into student understanding and misconceptions for educators.
- a user can control the after-action review experience from a first-person viewpoint. For example, users can review an abstract simulation of an anesthesia machine's internal workings that is registered to a real anesthesia machine. During the after-action review, previous interactions can be collocated with current real-time interactions, enabling interactive instruction and correction of previous mistakes in situ (i.e., in place with the anesthesia machine). Similar to a video-based review, embodiments of the present invention can provide recording and playback controls. In further embodiments, these recorded experiences can be collocated with the anesthesia machine and the user's current real-world experience.
- students can (1) review their performance in situ, (2) review an expert's performance for the same fault or under similar conditions in situ, (3) interact with the physical anesthesia machine while following a collocated expert guided tutorial, and (4) observe a collocated visualization of the machine's internal workings during (1), (2), and (3).
- a student or trainee can playback previous interactions, visualize the chain of events that made up the previous interactions, and visualize where the user and the expert were each looking during their respective interactions.
- the anesthesia machine is used only as an example - the principle would be similar if the anesthesia machine was replaced by, for example, a mannequin patient simulator or automotive engine.
- head-gaze or eye-gaze
- physical object or simulator states can be logged during a fault test or training exercise.
- Head-gaze can be determined using any suitable tracking method. For example, a user can wear a hat tracked with retro-reflective tape and IR sensing web cams. This enables the system to log the head-gaze direction of the user. The changes in the head-gaze and physical object or simulator states can then be processed to determine when the user interacted with the physical object or simulator.
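As an illustrative sketch (not from the patent), logging head-gaze can be as simple as recording a timestamped gaze ray and testing it against bounding boxes of tracked components; the component names and boxes below are invented:

```python
# Illustrative head-gaze logger: test the gaze ray against tracked components.
import time
import numpy as np

def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the gaze ray intersect the axis-aligned box?
    Assumes no zero components in `direction`."""
    inv = 1.0 / direction
    t1, t2 = (box_min - origin) * inv, (box_max - origin) * inv
    t_near = np.minimum(t1, t2).max()
    t_far = np.maximum(t1, t2).min()
    return t_far >= max(t_near, 0.0)

components = {  # invented bounding boxes in room coordinates (meters)
    "inspiratory_valve": (np.array([0.4, 1.1, 0.2]), np.array([0.5, 1.2, 0.3])),
    "O2_flowmeter":      (np.array([0.1, 1.0, 0.2]), np.array([0.15, 1.3, 0.25])),
}

def log_gaze(head_pos, gaze_dir, log):
    for name, (lo, hi) in components.items():
        if ray_hits_aabb(head_pos, gaze_dir, lo, hi):
            log.append((time.time(), name))  # "looked at <name> at <t>"

log = []
log_gaze(np.array([0.0, 1.6, 1.5]), np.array([0.125, -0.45, -1.275]), log)
print(log)  # [(..., 'O2_flowmeter')]
```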
- a student (or trainee) log is recorded when a student (or trainee) performs a fault test or training exercise prior to the collocated after-action review.
- Figures 5-7 illustrate an embodiment where the physical object or simulator is an anesthesia machine.
- students physically interact with the real anesthesia machine and use a 6DOF magic lens to visualize how these interactions affect the internal workings and invisible gas flows of the real anesthesia machine.
- To simulate fault behavior during training, specific faults can be physically caused in the real anesthesia machine and triggered or reproduced in the abstract simulation. For example, one fault involves a faulty inspiratory valve, which can be potentially harmful to a patient.
- the top left inset of Figure 5 shows what the student sees of a real anesthesia machine, while the main image shows what the student sees on the magic lens.
- the magic lens visualizes abstract concepts, such as invisible gas flow
- students or trainees can observe how a faulty inspiratory valve affects gas flow in situ.
- the abstract valve icons are both open (e.g. the horizontal line 501 is located at the top of the icon, which denotes an open valve).
- a "look-at indicator" (indicated by a circle, boxed, or highlighted region) helps students to direct their attention in the after-action review and allows them to compare their own observations to the expert's observations.
- an 'interaction event box' that is collocated with the corresponding control can appear. For example, when the student turned the O2 knob, an interaction box pops up next to the control and reads that the student increased the O2 flow by a specific percentage.
- a 3-D line can be rendered that slowly extends from the last interaction event position and towards the position of the next upcoming event.
- This line can be in a distinctive color, such as red.
- Lines between older events can be a different color, such as blue, indicating that the events have passed.
- these lines connect all the interactions that were performed in the experience.
- Figure 9 shows a block diagram of a mixed simulator training system according to an embodiment of the present invention.
- the mixed simulator training system can include a real object or physical simulator 900, a tracking system 901, and a visual display 902.
- the visual display can display a virtual model or simulation of the real object or physical simulator 900.
- a user (or trainee) 903 can physically and visually interact with the real object or physical simulator 900 while also viewing the virtual model or simulation on the visual display 902.
- Status state information from the real object or physical simulator 900 and data from the tracking system 901 can be stored in a storage device 904.
- the tracking system 901 can be used to track head gaze or eye gaze and location of the user 903.
- the tracking system 901 can also include tracking of the visual display 902 for allowing continual registration of visual display position with the real object or physical simulator 900 aiding 'magic lens' capabilities of the visual display 902. In further embodiments, the tracking system 901 can also include tracking of the real object or physical simulator 900.
- a processing block 905 can be used to implement the software code for the system and control and process signals from the various devices. The processing block 905 can be a part of one or more computers or logic devices.
- the visual display can include an optional input device 906 for inputting instructions from a user that are not mapped to an interaction with, or reproduced in, the real object or physical simulator 900.
- a method of integrating a diagram-based dynamic model, the physical phenomenon being simulated, and the visualizations of the mapping between the two into the same context is provided with reference to Figures 10-12.
- the following description relates to a specific implementation of an augmented anesthesia machine that combines the virtual anesthesia machine (VAM) model (described, for example, in U.S. Patent No. 7,128,578, which is incorporated by reference in its entirety) with a real anesthesia machine; and demonstrates the integration of a diagram-based dynamic model, the physical phenomenon being simulated and the visualizations of the mapping between the model and the physical phenomenon into the same context.
- the VAM components are reorganized to align with the real machine.
- the spatially reorganized components are superimposed into the user's view of the real machine.
- the simulation is synchronized with the real machine, allowing the user to interact with the diagram-based dynamic model (VAM model) through interacting with the real machine controls, such as the flowmeter knobs.
- embodiments of the present invention can help students to visualize the mapping between the VAM model and the real machine.
- each diagrammatic component can be visually collocated with each anesthesia machine component.
- the contextualization comprises: (1) transforming the 2-D VAM diagrams into 3-D objects (e.g. a textured mesh, a textured quad, or a retexturing of the physical phenomenon's 3-D geometric model) and (2) positioning and orienting the transformed diagram objects in the space of the corresponding anesthesia machine component (i.e. the diagram objects must be visible and should not be located inside of their corresponding real-component's 3-D mesh).
- Figure 10 shows a diagram illustrating transforming a 2-D VAM diagram into a 3-D object.
- each VAM component is manually texture-mapped to a quad and then the quad is scaled to the same scale as the corresponding 3-D mesh of the physical component.
- each VAM component quad is manually oriented and positioned in front of the corresponding real component's 3-D mesh - specifically, the side of the component that the user looks at the most.
- the flowmeters' VAM icon is laid over the real flowmeter tubes. The icon is placed where users read the gas levels on the front of the machine, rather than on the back of the machine where users rarely look.
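A small sketch of the quad-placement step follows (illustrative assumptions: each real component is described by an axis-aligned bounding box in machine space, and users face its -z side; names and offsets are not from the patent):

```python
# Scale a textured VAM icon quad to a component's front face and push it
# slightly toward the viewer so it is not buried inside the component's mesh.
import numpy as np

def quad_transform(box_min, box_max, offset=0.01):
    """Return (center, (width, height)) for the icon quad."""
    box_min, box_max = np.asarray(box_min, float), np.asarray(box_max, float)
    size = box_max - box_min
    center = (box_min + box_max) / 2.0
    center[2] = box_min[2] - offset     # just in front of the front face
    return center, (size[0], size[1])   # quad scaled to the face dimensions

center, scale = quad_transform([0.10, 1.00, 0.20], [0.15, 1.30, 0.25])
print(center, scale)  # where and how large to draw the flowmeter VAM icon
```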
- the machine model can be textured or more complex 3-D models of the diagram can be used rather than texture mapped 3-D quads.
- the diagram and the physical component's mesh can be alpha blended together. This allows a user to be able to visualize both the geometric model and the diagrammatic model at all times.
- the VAM icon quads can be opaque, which can obstruct the underlying physical component geometry. However, since users interact in the space of the real machine, they can look behind the display to observe machine operations or details that may be occluded by VAM icons.
- the VAM shows these internal state changes as animations so that the user can visualize them.
- the VAM ventilator model has three discrete states: (1) off, (2) on and exhaling and (3) on and inhaling.
- a change in the ventilator state will change, for example, the visible flow of the representations (icons or animated 3-D particle) of the gases.
- Students may also have problems with understanding the functional relationships between the real machine components.
- the VAM uses 2-D pipes.
- the pipes are the arcs through which particles flow in the VAM model.
- the direction of the particle flow denotes the direction that the data flows through the model.
- these arcs represent the complex pneumatic connections that are found inside the anesthesia machine.
- these arcs are simplified for ease of visualization and spatial perception.
- the VAM pipes are laid out so that they do not cross each other, to ease the data flow visualization.
- the 2-D model arcs shown in Figure 12A
- the pipes can be visualized as 3-D cylinders that are not collocated with the real pneumatic connections inside the physical machine. Instead, the pipes are simplified to make the particle flow simpler to visualize and perceive spatially.
- This simplification emphasizes the functional relationships between the components rather than focusing on the spatial complexities of the pneumatic pipe geometry.
- the pipes can be arranged such that they do not intersect with the machine geometry or with other pipes. However, in transforming these arcs from 2-D to 3-D, some of the arcs may appear to visually cross each other from certain perspectives because of the complex way the machine components are laid out. In the cases that are unavoidable due to the machine layout, the overlapping sections of the pipes can be distinguished from each other by, for example, assigning the pipes different colors.
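The color-disambiguation rule just described amounts to graph coloring over the "visually overlaps" relation; a minimal greedy sketch (pipe names and palette are illustrative, not from the patent):

```python
# Greedy coloring: overlapping pipes never share a color.
def color_pipes(overlaps, palette=("red", "blue", "green", "orange")):
    """overlaps: dict pipe -> set of pipes it visually crosses."""
    colors = {}
    for pipe in sorted(overlaps):
        used = {colors[q] for q in overlaps[pipe] if q in colors}
        colors[pipe] = next(c for c in palette if c not in used)
    return colors

overlaps = {  # invented pipe layout
    "fresh_gas": {"inspiratory"},
    "inspiratory": {"fresh_gas", "expiratory"},
    "expiratory": {"inspiratory"},
}
print(color_pipes(overlaps))
# -> {'expiratory': 'red', 'fresh_gas': 'red', 'inspiratory': 'blue'}
```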
- Embodiments of the mixed simulator can combine any kind of physical simulation or object or instrument that is to be the subject of the training or simulation, such as a car engine, an anesthesia machine, a scrub applicator or a photocopier, with a virtual representation that enhances understanding.
- any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention.
- the appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- Business, Economics & Management (AREA)
- Medicinal Chemistry (AREA)
- Computational Mathematics (AREA)
- Algebra (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Mathematical Physics (AREA)
- Pure & Applied Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This invention relates to mixed simulator systems that combine the advantages of physical simulations/objects and virtual representations. In-context integration of virtual representations with physical simulations or objects can facilitate initial or continuing training. Two modes of mixed simulation are provided. In the first mode, a virtual representation is combined with a physical simulation or object by using a tracked display capable of displaying an appropriate dynamic virtual representation as a user moves around the physical simulation or object. In the second mode, a virtual representation is combined with a physical simulation or object by projecting the virtual representation directly onto the physical object or simulation. In further embodiments, user action and interaction can be tracked and incorporated within the mixed simulator system so as to provide mixed-reality after-action review. The mixed simulators described herein can be used in many applications including, but not limited to, healthcare, education, defense, vocational schools, and industry.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/664,222 US20100159434A1 (en) | 2007-10-11 | 2008-10-13 | Mixed Simulator and Uses Thereof |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US97913307P | 2007-10-11 | 2007-10-11 | |
| US60/979,133 | 2007-10-11 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2009049282A2 (fr) | 2009-04-16 |
| WO2009049282A3 (fr) | 2009-07-23 |
Family
ID=40549856
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2008/079687 Ceased WO2009049282A2 (fr) | 2008-10-13 | Mixed simulator and uses thereof |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20100159434A1 (fr) |
| WO (1) | WO2009049282A2 (fr) |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10290232B2 (en) | 2014-03-13 | 2019-05-14 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
| WO2019200381A1 (fr) * | 2018-04-14 | 2019-10-17 | The California State University - San Marcos | Hands-on laboratory and demonstration equipment with a hybrid virtual/augmented environment, and methods of use thereof |
| US10500340B2 (en) | 2015-10-20 | 2019-12-10 | Truinject Corp. | Injection system |
| US10643497B2 (en) | 2012-10-30 | 2020-05-05 | Truinject Corp. | System for cosmetic and therapeutic training |
| US10648790B2 (en) | 2016-03-02 | 2020-05-12 | Truinject Corp. | System for determining a three-dimensional position of a testing tool |
| EP3138091B1 (fr) * | 2014-03-13 | 2020-06-10 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
| US10743942B2 (en) | 2016-02-29 | 2020-08-18 | Truinject Corp. | Cosmetic and therapeutic injection safety systems, methods, and devices |
| US10896627B2 (en) | 2014-01-17 | 2021-01-19 | Truinjet Corp. | Injection site training system |
| US10902677B2 (en) | 2010-04-09 | 2021-01-26 | University Of Florida Research Foundation, Incorporated | Interactive mixed reality system and uses thereof |
| US11094223B2 (en) | 2015-01-10 | 2021-08-17 | University Of Florida Research Foundation, Incorporated | Simulation features combining mixed reality and modular tracking |
| US11710424B2 (en) | 2017-01-23 | 2023-07-25 | Truinject Corp. | Syringe dose and position measuring apparatus |
| US11730543B2 (en) | 2016-03-02 | 2023-08-22 | Truinject Corp. | Sensory enhanced environments for injection aid and social training |
| US12217626B2 (en) | 2012-10-30 | 2025-02-04 | Truinject Corp. | Injection training apparatus using 3D position sensor |
Families Citing this family (69)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10083621B2 (en) | 2004-05-27 | 2018-09-25 | Zedasoft, Inc. | System and method for streaming video into a container-based architecture simulation |
| WO2008008893A2 (fr) * | 2006-07-12 | 2008-01-17 | Medical Cyberworlds, Inc. | Système de formation médicale informatisé |
| US10908421B2 (en) | 2006-11-02 | 2021-02-02 | Razer (Asia-Pacific) Pte. Ltd. | Systems and methods for personal viewing devices |
| US9891435B2 (en) * | 2006-11-02 | 2018-02-13 | Sensics, Inc. | Apparatus, systems and methods for providing motion tracking using a personal viewing device |
| WO2009003169A2 (fr) * | 2007-06-27 | 2008-12-31 | University Of Florida Research Foundation, Inc. | Display-based interactive simulation with dynamic panorama |
| US20100055657A1 (en) * | 2008-08-27 | 2010-03-04 | Warren Goble | Radiographic and ultrasound simulators |
| WO2010102288A2 (fr) * | 2009-03-06 | 2010-09-10 | The University Of North Carolina At Chapel Hill | Methods, systems and computer-readable media for creating physical avatars of real or virtual characters based on lighting/shading devices |
| US9053641B2 (en) | 2009-06-11 | 2015-06-09 | University of Pittsburgh—of the Commonwealth System of Higher Education | Real-time X-ray vision for healthcare simulation |
| WO2010148078A2 (fr) * | 2009-06-16 | 2010-12-23 | Simquest Llc | Hemorrhage control simulator |
| US20110084983A1 (en) * | 2009-09-29 | 2011-04-14 | Wavelength & Resonance LLC | Systems and Methods for Interaction With a Virtual Environment |
| US20120102231A1 (en) * | 2009-11-19 | 2012-04-26 | Atellis, Inc. | Apparatus, method and computer readable medium for simulation integration |
| EP2544082A4 (fr) * | 2010-03-05 | 2013-10-30 | Fujitsu Ltd | Image display system, information processing apparatus, display apparatus, and image display method |
| US9247142B2 (en) | 2010-06-28 | 2016-01-26 | Lg Electronics Inc. | Method and apparatus for providing the operation state of an external device |
| JP5574854B2 (ja) * | 2010-06-30 | 2014-08-20 | Canon Inc. | Information processing system, information processing apparatus, information processing method, and program |
| US20120102439A1 (en) * | 2010-10-22 | 2012-04-26 | April Slayden Mitchell | System and method of modifying the display content based on sensor input |
| US9847044B1 (en) | 2011-01-03 | 2017-12-19 | Smith & Nephew Orthopaedics Ag | Surgical implement training process |
| WO2012106706A2 (fr) | 2011-02-04 | 2012-08-09 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Hybrid physical-virtual reality simulation for clinical training capable of providing feedback to a physical anatomical model |
| BR112013034009A2 (pt) * | 2011-05-06 | 2017-02-07 | Magic Leap Inc | Massive simultaneous remote digital presence world |
| US8990682B1 (en) | 2011-10-05 | 2015-03-24 | Google Inc. | Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display |
| US9081177B2 (en) | 2011-10-07 | 2015-07-14 | Google Inc. | Wearable computer with nearby object response |
| US8662892B2 (en) * | 2011-10-12 | 2014-03-04 | Raytheon Company | Universal hands-on trainer (UHOT) |
| US9547406B1 (en) | 2011-10-31 | 2017-01-17 | Google Inc. | Velocity-based triggering |
| US9792715B2 (en) | 2012-05-17 | 2017-10-17 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for utilizing synthetic animatronics |
| US20140240349A1 (en) * | 2013-02-22 | 2014-08-28 | Nokia Corporation | Method and apparatus for presenting task-related objects in an augmented reality display |
| US20150015582A1 (en) * | 2013-07-15 | 2015-01-15 | Markus Kaiser | Method and system for 2d-3d image registration |
| WO2015027286A1 (fr) * | 2013-09-02 | 2015-03-05 | University Of South Australia | Medical training simulation system and method |
| US10146299B2 (en) * | 2013-11-08 | 2018-12-04 | Qualcomm Technologies, Inc. | Face tracking for additional modalities in spatial interaction |
| US10321107B2 (en) | 2013-11-11 | 2019-06-11 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for improved illumination of spatial augmented reality objects |
| US10302614B2 (en) | 2014-05-06 | 2019-05-28 | Safetraces, Inc. | DNA based bar code for improved food traceability |
| US9911365B2 (en) * | 2014-06-09 | 2018-03-06 | Bijan SIASSI | Virtual neonatal echocardiographic training system |
| DE102014011163A1 (de) | 2014-07-25 | 2016-01-28 | Audi Ag | Device for displaying a virtual space and camera images |
| DE102014224851A1 (de) * | 2014-12-04 | 2016-06-09 | Siemens Aktiengesellschaft | Apparatus and method for displaying structural information about a technical object |
| US11324386B2 (en) | 2015-06-08 | 2022-05-10 | The General Hospital Corporation | Airway management and visualization device |
| US10025099B2 (en) | 2015-06-10 | 2018-07-17 | Microsoft Technology Licensing, Llc | Adjusted location hologram display |
| US10962512B2 (en) | 2015-08-03 | 2021-03-30 | Safetraces, Inc. | Pathogen surrogates based on encapsulated tagged DNA for verification of sanitation and wash water systems for fresh produce |
| US10025375B2 (en) * | 2015-10-01 | 2018-07-17 | Disney Enterprises, Inc. | Augmented reality controls for user interactions with a virtual world |
| EP3512452B1 (fr) * | 2016-09-16 | 2025-08-27 | Zimmer, Inc. | Augmented reality surgical technique guidance |
| US10424121B1 (en) * | 2016-11-06 | 2019-09-24 | Oded Melinek | Generated offering exposure |
| US10692401B2 (en) | 2016-11-15 | 2020-06-23 | The Board Of Regents Of The University Of Texas System | Devices and methods for interactive augmented reality |
| US11056022B1 (en) * | 2016-11-29 | 2021-07-06 | Sproutel, Inc. | System, apparatus, and method for creating an interactive augmented reality experience to simulate medical procedures for pediatric disease education |
| WO2018165253A1 (fr) * | 2017-03-07 | 2018-09-13 | The Charles Stark Draper Laboratory, Inc. | Augmented reality visualization for pipe inspection |
| AU2018236172B2 (en) | 2017-03-13 | 2021-03-04 | Zimmer, Inc. | Augmented reality diagnosis guidance |
| US11432877B2 (en) * | 2017-08-02 | 2022-09-06 | Medtech S.A. | Surgical field camera system that only uses images from cameras with an unobstructed sight line for tracking |
| CN111465970B (zh) * | 2017-08-16 | 2022-08-30 | 科玛科学公司 | Augmented reality system for teaching patient care |
| US20190087831A1 (en) | 2017-09-15 | 2019-03-21 | Pearson Education, Inc. | Generating digital credentials based on sensor feedback data |
| EP3701355A1 (fr) * | 2017-10-23 | 2020-09-02 | Koninklijke Philips N.V. | Self-expanding augmented reality-based service instruction library |
| US10926264B2 (en) | 2018-01-10 | 2021-02-23 | Safetraces, Inc. | Dispensing system for applying DNA taggants used in combinations to tag articles |
| US10902680B2 (en) | 2018-04-03 | 2021-01-26 | Saeed Eslami | Augmented reality application system and method |
| US10556032B2 (en) * | 2018-04-25 | 2020-02-11 | Safetraces, Inc. | Sanitation monitoring system using pathogen surrogates and surrogate tracking |
| US11042156B2 (en) * | 2018-05-14 | 2021-06-22 | Honda Motor Co., Ltd. | System and method for learning and executing naturalistic driving behavior |
| CN112512599A (zh) * | 2018-06-12 | 2021-03-16 | 手机肥皂有限公司 | System and method for managing disinfection |
| US11853832B2 (en) | 2018-08-28 | 2023-12-26 | Safetraces, Inc. | Product tracking and rating system using DNA tags |
| US10810416B2 (en) * | 2018-12-14 | 2020-10-20 | Palo Alto Research Center Incorporated | Method and system for facilitating dynamic materialization for real-world interaction with virtual reality |
| US10726630B1 (en) * | 2019-06-28 | 2020-07-28 | Capital One Services, Llc | Methods and systems for providing a tutorial for graphic manipulation of objects including real-time scanning in an augmented reality |
| US12258638B2 (en) | 2020-04-16 | 2025-03-25 | Safetraces, Inc. | Airborne pathogen simulants and mobility testing |
| EP3905225A1 (fr) * | 2020-04-28 | 2021-11-03 | Université de Strasbourg | System and method for evaluating simulation-based medical training |
| CN114550525B (zh) * | 2020-11-24 | 2023-09-29 | 郑州畅想高科股份有限公司 | A locomotive component maintenance practical training system based on mixed reality technology |
| US20220188545A1 (en) * | 2020-12-10 | 2022-06-16 | International Business Machines Corporation | Augmented reality enhanced situational awareness |
| EP4278366A4 (fr) | 2021-01-12 | 2024-12-11 | Emed Labs, Llc | Health testing and diagnostics platform |
| US11615888B2 (en) | 2021-03-23 | 2023-03-28 | Emed Labs, Llc | Remote diagnostic testing and treatment |
| US11929168B2 (en) | 2021-05-24 | 2024-03-12 | Emed Labs, Llc | Systems, devices, and methods for diagnostic aid kit apparatus |
| US11369454B1 (en) | 2021-05-24 | 2022-06-28 | Emed Labs, Llc | Systems, devices, and methods for diagnostic aid kit apparatus |
| GB2623461A (en) | 2021-06-22 | 2024-04-17 | Emed Labs Llc | Systems, methods, and devices for non-human readable diagnostic tests |
| US20230048501A1 (en) * | 2021-08-16 | 2023-02-16 | Apple Inc. | Visualization of a knowledge domain |
| US12014829B2 (en) | 2021-09-01 | 2024-06-18 | Emed Labs, Llc | Image processing and presentation techniques for enhanced proctoring sessions |
| US20250221772A1 (en) | 2022-03-09 | 2025-07-10 | All India Institute Of Medical Sciences (AIIMS) | 3-dimensional tracking and navigation simulator for neuro-endoscopy |
| US11981460B2 (en) * | 2022-05-13 | 2024-05-14 | Firestorm Labs, Inc. | Mission-adaptable aerial vehicle and methods for in-field assembly and use |
| US12309352B2 (en) * | 2022-06-30 | 2025-05-20 | Apple Inc. | In-field calibration techniques for cameras |
| US12486017B2 (en) | 2024-03-13 | 2025-12-02 | Firestorm Labs, Inc. | Additive manufactured integral fastening system for mission adaptable unmanned aerial vehicles |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2808366B1 (fr) * | 2000-04-26 | 2003-12-19 | Univ Paris Vii Denis Diderot | Method and system for learning in virtual reality, and application in odontology |
| US7371067B2 (en) * | 2001-03-06 | 2008-05-13 | The Johns Hopkins University School Of Medicine | Simulation method for designing customized medical devices |
| NO20013450L (no) * | 2001-07-11 | 2003-01-13 | Simsurgery As | Systems and methods for interactive training of procedures |
| WO2003023737A1 (fr) * | 2001-09-07 | 2003-03-20 | The General Hospital Corporation | System for education and training in medical procedures |
| US7206627B2 (en) * | 2002-03-06 | 2007-04-17 | Z-Kat, Inc. | System and method for intra-operative haptic planning of a medical procedure |
| US6943754B2 (en) * | 2002-09-27 | 2005-09-13 | The Boeing Company | Gaze tracking system, eye-tracking assembly and an associated method of calibration |
| DE10261673A1 (de) * | 2002-12-31 | 2004-07-15 | Riener, Robert, Dr.-Ing. | Interactive teaching and learning device |
| JP2004347623A (ja) * | 2003-03-26 | 2004-12-09 | National Institute Of Advanced Industrial & Technology | Human body model and method for producing the same |
| US20080187896A1 (en) * | 2004-11-30 | 2008-08-07 | Regents Of The University Of California, The | Multimodal Medical Procedure Training System |
| EP1909162A1 (fr) * | 2006-10-02 | 2008-04-09 | Koninklijke Philips Electronics N.V. | System for virtually drawing on a physical surface |
| KR100748269B1 (ko) * | 2007-05-15 | 2007-08-09 | 태라한빛 주식회사 | System for optimizing hands-on training education for dental treatment |
- 2008-10-13: US application US12/664,222 filed; published as US20100159434A1 (en); status: not active (Abandoned)
- 2008-10-13: PCT application PCT/US2008/079687 filed; published as WO2009049282A2 (fr); status: not active (Ceased)
Cited By (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10902677B2 (en) | 2010-04-09 | 2021-01-26 | University Of Florida Research Foundation, Incorporated | Interactive mixed reality system and uses thereof |
| US11361516B2 (en) | 2010-04-09 | 2022-06-14 | University Of Florida Research Foundation, Incorporated | Interactive mixed reality system and uses thereof |
| US10643497B2 (en) | 2012-10-30 | 2020-05-05 | Truinject Corp. | System for cosmetic and therapeutic training |
| US10902746B2 (en) | 2012-10-30 | 2021-01-26 | Truinject Corp. | System for cosmetic and therapeutic training |
| US11854426B2 (en) | 2012-10-30 | 2023-12-26 | Truinject Corp. | System for cosmetic and therapeutic training |
| US12217626B2 (en) | 2012-10-30 | 2025-02-04 | Truinject Corp. | Injection training apparatus using 3D position sensor |
| US11403964B2 (en) | 2012-10-30 | 2022-08-02 | Truinject Corp. | System for cosmetic and therapeutic training |
| US12456393B2 (en) | 2012-10-30 | 2025-10-28 | Truinject Corp. | System for cosmetic and therapeutic training |
| US10896627B2 (en) | 2014-01-17 | 2021-01-19 | Truinject Corp. | Injection site training system |
| EP3138091B1 (fr) * | 2014-03-13 | 2020-06-10 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
| US10290231B2 (en) | 2014-03-13 | 2019-05-14 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
| US10290232B2 (en) | 2014-03-13 | 2019-05-14 | Truinject Corp. | Automated detection of performance characteristics in an injection training system |
| US11094223B2 (en) | 2015-01-10 | 2021-08-17 | University Of Florida Research Foundation, Incorporated | Simulation features combining mixed reality and modular tracking |
| US12387621B2 (en) | 2015-01-10 | 2025-08-12 | University Of Florida Research Foundation, Incorporated | Simulation features combining mixed reality and modular tracking |
| US10500340B2 (en) | 2015-10-20 | 2019-12-10 | Truinject Corp. | Injection system |
| US12070581B2 (en) | 2015-10-20 | 2024-08-27 | Truinject Corp. | Injection system |
| US10743942B2 (en) | 2016-02-29 | 2020-08-18 | Truinject Corp. | Cosmetic and therapeutic injection safety systems, methods, and devices |
| US10648790B2 (en) | 2016-03-02 | 2020-05-12 | Truinject Corp. | System for determining a three-dimensional position of a testing tool |
| US11730543B2 (en) | 2016-03-02 | 2023-08-22 | Truinject Corp. | Sensory enhanced environments for injection aid and social training |
| US11710424B2 (en) | 2017-01-23 | 2023-07-25 | Truinject Corp. | Syringe dose and position measuring apparatus |
| US12350472B2 (en) | 2017-01-23 | 2025-07-08 | Truinject Corp. | Syringe dose and position measuring apparatus |
| US11694575B2 (en) | 2018-04-14 | 2023-07-04 | The Trustees of the California State University | Hands-on laboratory and demonstration equipment with a hybrid virtual/augmented environment, along with their methods of use |
| WO2019200381A1 (fr) * | 2018-04-14 | 2019-10-17 | The California State University - San Marcos | Hands-on laboratory and demonstration equipment with a hybrid virtual/augmented environment, along with their methods of use |
Also Published As
| Publication number | Publication date |
|---|---|
| US20100159434A1 (en) | 2010-06-24 |
| WO2009049282A3 (fr) | 2009-07-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20100159434A1 (en) | 2010-06-24 | Mixed Simulator and Uses Thereof |
| Gasques et al. | | Artemis: A collaborative mixed-reality system for immersive surgical telementoring |
| US11749137B2 (en) | | System and method for multisensory psychomotor skill training |
| US8605133B2 (en) | | Display-based interactive simulation with dynamic panorama |
| AU2008270883B2 (en) | | Virtual interactive presence systems and methods |
| US9959629B2 (en) | | System and method for managing spatiotemporal uncertainty |
| Lin et al. | | A first-person mentee second-person mentor AR interface for surgical telementoring |
| CN108389249B (zh) | | A multi-compatibility VR/AR spatial classroom and construction method thereof |
| AU2013370334A1 (en) | | System and method for role-switching in multi-reality environments |
| CN115315729A (zh) | | Method and system for facilitating remote presence or interaction |
| Quarles et al. | | A mixed reality approach for merging abstract and concrete knowledge |
| Müller et al. | | The virtual reality arthroscopy training simulator |
| Rebol et al. | | Mixed reality communication for medical procedures: Teaching the placement of a central venous catheter |
| Andersen et al. | | Augmented visual instruction for surgical practice and training |
| Quarles et al. | | Scaffolded learning with mixed reality |
| Hochreiter et al. | | Touch sensing on non-parametric rear-projection surfaces: A physical-virtual head for hands-on healthcare training |
| TW201619754A (zh) | | Medical image object-oriented interface-assisted explanation control system and method thereof |
| Negrão et al. | | Characterizing head-gaze and hand affordances using AR for laparoscopy |
| Quarles et al. | | A mixed reality approach for interactively blending dynamic models with corresponding physical phenomena |
| Shinde et al. | | Augmented reality in healthcare |
| Andersen | | Effective User Guidance Through Augmented Reality Interfaces: Advances and Applications |
| Coles | | Investigating augmented reality visio-haptic techniques for medical training |
| CN115547129B (zh) | | AR implementation system and method for three-dimensional cardiac visualization |
| Witzke et al. | | Immersive virtual reality used as a platform for perioperative training for surgical residents |
| Lee et al. | | A remote virtual-surgery training and teaching system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 08838430; Country of ref document: EP; Kind code of ref document: A2 |
| | WWE | WIPO information: entry into national phase | Ref document number: 12664222; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | EP: PCT application non-entry in European phase | Ref document number: 08838430; Country of ref document: EP; Kind code of ref document: A2 |