
US20060183096A1 - Interactive teaching and learning device with three-dimensional model - Google Patents

Interactive teaching and learning device with three-dimensional model

Info

Publication number
US20060183096A1
US20060183096A1 (application US10/541,295)
Authority
US
United States
Prior art keywords
model
touched
force
electronic storage
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/541,295
Other languages
English (en)
Inventor
Robert Riener
Rainer Burgkart
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of US20060183096A1
Priority to US12/397,758 (continuation, published as US8403677B2)
Legal status: Abandoned

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine
    • G09B23/30 - Anatomical models

Definitions

  • the invention relates to a device for explaining and demonstrating three-dimensional objects, e.g. anatomical models or models and exhibits for museums and fairs.
  • in devices of the known art the model to be touched has to be specially processed; as a result the model may be changed or even damaged. Further, in order to achieve sufficient spatial resolution over the whole of the model area concerned, a plurality of pressure-sensitive sensors has to be used.
  • a 3D body incorporating the model is fastened to its surroundings by at least one multi-component electrical force-torque measurement device.
  • the forces and torques arising are converted into electrical measurement signals which are fed to an electronic storage and evaluation system.
  • in the electronic storage and evaluation system a mathematical model of the geometry of the 3D body is implemented.
  • geometry here means at least every surface area of the model which can be contacted and which is to be explained, i.e. including body cavities of an anatomical model.
  • the calculated place of the contact is indicated or displayed by means of an indicating device.
  • the mode of indication and/or output is optional and is executed in accordance with the purpose to be achieved.
  • Visual and/or acoustic indicating devices known from the state of the art are preferred.
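
For illustration only, the localization principle described above can be sketched as follows. This is a minimal reading of the idea, not the patent's implementation: it assumes Python with NumPy, a vertex cloud sampled from the stored geometry instead of a full triangle mesh, and that the weight of the model has already been tared out of the measurement; all function names are hypothetical.

```python
import numpy as np

def contact_line(force, torque):
    """All points r satisfying torque = r x force lie on a line along `force`.
    Returns a point on that line and the unit direction, or (None, None) when
    the contact force is negligible."""
    f = np.asarray(force, float)
    t = np.asarray(torque, float)
    f2 = float(np.dot(f, f))
    if f2 < 1e-9:
        return None, None
    p0 = np.cross(f, t) / f2          # point of the line closest to the sensor origin
    return p0, f / np.sqrt(f2)

def locate_touch(force, torque, vertices, max_dist=0.005):
    """Return the index of the stored surface point closest to the line of
    action, or None if no point lies within `max_dist` (metres)."""
    p0, d = contact_line(force, torque)
    if p0 is None:
        return None
    rel = np.asarray(vertices, float) - p0
    along = rel @ d                              # component along the line
    perp = rel - np.outer(along, d)              # perpendicular component
    dist = np.linalg.norm(perp, axis=1)
    best = int(np.argmin(dist))
    return best if dist[best] <= max_dist else None
```
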
  • the forces and torques detected by the multiple-component force-torque measurement device are compared with the data stored in the data table.
  • the place touched is detected and displayed by the indicating device.
  • the embodiment in accordance with claim 2 can, in practice, detect only the pre-taught points.
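
A hedged sketch of this table-lookup variant follows (Python/NumPy; the names and the rejection threshold are invented for illustration): each pre-taught point is stored with a representative force-torque signature, and a new measurement is assigned to the nearest stored signature or rejected.

```python
import numpy as np

def nearest_taught_point(measured, table, reject_above=0.2):
    """`measured` is a 6-vector (Fx, Fy, Fz, Tx, Ty, Tz); `table` maps point
    names to the 6-vectors recorded during teach-in.  Vectors are normalized so
    that mainly the direction of the applied wrench is compared; a real system
    would also scale force and torque components consistently."""
    m = np.asarray(measured, float)
    m = m / (np.linalg.norm(m) + 1e-12)
    best_name, best_err = None, np.inf
    for name, signature in table.items():
        s = np.asarray(signature, float)
        s = s / (np.linalg.norm(s) + 1e-12)
        err = float(np.linalg.norm(m - s))
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= reject_above else None
```
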
  • the model is fastened to a table, a wall, a ceiling or any other base surface by only one multiple-component force-torque measurement device. For better mechanical stability, several force measurement devices may also be used. Multiple-component force-torque measurement devices are part of the state of the art and are commercially available as modular components. Additional holding appliances may be used if required by the dimensions of the 3D body; these holding appliances, however, must be constructed so that the force caused by the touch is fed unambiguously and reproducibly to the force-torque measurement device or devices.
  • the touch-sensitive sensor system is not positioned at the touch point of the model but is arranged as a connecting element between the model and its surroundings. For this reason there is no need to adapt the model at great expense. Furthermore, nearly any number of touch points may be defined, which is not possible with the devices of the known art.
  • the construction described above allows the areas, points or elements of the model touched by the operator to be explained, described or accentuated visually and/or acoustically.
  • the details shown may be the name and certain properties and functions of the identified area or element of the model.
  • the details are made readable or visually recognizable by means of a visual display unit, and/or audible by means of loudspeakers.
  • films or graphic animations can be played back depending on the setting and operating activities that have been performed.
  • the amount and the direction of the detected force can be further processed by the data processor and reproduced as a graphically animated vector arrow or as an acoustic signal. If, for example, the operator applies excessive force to the model, a visual or acoustic warning signal or a warning voice can ensure that the operator stops applying force, so as to avoid destruction of the model or of the force sensor.
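
As a small, assumption-laden illustration of that post-processing (the limit value and the messages are placeholders, not taken from the patent):

```python
import numpy as np

FORCE_LIMIT_N = 50.0   # assumed safety limit; depends on model and sensor

def process_force(force):
    """Return magnitude and unit direction of the contact force for display as
    an animated vector arrow, and emit a warning when the force is excessive."""
    f = np.asarray(force, float)
    magnitude = float(np.linalg.norm(f))
    direction = f / magnitude if magnitude > 0 else f
    if magnitude > FORCE_LIMIT_N:
        print("Warning: force too high - please press more gently.")  # or a warning sound
    return magnitude, direction
```
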
  • the mathematical representation of the used model can be determined by means of 3D-scanners (CT, magnetic resonance tomography, laser scanner etc.) and stored in a data processor.
  • the relevant areas of the model are touched, and the resulting forces and torques are measured, stored and assigned, for example by the input of texts.
  • the assignment method can be supported by up-to-date techniques such as artificial neural networks.
  • the element touched is detected automatically.
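
One possible shape of this teach-in and automatic assignment is sketched below. It uses a k-nearest-neighbour classifier from scikit-learn as a stand-in for the artificial neural network mentioned above, and `read_wrench()` is an assumed sensor-driver interface; none of this is prescribed by the patent.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def teach_in(read_wrench, samples_per_area=20):
    """Interactively record labelled force-torque samples and fit a classifier."""
    X, y = [], []
    while True:
        label = input("Name of the area to teach (empty line to finish): ").strip()
        if not label:
            break
        print(f"Please touch '{label}' repeatedly ...")
        for _ in range(samples_per_area):
            X.append(read_wrench())      # 6-component force-torque vector
            y.append(label)
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(np.asarray(X, float), y)
    return clf

# during normal operation the element touched is then detected automatically:
#   area = clf.predict([read_wrench()])[0]
```
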
  • the geometric image of the model can also be represented in a graphically animated way.
  • certain areas of the model which are touched can be marked by colour or by means of an arrow.
  • Even very fine details which are positioned near the touch point but cannot be marked on the real model for lack of space can be visualized by means of the visual display unit.
  • menu points which differ optically in colour, size, shape or inscription can be marked on the model. If one of these menu points is touched, a certain reaction is triggered or a menu function is executed, depending on the kind of point, and the result is displayed acoustically or graphically.
  • touch patterns are, for example: long or short contacts, light or strong contact pressure, as well as tapping gestures with varying numbers of taps, such as the double click in Windows which opens a file.
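
Such touch patterns can be recognized from the time course of the contact-force magnitude. The following sketch uses invented thresholds and is only one way to segment and interpret such a signal:

```python
def classify_touch(force_trace, dt=0.01,
                   contact_thresh=1.0,    # N: force above which a contact counts
                   strong_thresh=10.0,    # N: boundary between light and strong pressing
                   long_press=0.8,        # s: minimum duration of a long contact
                   max_tap_gap=0.4):      # s: maximum pause between taps of one gesture
    """Classify a sampled force-magnitude trace into the touch patterns above."""
    segments, start = [], None             # segments of (start time, duration, peak force)
    for i, f in enumerate(force_trace):
        if f >= contact_thresh and start is None:
            start = i
        elif f < contact_thresh and start is not None:
            segments.append((start * dt, (i - start) * dt, max(force_trace[start:i])))
            start = None
    if start is not None:
        n = len(force_trace)
        segments.append((start * dt, (n - start) * dt, max(force_trace[start:])))
    if not segments:
        return "no touch"
    if len(segments) == 1:
        _, dur, peak = segments[0]
        kind = "long" if dur >= long_press else "short"
        strength = "strong" if peak >= strong_thresh else "light"
        return f"{kind} {strength} contact"
    gaps = [b[0] - (a[0] + a[1]) for a, b in zip(segments, segments[1:])]
    if all(g <= max_tap_gap for g in gaps):
        return f"{len(segments)}-tap gesture"   # e.g. a double tap opens details
    return "separate contacts"
```
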
  • the invention can be operated in two different modes.
  • the above mentioned function represents the so-called standard mode, in which the touch results in a graphic and/or acoustic response.
  • a graphic or acoustic request can be put to the operator such as to touch a certain area of the model.
  • the operator, e.g. a student to be examined, touches the supposed area, and the data processor checks whether the correct area has been touched, i.e. detected.
  • Success, failure or an evaluation are then communicated to the operator by means of the graphic and/or acoustic display. By using this mode the operator's knowledge is tested.
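
A minimal sketch of this request/examination mode, assuming a `detect_touched_area()` callback that wraps the localization described earlier (both the interface and the wording of the prompts are invented):

```python
import random

def examination_mode(areas, detect_touched_area, rounds=5):
    """Ask for random areas, compare with the area actually touched and grade."""
    score = 0
    for _ in range(rounds):
        target = random.choice(areas)
        print(f"Please touch: {target}")        # could equally be a spoken request
        touched = detect_touched_area()          # blocks until a touch has been localized
        if touched == target:
            print("Correct!")
            score += 1
        else:
            print(f"Not correct - you touched: {touched}")
    print(f"Result: {score} of {rounds} areas identified correctly.")
    return score
```
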
  • the optic-visual indicating device includes a projector which projects visual data such as texts or images directly onto the area touched; this also makes it possible to project onto rear sides. It is required, however, that the colour and the surface of the model area are suited to the projection. If, for example, the operator presses on the lung of the model with growing force, successively deeper-lying sections are projected and represented. The specialist knows that such projections can also be shown on separate monitors.
  • the projector may be provided as a video projector. This, for example, allows the blood transport in the lung to be shown in a way very similar to reality, thus further improving the informative effect.
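
The force-dependent "dipping into the depth" can be read as a simple mapping from pressing force to the index of the projected section; the force range and the layer images below are assumed placeholders, not values from the patent.

```python
def layer_for_force(force_n, layers, f_min=2.0, f_max=30.0):
    """`layers` is an ordered sequence of section images from superficial to deep;
    return the one to project for the given pressing force (in newtons)."""
    if force_n <= f_min:
        return layers[0]
    if force_n >= f_max:
        return layers[-1]
    idx = int((force_n - f_min) / (f_max - f_min) * (len(layers) - 1))
    return layers[idx]
```
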
  • FIG. 1 a - f show the application of the invention to a model of an anatomic torso.
  • FIG. 2 shows the application of the invention to a model ear for the training in acupuncture.
  • FIG. 3 shows an embodiment with divided model.
  • FIG. 4 a, b show an embodiment of the invention for a non-medical application.
  • FIG. 1 a shows an artificial open upper part of a body 1 (phantom torso) with dismountable organs.
  • the invention serves to support the medical training.
  • the torso is mounted on a 6-component force-torque sensor 2 .
  • the sensor data are fed to a data processing unit with graphic and acoustic output.
  • On the individual organs there are several small dots in yellow, blue and green. If, for example, a student of medicine touches one of the organs or a certain area of an organ, the name of the respective organ or area is communicated to him acoustically. Simultaneously a monitor shows the torso as a shaded artificial image, and the name of the area touched is inserted.
  • the touched structures can be accentuated in colour. Even very fine anatomic structures, such as blood vessels, veinlets, lines of nerves, base points of muscles, can be made visible.
  • if an operator touches the yellow dot on an artificial organ of the torso, a photorealistic view of the organ or organ area is presented to him on the monitor.
  • if the blue dot is touched, the physiological relevance and possible pathologies are described graphically and acoustically. Finally, the green dot starts graphic animations and films with sound. Further, by increasing the pressure on an organ or on the skin of the torso model it becomes possible to dip into the depth like a pin prick; as a result, various body sections and internal views are represented graphically in an animated way.
  • an artificial voice can request the operator to touch a certain area which is relevant from the anatomic point of view.
  • the place touched is then recorded by the data processing unit, and the result is communicated to the operator acoustically and graphically, with comments.
  • FIG. 1 b shows the operator removing one of the organs from the torso.
  • the sensor records a changed weight and a shift of the centre of gravity.
  • in this way the organ which has been removed is detected automatically.
  • the artificial display of the torso on the monitor adjusts itself to the changed torso.
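
How the removed organ might be identified from the static sensor reading is sketched below: removing an organ of mass m at position p changes the measured weight by m*g and the gravity torque by p x (m*g). The organ masses and positions are invented example values, not data from the patent.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])        # gravity in the sensor frame (m/s^2)

ORGANS = {                              # assumed masses (kg) and positions (m)
    "heart": (0.35, np.array([0.02, 0.05, 0.25])),
    "liver": (1.50, np.array([0.06, -0.04, 0.20])),
    "left lung": (0.45, np.array([-0.08, 0.03, 0.30])),
}

def removed_organ(rest_wrench, current_wrench, tol=1.0):
    """`rest_wrench` and `current_wrench` are 6-vectors (F, T) measured with the
    torso at rest before and after the removal; return the best-matching organ."""
    dF = np.asarray(rest_wrench[:3], float) - np.asarray(current_wrench[:3], float)
    dT = np.asarray(rest_wrench[3:], float) - np.asarray(current_wrench[3:], float)
    best, best_err = None, np.inf
    for name, (mass, pos) in ORGANS.items():
        exp_F = mass * G                 # weight contribution that disappeared
        exp_T = np.cross(pos, exp_F)     # and its torque about the sensor origin
        err = np.linalg.norm(dF - exp_F) + np.linalg.norm(dT - exp_T)
        if err < best_err:
            best, best_err = name, err
    return best if best_err <= tol else None
```
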
  • FIG. 1 c shows how, after the removal of several organs or parts of organs, deeper-lying structures that were previously not visible become visible and can be explored further by touching them, with acoustic and graphic support.
  • FIG. 1 d shows a different graphic and acoustic display using a head-mounted-display (HMD).
  • FIG. 1 e shows a different graphic display in which the text and image data are projected directly to the touched model.
  • This can be realized by means of a commercial video projector (beamer); in this example the model surface should be white or uniformly coloured in a light colour.
  • FIG. 1 f shows an embodiment in which the phantom torso is fastened by two multiple-component sensors 2 a, 2 b.
  • the respective force-torque signals are added vectorially, and the resulting sum signal, which corresponds to the signal of a single sensor, is then further processed by the data processing unit.
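
A minimal sketch of that vectorial addition (Python/NumPy, with the mounting offset `r_b` assumed to be known from the construction): the torque of sensor 2 b is first referred to the reference point of sensor 2 a, then forces and torques are summed.

```python
import numpy as np

def sum_wrenches(f_a, t_a, f_b, t_b, r_b):
    """`r_b` is the position of sensor 2b's measuring point in the frame of
    sensor 2a; both sensors are assumed to be aligned with parallel axes."""
    f_a, f_b = np.asarray(f_a, float), np.asarray(f_b, float)
    t_a, t_b = np.asarray(t_a, float), np.asarray(t_b, float)
    f_sum = f_a + f_b
    t_sum = t_a + t_b + np.cross(np.asarray(r_b, float), f_b)  # shift 2b's torque to 2a's point
    return f_sum, t_sum        # equivalent to the signal of a single sensor at 2a
```
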
  • FIG. 2 shows an embodiment in which a phantom ear is utilized for the acupuncture training.
  • the phantom ear is connected with a force-torque sensor 2 .
  • the ear shows marks at the most important acupuncture positions. If the operator touches the phantom ear with a sharp-pointed object similar to an acupuncture needle, a voice and the monitor image tell him the name and the effect of the targeted point. In this application the acoustic information and the text insertions are also useful because there is not enough space on the ear for the names and effects of the points. Sound and image can also guide the operator when he looks for a desired point. It is further possible to check how much time the operator needs to find a certain point and in which sequence he approaches these points.
  • FIG. 3 shows an embodiment in which the model is divided.
  • the right model part is connected with the table by a force-torque sensor 2 a.
  • the left model part is connected with the right model part by means of a further force-torque sensor 2 b.
  • the sensor 2 b is the only connecting element between the right and the left model parts.
  • This arrangement also allows ambidextrous pointing activities.
  • the forces acting on the left part can be processed unambiguously via the connecting sensor 2 b.
  • since the sensor 2 a on the side of the table receives the forces of both model parts, localizing a contact point on the right part requires that the signals of both sensors are combined with each other.
  • for this purpose the force-torque data of the connecting sensor are subtracted component by component, i.e. vectorially, from the force-torque data of the sensor on the side of the table (in a common coordinate system).
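
An illustrative evaluation for this divided arrangement (frame alignment of the two sensors is assumed to have been done beforehand; the threshold is invented):

```python
import numpy as np

def split_contact(f_2a, t_2a, f_2b, t_2b, min_force=0.5):
    """Sensor 2b carries only the left part; subtracting its data from the
    table-side sensor 2a isolates the wrench caused by contacts on the right part."""
    f_right = np.asarray(f_2a, float) - np.asarray(f_2b, float)
    t_right = np.asarray(t_2a, float) - np.asarray(t_2b, float)
    touched_left = np.linalg.norm(np.asarray(f_2b, float)) > min_force
    touched_right = np.linalg.norm(f_right) > min_force
    # each wrench can then be handed to the localization step for its own part
    return (f_right, t_right), touched_left, touched_right
```
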
  • FIG. 4 a shows a model car mounted on a 6-component force-torque sensor 2 .
  • the force-torque data are fed to a data processing unit which has an acoustic output facility by means of a sound generator (sound card).
  • the data processing unit includes a mathematical image of the model car geometry.
  • the model car is composed of a plurality of small components, such as wheels, doors, bumpers and headlights. As soon as the operator (a museum visitor) briefly touches one of the components with his finger, he hears the name of the touched component through loudspeakers. If he quickly taps the same element twice in a row, its function is explained to him in more detail.
  • simultaneously with the output of the acoustic information, the monitor shows an animated image of the model with a coloured accentuation of the touched part and a text box which explains the function in more detail.
  • One single long tapping starts a short film which describes the manufacturing process of the touched part.
  • FIG. 4 b shows an embodiment in which the model car is fastened by two multiple-component sensors 2 a, 2 b.
  • the respective force-torque signals are added vectorially and, as a sum signal corresponding to the signal of a single sensor, further processed by the data processing unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Physics (AREA)
  • Medicinal Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Optimization (AREA)
  • Medical Informatics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Instructional Devices (AREA)
  • Processing Or Creating Images (AREA)
  • Electrically Operated Instructional Devices (AREA)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/397,758 US8403677B2 (en) 2002-12-31 2009-03-04 Interactive teaching and learning device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10261673A DE10261673A1 (de) 2002-12-31 2002-12-31 Interaktive Lehr- und Lernvorrichtung
DE10261673.6 2002-12-31
PCT/DE2003/004292 WO2004061797A1 (fr) 2002-12-31 2003-12-31 Dispositif interactif de formation et d'apprentissage avec modele tridimensionnel

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/397,758 Continuation US8403677B2 (en) 2002-12-31 2009-03-04 Interactive teaching and learning device

Publications (1)

Publication Number Publication Date
US20060183096A1 (en) 2006-08-17

Family

ID=32519512

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/541,295 Abandoned US20060183096A1 (en) 2002-12-31 2003-12-31 Interactive teaching and learning device with three-dimensional model
US12/397,758 Expired - Fee Related US8403677B2 (en) 2002-12-31 2009-03-04 Interactive teaching and learning device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/397,758 Expired - Fee Related US8403677B2 (en) 2002-12-31 2009-03-04 Interactive teaching and learning device

Country Status (6)

Country Link
US (2) US20060183096A1 (fr)
EP (1) EP1579406B1 (fr)
CN (1) CN1745404B (fr)
AU (1) AU2003303603A1 (fr)
DE (1) DE10261673A1 (fr)
WO (1) WO2004061797A1 (fr)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060183099A1 (en) * 2005-02-14 2006-08-17 Feely Richard A Education and test preparation system, method and computer program product
US20080233550A1 (en) * 2007-01-23 2008-09-25 Advanced Fuel Research, Inc. Method and apparatus for technology-enhanced science education
US20090081627A1 (en) * 2007-09-26 2009-03-26 Rose Marie Ambrozio Dynamic Human Model
US20100129781A1 (en) * 2008-11-21 2010-05-27 National Taiwan University Electrical bronze acupuncture statue apparatus
US20100159434A1 (en) * 2007-10-11 2010-06-24 Samsun Lampotang Mixed Simulator and Uses Thereof
US20120156665A1 (en) * 2009-06-11 2012-06-21 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Real-Time X-Ray Vision for Healthcare Simulation
JP2012148059A (ja) * 2010-12-27 2012-08-09 Kochi Univ Of Technology タンジブルデバイスを備えている生体画像処理システム
US8966681B2 (en) 2013-02-26 2015-03-03 Linda L. Burch Exercise mat
US20160240102A1 (en) * 2015-02-12 2016-08-18 Vikram BARR System for audio-tactile learning with reloadable 3-dimensional modules
FR3044121A1 (fr) * 2015-11-19 2017-05-26 Univ Paris 1 Pantheon-Sorbonne Equipement de realite augmentee et d'interface tangible
US9773347B2 (en) 2011-11-08 2017-09-26 Koninklijke Philips N.V. Interacting with a three-dimensional object dataset
US20180040261A1 (en) * 2016-08-03 2018-02-08 Megaforce Company Limited Human body cavity model
US10097817B2 (en) * 2016-08-03 2018-10-09 Megaforce Company Limited Double-image projection device projecting double images onto 3-dimensional ear canal model
US10325522B2 (en) * 2012-01-27 2019-06-18 University of Pittsburgh—of the Commonwealth System of Higher Education Medical training system and method of employing
US10354555B2 (en) * 2011-05-02 2019-07-16 Simbionix Ltd. System and method for performing a hybrid simulation of a medical procedure
US20210033842A1 (en) * 2018-04-27 2021-02-04 Hewlett-Packard Development Company, L.P. Nonrotating nonuniform electric field object rotation
CN114680894A (zh) * 2022-03-10 2022-07-01 马全胜 检测操作准确性的方法以及装置
CN116935730A (zh) * 2023-08-18 2023-10-24 大连鸿峰生物科技有限公司 一种基于塑化神经系统概观标本的立体展示方法

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7736149B2 (en) * 2004-09-29 2010-06-15 Towliat Faye F Operating room display and related methods
DE102005062611A1 (de) * 2005-12-23 2007-06-28 Burgkart, Rainer, Dr. med. Modell-Simulationsvorrichtung zum Simulieren von Eindringvorgängen
CN102314779B (zh) * 2010-06-30 2013-11-27 上海科技馆 禽类胚胎发育至孵化过程演示装置及其演示方法
DE102013019563B4 (de) 2013-11-22 2021-11-18 Audi Ag Verfahren zum Bereitstellen von Information über eine Umgebung an einem Smart-Gerät
CN105160977A (zh) * 2015-08-05 2015-12-16 成都嘉逸科技有限公司 一种人体解剖学3d教学系统
CN110087550B (zh) * 2017-04-28 2022-06-17 深圳迈瑞生物医疗电子股份有限公司 一种超声图像显示方法、设备及存储介质
CN109729325A (zh) * 2017-10-30 2019-05-07 王雅迪 实现动态跟踪旋转展览模型的实时投影系统
CN110619797A (zh) * 2018-06-20 2019-12-27 天津小拇指净化技术有限公司 智能化手术室示范教学系统
US10410542B1 (en) * 2018-07-18 2019-09-10 Simulated Inanimate Models, LLC Surgical training apparatus, methods and systems
CN115465006A (zh) * 2022-10-21 2022-12-13 西安外事学院 一种激光浮雕图像盲人可触可视感知实现方法及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5376948A (en) * 1992-03-25 1994-12-27 Visage, Inc. Method of and apparatus for touch-input computer and related display employing touch force location external to the display
US5400661A (en) * 1993-05-20 1995-03-28 Advanced Mechanical Technology, Inc. Multi-axis force platform
US6141000A (en) * 1991-10-21 2000-10-31 Smart Technologies Inc. Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US6915709B2 (en) * 2003-03-31 2005-07-12 Wacoh Corporation Force detection device
US20050246109A1 (en) * 2004-04-29 2005-11-03 Samsung Electronics Co., Ltd. Method and apparatus for entering information into a portable electronic device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3742935A (en) * 1971-01-22 1973-07-03 Humetrics Corp Palpation methods
US4254562A (en) * 1979-02-14 1981-03-10 David Murray Combination cardiovascular electronic display/teaching apparatus, system and methods of constructing and utilizing same
DE3638192A1 (de) * 1986-11-08 1988-05-19 Laerdal Asmund S As System und verfahren zum testen einer person in der ausuebung der cardiopulmonaren wiederbelebung (cpr) und zur bewertung von uebungen der cpr
DE3642088A1 (de) * 1986-12-10 1988-06-23 Wolfgang Brunner Anordnung zur messung von kraftverteilungen
WO1991004553A2 (fr) * 1989-09-18 1991-04-04 Paolo Antonio Grego Edizioni S.A.S. Tableau synoptique facilitant l'acquisition d'informations et/ou la localisation de certains points sur une carte
US5259764A (en) * 1991-04-29 1993-11-09 Goldsmith Bruce W Visual display apparatus for the display of information units and related methods
DE10017119A1 (de) 2000-04-06 2001-10-31 Fischer Brandies Helge Apparatur und Verfahren zur Messung der Kraftwirkung an Zähnen, Zahnmodellen und/oder Implantaten
AUPR118100A0 (en) * 2000-11-02 2000-11-23 Flinders Technologies Pty Ltd Apparatus for measuring application of pressure to an imitation body part
DE10217630A1 (de) * 2002-04-19 2003-11-13 Robert Riener Verfahren und Vorrichtung zum Erlernen und Trainieren zahnärztlicher Behandlungsmethoden

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6141000A (en) * 1991-10-21 2000-10-31 Smart Technologies Inc. Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5376948A (en) * 1992-03-25 1994-12-27 Visage, Inc. Method of and apparatus for touch-input computer and related display employing touch force location external to the display
US5400661A (en) * 1993-05-20 1995-03-28 Advanced Mechanical Technology, Inc. Multi-axis force platform
US6915709B2 (en) * 2003-03-31 2005-07-12 Wacoh Corporation Force detection device
US20050246109A1 (en) * 2004-04-29 2005-11-03 Samsung Electronics Co., Ltd. Method and apparatus for entering information into a portable electronic device

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060183099A1 (en) * 2005-02-14 2006-08-17 Feely Richard A Education and test preparation system, method and computer program product
US20080233550A1 (en) * 2007-01-23 2008-09-25 Advanced Fuel Research, Inc. Method and apparatus for technology-enhanced science education
US8469715B2 (en) * 2007-09-26 2013-06-25 Rose Marie Ambrozio Dynamic human model
US20090081627A1 (en) * 2007-09-26 2009-03-26 Rose Marie Ambrozio Dynamic Human Model
US20100159434A1 (en) * 2007-10-11 2010-06-24 Samsun Lampotang Mixed Simulator and Uses Thereof
US20100129781A1 (en) * 2008-11-21 2010-05-27 National Taiwan University Electrical bronze acupuncture statue apparatus
US9053641B2 (en) * 2009-06-11 2015-06-09 University of Pittsburgh—of the Commonwealth System of Higher Education Real-time X-ray vision for healthcare simulation
US20120156665A1 (en) * 2009-06-11 2012-06-21 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Real-Time X-Ray Vision for Healthcare Simulation
JP2012148059A (ja) * 2010-12-27 2012-08-09 Kochi Univ Of Technology タンジブルデバイスを備えている生体画像処理システム
US10354555B2 (en) * 2011-05-02 2019-07-16 Simbionix Ltd. System and method for performing a hybrid simulation of a medical procedure
US9773347B2 (en) 2011-11-08 2017-09-26 Koninklijke Philips N.V. Interacting with a three-dimensional object dataset
US10325522B2 (en) * 2012-01-27 2019-06-18 University of Pittsburgh—of the Commonwealth System of Higher Education Medical training system and method of employing
US8966681B2 (en) 2013-02-26 2015-03-03 Linda L. Burch Exercise mat
US20160240102A1 (en) * 2015-02-12 2016-08-18 Vikram BARR System for audio-tactile learning with reloadable 3-dimensional modules
FR3044121A1 (fr) * 2015-11-19 2017-05-26 Univ Paris 1 Pantheon-Sorbonne Equipement de realite augmentee et d'interface tangible
US10097817B2 (en) * 2016-08-03 2018-10-09 Megaforce Company Limited Double-image projection device projecting double images onto 3-dimensional ear canal model
US10115321B2 (en) * 2016-08-03 2018-10-30 Megaforce Company Limited Human body cavity model
US20180040261A1 (en) * 2016-08-03 2018-02-08 Megaforce Company Limited Human body cavity model
US20210033842A1 (en) * 2018-04-27 2021-02-04 Hewlett-Packard Development Company, L.P. Nonrotating nonuniform electric field object rotation
US12204085B2 (en) * 2018-04-27 2025-01-21 Hewlett-Packard Development Company, L.P. Nonrotating nonuniform electric field object rotation
CN114680894A (zh) * 2022-03-10 2022-07-01 马全胜 检测操作准确性的方法以及装置
CN116935730A (zh) * 2023-08-18 2023-10-24 大连鸿峰生物科技有限公司 一种基于塑化神经系统概观标本的立体展示方法

Also Published As

Publication number Publication date
CN1745404A (zh) 2006-03-08
DE10261673A1 (de) 2004-07-15
CN1745404B (zh) 2010-06-23
US20090162823A1 (en) 2009-06-25
EP1579406A1 (fr) 2005-09-28
WO2004061797A1 (fr) 2004-07-22
US8403677B2 (en) 2013-03-26
EP1579406B1 (fr) 2016-07-06
AU2003303603A1 (en) 2004-07-29

Similar Documents

Publication Publication Date Title
US8403677B2 (en) Interactive teaching and learning device
US6428323B1 (en) Medical examination teaching system
US20200242961A1 (en) Phonics exploration toy
CN105096670B (zh) 一种用于鼻胃管操作实训的智能沉浸式教学系统及装置
US11810473B2 (en) Optical surface tracking for medical simulation
DE69429491D1 (de) Vorrichtung und Verfahren zum Anzeigen von dreidimensionalen Daten eines Ultraschallechographiegeräts
KR940007721A (ko) 상호 작용하는 항공기 훈련 장치 및 그 방법
ATE286610T1 (de) Endoskopisches tutorisches system
US7889170B2 (en) Inner force sense presentation device, inner force sense presentation method, and inner force sense presentation program
WO2012160999A1 (fr) Système d'entraînement à la stéthoscopie et stéthoscope simulé
EP3789989B1 (fr) Simulateur laparoscopique
WO2009008750A1 (fr) Simulateur d'endoscope
ITBO20090111A1 (it) Metodo e apparato di addestramento chirurgico
WO1999017265A1 (fr) Procede et appareil d'entrainement chirurgical et de simulation d'operation chirurgicale
EP4258993A1 (fr) Procédé et système mis en oeuvre par ordinateur pour cartographier l'attention spatiale
ES2534140B1 (es) Procedimiento y dispositivo para el aprendizaje y entrenamiento de operaciones de cirugía laparoscópica e intervenciones similares
TWI687904B (zh) 互動式訓練及檢測裝置
Vishwanath The epistemological status of vision and its implications for design
Isaksson-Daun et al. Assessing mobility of blind and low-vision individuals through a portable virtual reality system and a comprehensive questionnaire
US20250316184A1 (en) Training platform
DE20221756U1 (de) Interaktive Lehr- und Lernvorrichtung
Fanghella et al. A Low-Cost, Moderately Fast System for Online Motion Tracking in Laparoscopic Surgery Training
HK40038871B (en) Laparoscopic simulator
Erdani et al. Developing a wii remote-based interactive handwriting board
Müller Challenges in Information Representation with Augmented Reality for Procedural Task Support

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION