WO2020138671A1 - Virtual-reality-based surgical evaluation system using a simulator, for otolaryngology and neurosurgery - Google Patents
Virtual-reality-based surgical evaluation system using a simulator, for otolaryngology and neurosurgery
- Publication number
- WO2020138671A1 (PCT/KR2019/013827)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- information
- moving information
- line
- moving
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/285—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
Definitions
- the present invention relates to an otorhinolaryngology and neurosurgery surgery evaluation system, and more specifically to a system that quantitatively evaluates the surgical skill of an otolaryngology or neurosurgery trainee based on virtual reality technologies such as VR (Virtual Reality), AR (Augmented Reality), and MR (Mixed Reality)
- the medical field is emerging as an application area for VR, which has so far developed mainly around games, and attempts to actively introduce VR in hospitals are underway.
- the VR technology industry likewise predicts that medical services will become a representative VR B2B market, and is expanding its sales efforts by attempting a variety of projects.
- Patent Document 1 discloses an augmented-reality-based laparoscopic surgery simulation system and a method using the same.
- the augmented-reality-based simulation system for laparoscopic surgery includes a laparoscopic surgical tool for generating first laparoscopic surgical information; a simulator module provided with a human body model therein, which generates second laparoscopic surgical information when it senses that the laparoscopic surgical instrument is cutting or being inserted into the human body model; an information processing module for generating augmented reality information based on the first and second laparoscopic surgical information; and augmented reality glasses that output the augmented reality information.
- the learner simulates laparoscopic surgery and compares previously stored image information of laparoscopic surgery with the surgery the learner has simulated, so that the learner can evaluate how closely the simulated surgery matches the stored image information. The surgical technique required for laparoscopic surgery can thus be polished effectively, and the educator can easily use the augmented reality information to produce educational 3D content for laparoscopic surgery.
- there is also Patent Document 2 (publication number 10-2018-0123310, hereinafter referred to as Patent Document 2).
- Patent Document 2 relates to a laparoscopic surgical education system using augmented reality that gives the user the feeling of actually performing an operation and allows practice while being guided through the surgical method.
- a laparoscopic surgical practice system using virtual reality glasses is provided. It overcomes the difficulty of practicing in person by giving students an opportunity to handle virtual internal organs, and therefore has the advantage of providing effective, if indirect, education to students in medical training.
- Patent Document 3 is a "virtual reality training system and method for dentistry" (Publication No. 2003-0044909, hereinafter referred to as Patent Document 3).
- Patent Document 3 relates to a virtual reality practice system that detects data on the spatial position of a real element that can be held and used in the hand, displays a three-dimensional rendering of a virtual object on a screen, processes the spatial position data to provide a virtual instrument corresponding to the real element's actual spatial location, makes the virtual instrument operate on the virtual object, and models the interaction between the virtual object and the virtual instrument so as to capture procedural movements in dentistry.
- the hand-held element is a tactile human-machine interface (IHM) device having an actuator that is controlled to provide force feedback to the user holding the real element whenever the virtual instrument interacts with the virtual object.
- Patent Document 4 is a "virtual surgery simulation apparatus and its operation method" (Publication No. 10-2016-0092425, hereinafter referred to as Patent Document 4).
- the virtual surgical simulation apparatus includes a control unit that performs virtual surgery on a patient using a 3D image of the patient's surgical target area; an image processing unit that generates a surgical image including at least part of the virtual surgery; and a communication unit that transmits at least one of the surgical image and the surgical results.
- the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator according to the present invention is devised to solve the conventional problems described above, and presents the following problems to be solved.
- the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator according to the present invention provides the following means for solving the above-mentioned problems.
- a surgical evaluation system of a virtual-reality-based otolaryngology and neurosurgery simulator includes: goggles mounted on the face of a medical practitioner to visually provide spatial information to the practitioner; a controller held in the hand of the medical practitioner to map moving information of the motion of the practitioner's hand into the spatial information; a spatial information receiving unit receiving the spatial information provided by the goggles; a moving information receiving unit receiving the moving information of the medical practitioner mapped by the controller; a reference data setting unit for acquiring and setting, in advance, preset reference data for the moving information on the spatial information; and an evaluation measurement unit receiving the preset reference data from the reference data setting unit and comparing the moving information of the medical practitioner on the spatial information to calculate a quantitative score.
- the preset reference data of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator according to the present invention may be characterized in that moving information provided by a plurality of specialists is obtained in advance, and the reference data is set to the average value of the traces, in three-dimensional coordinates, of the moving information produced by the plurality of specialists on the spatial information.
- the goggles of the surgical evaluation system of a virtual-reality-based otolaryngology and neurosurgery simulator according to the present invention may be characterized in that they provide at least one of virtual reality (VR), augmented reality (AR), and mixed reality (MR).
- the evaluation measurement unit of the surgical evaluation system of the virtual reality-based otolaryngology and neurosurgery simulator includes: a spatial coordinate recognition unit for reading and acquiring the spatial information in three-dimensional spatial coordinates; A moving coordinate recognition unit for acquiring 3D spatial coordinates of the moving information in the 3D spatial coordinates of the spatial information; And a dimension extraction unit for extracting a predetermined dimension for evaluating the moving information on a predetermined surgical site in the spatial information.
- the dimension extraction unit of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator may set the predetermined dimension to a plurality of one-dimensional lines, and the evaluation measurement unit may further comprise a point triggering unit in which the predetermined surgical site is set as a one-dimensional touch line and a fail line is set at a predetermined depth below the touch line.
- the point triggering unit of the surgical evaluation system of the virtual reality-based otolaryngology and neurosurgery simulator includes: a target point setting unit configured to set the predetermined surgical site and provide the spatial information through the goggles; A touch line recognition unit recognizing whether the moving information of the medical practitioner input by the controller touches the touch line; And a fail line recognition unit recognizing whether the moving information of the medical practitioner input by the controller touches the fail line formed under the touch line.
- the point triggering unit of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator comprises: a touch time recognition unit for acquiring the total time that the moving information of the medical practitioner stays at the predetermined surgical site; a depth weighting unit that continuously calculates a score for the moving information according to the depth to which it is inserted into the space extending from the touch line to the fail line, subtracting the score in proportion to that depth; and a depth dispersion measurement unit that measures the variance of the depth of the moving information inserted into the space from the touch line to the fail line, and assigns a score to the moving information in inverse proportion to the variance.
- the dimension extraction unit of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator may set the predetermined dimension in two dimensions, and the evaluation measurement unit may further comprise a line triggering unit that sets the predetermined surgical site as a two-dimensional area and calculates the distance of the moving information from the origin of the two-dimensional area.
- the line triggering unit of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator includes: a target area setting unit for setting the origin of the two-dimensional area as a target region for the predetermined surgical site; a coordinate setting unit configured to set the two-dimensional area as x-y coordinates and recognize the coordinates of the moving information within those x-y coordinates; a deviation diameter calculation unit for measuring the distance by which the moving information is spaced from the origin on the x-y coordinates; and a coordinate shift unit configured to shift the two-dimensional area so that the target area setting unit resets the target region for the surgical site.
- the line triggering unit of the surgical evaluation system of the virtual-reality-based otolaryngology and neurosurgery simulator may further comprise a touch counting unit that counts the number of times the moving information of the medical practitioner touches the x-y coordinates of the two-dimensional area, to recognize whether the moving information touches the x-y coordinates of the two-dimensional area multiple times.
- the virtual reality-based otolaryngology and neurosurgery simulator surgical evaluation system according to the present invention having the above configuration provides the following effects.
- FIG. 1 is a conceptual diagram of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 2 is a block diagram of a surgical evaluation system of a virtual reality based otorhinolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 3 is a block diagram showing sub-components of an evaluation measurement unit that is one component of a surgical evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 4 is an exemplary screen of a virtual reality that a medical trainee can access through VR or the like for use of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 5 is an exemplary screen of a virtual reality that a medical trainee can access through AR or MR for use in a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 6 is a block diagram showing sub-components of a point triggering unit which is a component of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 7 is a front cross-sectional view showing a touch line and a fail line for a target point of a point triggering unit of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 8 is a side cross-sectional view showing a touch line and a fail line for a target point of a point triggering unit of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 9 is a block diagram showing sub-components of a line triggering unit which is a component of a surgical evaluation system of a virtual reality based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- FIG. 10 is a conceptual diagram illustrating coordinates of a target region of a line triggering unit, which is a component of a surgical evaluation system of a virtual reality-based otolaryngology and neurosurgery simulator according to an embodiment of the present invention.
- the surgical evaluation system of the virtual-reality-based otorhinolaryngology and neurosurgery simulator according to the present invention, as shown in FIG. 1, belongs to the technical field of systems that allow medical students, interns, residents, general practitioners, specialists-in-training, or other persons working in the medical system (hereinafter, 'medical practitioners') to experience surgery through spatial information presented in virtual reality, without using physical objects, and to be evaluated quantitatively through that experience.
- the medical practitioner wears goggles (11), which provide a virtual reality; the technical idea disclosed is to quantitatively evaluate how closely the practitioner's hand movements in this virtual reality follow the procedure normally performed by medical professionals who are proficient in the operation.
- as shown in FIG. 1, the surgical evaluation system of the virtual-reality-based otorhinolaryngology and neurosurgery simulator according to the present invention may include goggles 11 that provide VR and a controller 12 that reads the motion of the medical practitioner's hand, that is, moving information, in three dimensions.
- the goggles 11 are configured to provide spatial information to the medical practitioner while being mounted on the front face of the medical practitioner.
- various spatial information is provided from a storage in which raw data 13 are stored; this spatial information is three-dimensional stereoscopic information about the nasal cavity of the otorhinolaryngology system, as shown in FIGS. 4 and 5.
- At least one virtual reality among virtual reality (VR), augmented reality (AR), and mixed reality (MR) may be provided.
- the controller 12, as shown in FIG. 1, is held in the hand of the medical practitioner to read the movement of the practitioner's hand, that is, moving information, and maps it into the spatial information provided by the goggles 11 described above.
- all the three-dimensional moving information of the medical practitioner's hands that is necessary to proceed with the surgery can be read.
- the controller 12 should be able to capture fine motion that moves within roughly the volume of an adult male's fist, rather than covering an area with a large radius of action.
- the spatial information receiving unit 100 is a configuration that receives the spatial information provided by the goggles 11.
- it receives the 3D stereoscopic information that is visually presented to the medical practitioner through the goggles 11.
- the spatial information receiving unit 100 may receive the visual elements of the spatial information provided by the virtual reality providing unit 10 to the goggles 11 as they are, or may abbreviate them and accommodate only the spatial coordinate information rather than the visual elements.
- the moving information receiving unit 200 is configured to receive moving information of a medical practitioner mapped by the controller 12.
- the moving information receiving unit 200 is computationally connected to the controller 12 and accommodates the motion of the practitioner's hands, that is, the read moving information, in three-dimensional spatial coordinates.
- the reference data setting unit 300 obtains and sets, in advance, the desirable moving information that the medical practitioner should reproduce, that is, the preset reference data, on the spatial information received from the goggles 11.
- the preset reference data correspond to the average value of traces, in three-dimensional coordinates, of the desirable hand movement for the surgical site on which the practitioner is currently being tested. More specifically, the moving information of a plurality of specialists on the spatial information is obtained in three-dimensional coordinates, and the average value of the traces of that moving information is set as the preset reference data.
- the preset reference data may be obtained by having a plurality of specialists each wear the goggles 11 and operate the controller 12.
- the evaluation measurement unit 400 receives the preset reference data from the reference data setting unit 300, compares the moving information of the medical practitioner on the spatial information, as described above, with the preset reference data (the moving information of professional medical staff), and calculates a quantitative score.
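The averaging of specialist traces and the comparison against a trainee trace can be sketched as follows. This is a minimal illustration, not the patented implementation: the `scale` penalty factor, the linear deduction, and the assumption that all traces are resampled to the same number of points are choices made here for clarity.

```python
import numpy as np

def build_reference_trace(specialist_traces):
    """Average the 3-D hand traces recorded from several specialists.

    specialist_traces: shape (n_specialists, n_samples, 3); each trace is
    assumed to have been resampled to the same number of points.
    """
    return np.mean(np.asarray(specialist_traces, dtype=float), axis=0)

def quantitative_score(trainee_trace, reference_trace, max_score=100.0, scale=10.0):
    """Score a trainee trace by its mean point-wise distance to the reference.

    `scale` converts distance (scene units) into deducted points; it is a
    tunable assumption, not a value specified in the patent.
    """
    diffs = np.asarray(trainee_trace, dtype=float) - reference_trace
    dists = np.linalg.norm(diffs, axis=1)
    return max(0.0, max_score - scale * float(dists.mean()))
```

A trace identical to the reference scores `max_score`; the score then falls linearly with the average deviation from the specialists' mean trace.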
- the evaluation measurement unit 400 may include a spatial coordinate recognition unit 410, a moving coordinate recognition unit 420, and a dimension extraction unit 430.
- the spatial coordinate recognition unit 410 reads and acquires the spatial information shown in FIG. 4 as three-dimensional spatial coordinates.
- the spatial information of FIGS. 4 and 5 is obtained by reflecting microscopic coordinate information along the three-dimensional x-y-z axes, in addition to the shape visually confirmed through the goggles 11.
- the moving coordinate recognition unit 420 acquires, as time changes, the three-dimensional spatial coordinates of the moving information within the spatial information to which the moving information is mapped.
- the dimension extraction unit 430 is a configuration for extracting a predetermined dimension used to evaluate the moving information at a predetermined surgical site within the spatial information.
- the fail line of FIG. 7 can be set at a blood vessel that must not be touched within the surgical site, and the touch line can be set at the line that the medical practitioner should incise and cut cleanly with the scalpel.
- the predetermined surgical site can thus be viewed as a touch line, and, as described above, in order to evaluate the hand motion of the medical practitioner, that is, the moving information, the predetermined dimension may be set as a straight line, that is, one-dimensional, as shown in FIG. 8.
- the solid black line in FIG. 10 denotes a preferred incision line along which the medical practitioner is required to cut a specific skin surface in the nasal cavity into a gentle S-shape with the scalpel.
- a plane orthogonal to the corresponding incision line, that is, a two-dimensional plane is extracted.
- the predetermined dimension means the dimension extracted for quantitative evaluation of the medical practitioner's hand gesture, that is, the moving information.
- the evaluation measurement unit 400 may further include a point triggering unit 440.
- the dimension extraction unit 430 sets the predetermined dimension to a plurality of one-dimensional lines; in this case, the point triggering unit 440, as described above, sets the predetermined surgical site as a one-dimensional touch line and sets a fail line at a predetermined depth below the touch line.
- the predetermined depth may be set arbitrarily for the ENT surgical site; for example, the surface of an artery, vein, or nerve lying beneath the swollen skin tissue may be set as the fail line.
- the point triggering unit 440 may include a target point setting unit 441, a touch line recognition unit 442, and a fail line recognition unit 443.
- the target point setting unit 441 sets the surgical site, that is, the predetermined surgical site, and visually provides the spatial information to the medical practitioner through the goggles 11.
- the target point setting unit 441 visually provides the crystalline species in the nasal cavity through the goggles 11.
- the medical practitioner can determine the existence of these tissues only by visual observation, and through the visual observation, the medical practitioner can find the crystalline species.
- the touch line recognition unit 442 recognizes whether the moving information of the medical practitioner input through the controller touches the touch line.
- the fail line recognition unit 443 recognizes whether the moving information of the medical practitioner input through the controller 12, that is, the 3D coordinate information of the hand gesture, touches the fail line formed below the touch line.
- in other words, it recognizes whether an area that must not be touched, such as a nerve, artery, or vein, has been touched.
- the fail line recognition unit 443 only recognizes whether the fail line has been touched; if it has, the operation of the medical practitioner is judged to have failed.
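The one-dimensional touch-line/fail-line logic above can be sketched roughly as follows. The depth convention (0 at the touch line, positive downward, the fail line at `fail_depth`) is an assumption introduced here for illustration.

```python
def classify_depth(depth, fail_depth):
    """Classify one scalpel sample by its depth below the touch line.

    depth      : penetration below the touch line (negative = above the
                 touch line, i.e. no contact yet; 0 = at the touch line).
    fail_depth : depth at which the fail line (vessel/nerve surface) lies.
    """
    if depth < 0:
        return "no_touch"   # scalpel has not reached the touch line
    if depth >= fail_depth:
        return "fail"       # fail line touched: the attempt is judged failed
    return "touch"          # cutting within the permitted band
```

Any single `"fail"` sample is enough to mark the whole attempt as failed, matching the all-or-nothing behaviour of the fail line recognition unit.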
- the point triggering unit 440 may further include a touch time recognition unit 444, a depth weighting unit 445, and a depth dispersion measurement unit 446.
- the area to be operated on should be incised smoothly with the scalpel at an appropriate speed, which must take into account the sharpness of the scalpel blade, the elasticity of the surgical site, tissue bleeding, and moisture evaporation.
- because the speed of the incision matters, the touch time recognition unit 444 acquires the total time that the medical practitioner's moving information stays at the surgical site.
- the touch time recognition unit 444 thus measures the time during which the medical practitioner wields the scalpel and makes the cut in the virtual space.
- the touch line recognition unit 442 and the fail line recognition unit 443 make it possible to determine how well the cut follows the touch line and whether only the touch line is touched. The depth weighting unit 445, in turn, imposes penalties proportionally: the deeper the scalpel enters beyond the touch line, the larger the deduction.
- in FIG. 8(a), only the recognition of the touch line and the recognition of the fail line are determined.
- in FIG. 8(b), the deduction is continuously increased according to the depth from the touch line toward the fail line, as set by the depth weighting unit 445.
- the depth dispersion measurement unit 446 measures the variance of the depth of the moving information inserted in the space from the touch line to the fail line, and scores the moving information in inverse proportion to that variance.
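The depth-proportional deduction (depth weighting unit 445) and the variance-based scoring (depth dispersion measurement unit 446) might be sketched as below. The patent only states proportionality and inverse proportionality, so the linear penalty and the `k` constant are illustrative assumptions.

```python
from statistics import pvariance

def depth_weighted_score(depths, fail_depth, max_score=100.0):
    """Deduct points in proportion to how deep each sample sits between the
    touch line (depth 0) and the fail line (depth = fail_depth)."""
    clamped = [min(max(d, 0.0), fail_depth) for d in depths]
    penalty = sum(clamped) / (len(clamped) * fail_depth)  # fraction in 0.0..1.0
    return max_score * (1.0 - penalty)

def depth_variance_score(depths, max_score=100.0, k=1.0):
    """Score in inverse proportion to the variance of penetration depth:
    a steady cut (low variance) scores higher than an uneven one."""
    return max_score / (1.0 + k * pvariance(depths))
```

A cut that hugs the touch line scores the maximum on both measures; pressing toward the fail line lowers the first score, and an uneven depth profile lowers the second.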
- the evaluation measurement unit 400 may further include a line triggering unit 450 together with or separately from the point triggering unit 440.
- the dimension extraction unit 430 sets the predetermined dimension in two dimensions, and the line triggering unit 450 sets the predetermined surgical site as a two-dimensional area and calculates the distance (l) of the moving information from the origin (c) of the two-dimensional area.
- the solid line in FIG. 10 denotes the preset reference data, that is, the desirable incision line traced when a plurality of specialists perform the surgery.
- the line triggering unit 450 may include a target area setting unit 451, a coordinate setting unit 452, a deviation diameter calculation unit 454, and a coordinate shift unit 455.
- the target area setting unit 451 sets the origin c of the two-dimensional area as the target region for the predetermined surgical site.
- the incision line is set as the origin (c), and a virtual two-dimensional plane perpendicular to the incision line, that is, an area, is extracted as the predetermined dimension.
- the coordinate setting unit 452 sets the two-dimensional area as x-y coordinates, as described above, and the moving information of the medical practitioner, that is, the point at which the scalpel passes through the virtual nasal cavity, is recognized as x-y coordinates as in FIG. 10.
- c lies on the solid line in FIG. 10, and the two-dimensional area corresponds to the area w crossing the solid line.
- the deviation diameter calculation unit 454 measures the distance l from c in the x-y coordinates of the moving information recognized by the coordinate setting unit 452.
- the coordinate shift unit 455 shifts the two-dimensional area to cause the target area setting unit 451 to reset the target region for the surgical site.
- the line triggering unit 450 may further include a touch counting unit 453.
- the touch counting unit 453 counts the number of times the moving information of the medical practitioner touches the x-y coordinates of the two-dimensional area, to recognize whether the moving information touches those coordinates multiple times.
- the number of incisions is counted because the incision line shown in FIG. 10 must not be passed over twice; that is, the incision should not be performed twice.
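The deviation distance l and the multiple-touch check of the line triggering unit can be sketched as follows. The `radius` threshold deciding whether a sample counts as touching the plane near the incision line is an assumption added for illustration, as the patent does not specify a tolerance.

```python
import math

def deviation_distance(point_xy):
    """Distance l of a scalpel sample from the origin c of the shifted
    two-dimensional plane (c lies on the specialists' incision line)."""
    x, y = point_xy
    return math.hypot(x, y)

def count_plane_touches(samples_xy, radius):
    """Count separate crossings of the plane within `radius` of the origin;
    more than one crossing means the incision was made twice, which the
    system treats as an error."""
    touches, inside = 0, False
    for p in samples_xy:
        hit = deviation_distance(p) <= radius
        if hit and not inside:  # entering the target region anew
            touches += 1
        inside = hit
    return touches
```

As the coordinate shift unit moves the plane along the incision line, the same two functions can be reapplied at each shifted origin.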
- the moving time measuring unit 456, similarly to the touch time recognition unit 442, measures the cutting time during which the scalpel moves along the incision line of FIG. 10.
- the reference data backup unit 500 is a storage that backs up the information previously obtained from a plurality of specialists as described above, that is, the moving information describing a desirable scalpel movement that medical practitioners should imitate; the preset reference data may be physically divided and stored in such storage.
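As a rough sketch (not the patent's actual implementation), the deviation distance (l) described above amounts to a Euclidean distance in the x-y plane of the two-dimensional area between a recorded moving-information coordinate and the origin (c); the function name and example coordinates below are hypothetical:

```python
import math

def deviation_distance(point, origin):
    """Distance (l) between a recorded moving-information x-y coordinate
    and the origin (c) of the two-dimensional area."""
    return math.hypot(point[0] - origin[0], point[1] - origin[1])

# Hypothetical example: origin c at (0, 0), scalpel recorded at (3, 4)
l = deviation_distance((3.0, 4.0), (0.0, 0.0))  # 5.0
```

A smaller l means the trainee's movement stays closer to the desirable incision line of the reference data.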
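The touch counting performed by unit 453 can likewise be sketched. Assuming moving information is quantized onto grid cells of the two-dimensional area (the `resolution` parameter and all names are illustrative, not from the patent), a repeated touch is a cell crossed more than once:

```python
from collections import Counter

def count_repeated_touches(touch_points, resolution=1.0):
    """Count grid cells of the 2D area that the moving information
    touches more than once (i.e. the incision passes the same spot twice)."""
    cells = Counter(
        (round(x / resolution), round(y / resolution)) for x, y in touch_points
    )
    return sum(1 for n in cells.values() if n > 1)

# The path returns to the starting cell, so one cell is touched twice
repeats = count_repeated_touches([(0.0, 0.0), (1.0, 0.1), (0.0, 0.05)])  # 1
```

A nonzero count would indicate that the incision was performed twice at the same point, which the evaluation penalizes.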
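Finally, the cutting time measured by the moving time measuring unit 456 can be approximated under the assumption that the simulator emits timestamped samples flagged as on or off the incision line (this sampling format is an assumption for illustration only):

```python
def cutting_time(samples):
    """Total time the scalpel spends moving along the incision line.

    samples: ordered list of (timestamp_seconds, on_line) pairs.
    Sums each interval whose starting sample lies on the line.
    """
    total = 0.0
    for (t0, on_line), (t1, _) in zip(samples, samples[1:]):
        if on_line:
            total += t1 - t0
    return total

trace = [(0.0, True), (0.5, True), (1.0, False), (1.5, True), (2.0, True)]
# on-line intervals: 0.0-0.5, 0.5-1.0, 1.5-2.0 -> 1.5 s
```

This per-interval summation tolerates the scalpel briefly leaving the incision line, which matches the idea of scoring how long the cut actually follows the reference line.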
Abstract
The present invention relates to a system for quantitatively evaluating the level of skill in otolaryngological surgery or neurosurgery on the basis of virtual reality, the system comprising: goggles for visually providing spatial information to a medical trainee; a control means for mapping, into the spatial information, moving information concerning the movement of the medical trainee's hands; a spatial information receiving unit for receiving the spatial information provided by the goggles; a moving information receiving unit for receiving the medical trainee's moving information mapped by the control means; a reference data setting unit for receiving in advance and setting, on the spatial information, preset reference data associated with the moving information; and an evaluation measuring unit for calculating a quantitative score of the medical trainee's moving information.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2018-0170079 | 2018-12-27 | ||
| KR1020180170079A KR102143784B1 (ko) | 2018-12-27 | 2018-12-27 | 가상현실 기반 이비인후과 및 신경외과 시뮬레이터의 수술 평가 시스템 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020138671A1 (fr) | 2020-07-02 |
Family
ID=71126540
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/013827 Ceased WO2020138671A1 (fr) | 2018-12-27 | 2019-10-21 | Système d'évaluation d'actes chirurgicaux basé sur la réalité virtuelle, utilisant un simulateur, pour l'oto-laryngologie et la neurochirurgie |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102143784B1 (fr) |
| WO (1) | WO2020138671A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113077662A (zh) * | 2021-04-03 | 2021-07-06 | 刘铠瑞 | 一种基于5g网络技术应用的腹腔镜手术及培训系统 |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102532795B1 (ko) * | 2020-07-31 | 2023-05-15 | 전남대학교산학협력단 | 가상현실 기반 치의학 실습교육 대상자의 수행 행동 데이터 수집 시스템 및 방법 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20100106834A (ko) * | 2009-03-24 | 2010-10-04 | 주식회사 이턴 | 증강현실을 이용한 수술 로봇 시스템 및 그 제어 방법 |
| KR101166554B1 (ko) * | 2011-01-14 | 2012-07-18 | 가톨릭대학교 산학협력단 | 소작 애니메이션 효과 생성 장치 및 방법 |
| KR20150132681A (ko) * | 2014-05-15 | 2015-11-26 | 리치앤타임(주) | 다중 접속을 통하여 다수의 훈련자의 개별적인 가상훈련공간의 인식과 공유된 가상작업공간에서 집단적이며 조직적인 협력훈련이 가능한 몰입식 네트워크 가상훈련 시스템의 클라이언트 시스템을 구성하는 네트워크 가상훈련 처리장치 및 이를 이용한 몰입식 네트워크 가상훈련 방법. |
| JP2016500157A (ja) * | 2012-11-13 | 2016-01-07 | エイドス−メディスン リミティッド・ライアビリティ・カンパニー | ハイブリッド医療用腹腔鏡シミュレータ |
| KR101887805B1 (ko) * | 2017-03-23 | 2018-08-10 | 최재용 | 증강현실 기반의 복강경 수술용 시뮬레이션 시스템 및 이를 이용한 방법 |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2808366B1 (fr) | 2000-04-26 | 2003-12-19 | Univ Paris Vii Denis Diderot | Procede et systeme d'apprentissage en realite virtuelle, et application en odontologie |
| TWI605795B (zh) * | 2014-08-19 | 2017-11-21 | 鈦隼生物科技股份有限公司 | 判定手術部位中探針位置之方法與系統 |
| KR20160092425A (ko) | 2015-01-27 | 2016-08-04 | 국민대학교산학협력단 | 가상 수술 시뮬레이션 장치 및 그 동작 방법 |
| KR20180123310A (ko) | 2017-05-08 | 2018-11-16 | 서정훈 | 증강현실을 이용한 복강경 수술 교육시스템 |
- 2018-12-27: KR application KR1020180170079A granted as patent KR102143784B1 (active)
- 2019-10-21: PCT application PCT/KR2019/013827 published as WO2020138671A1 (ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| KR102143784B1 (ko) | 2020-08-12 |
| KR20200080534A (ko) | 2020-07-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10902677B2 (en) | Interactive mixed reality system and uses thereof | |
| EP2915157B1 (fr) | Système d'entraînement à l'injection | |
| Banerjee et al. | Accuracy of ventriculostomy catheter placement using a head-and hand-tracked high-resolution virtual reality simulator with haptic feedback | |
| Gallagher et al. | PicSOr: an objective test of perceptual skill that predicts laparoscopic technical skill in three initial studies of laparoscopic performance | |
| Linke et al. | Assessment of skills using a virtual reality temporal bone surgery simulator | |
| US20030031993A1 (en) | Medical examination teaching and measurement system | |
| WO2008099028A1 (fr) | Système de simulation pour l'entraînement à la chirurgie arthroscopique | |
| Ho et al. | Virtual reality myringotomy simulation with real‐time deformation: development and validity testing | |
| Müller et al. | Virtual reality in surgical arthroscopic training | |
| WO2018062722A1 (fr) | Système de simulation de formation en acupuncture | |
| WO2021125547A1 (fr) | Système de formation à la réanimation cardio-pulmonaire fondé sur la réalité virtuelle | |
| Wei et al. | Augmented optometry training simulator with multi-point haptics | |
| CN115328317A (zh) | 一种基于虚拟现实的质控反馈系统及方法 | |
| WO2020138671A1 (fr) | Système d'évaluation d'actes chirurgicaux basé sur la réalité virtuelle, utilisant un simulateur, pour l'oto-laryngologie et la neurochirurgie | |
| Crossan et al. | A horse ovary palpation simulator for veterinary training | |
| Simon et al. | Design and evaluation of UltRASim: An immersive simulator for learning ultrasound-guided regional anesthesia basic skills | |
| WO2020145455A1 (fr) | Système de simulateur d'apprentissage virtuel basé sur la réalité augmentée pour intervention chirurgicale sur le système cardiovasculaire, et procédé associé | |
| RU2687564C1 (ru) | Система обучения и оценки выполнения медицинским персоналом инъекционных и хирургических минимально-инвазивных процедур | |
| JP2021043443A (ja) | 腹腔鏡シミュレータ | |
| Wheeler et al. | Interactive computer-based simulator for training in blade navigation and targeting in myringotomy | |
| Kabuye et al. | A mixed reality system combining augmented reality, 3D bio-printed physical environments and inertial measurement unit sensors for task planning | |
| KR20040084243A (ko) | 인공고관절 가상 시술 시뮬레이션 시스템 | |
| US20250221772A1 (en) | 3-dimensional tracking and navigation simulator for neuro-endoscopy | |
| Kaluschke et al. | The impact of 3D stereopsis and hand-tool alignment on effectiveness of a VR-based simulator for dental training | |
| Crossan et al. | Multimodal feedback cues to aid veterinary training simulations |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19903445 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/09/2021) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 19903445 Country of ref document: EP Kind code of ref document: A1 |