WO2016195401A1 - 3D glasses system for surgical operation using augmented reality - Google Patents
3D glasses system for surgical operation using augmented reality
- Publication number
- WO2016195401A1 (PCT/KR2016/005868)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- modeling
- image
- surgical
- glasses
- augmented reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
Definitions
- The present invention relates to a 3D glasses system for surgical operation using augmented reality, and more particularly to a surgical 3D glasses system based on 3D imaging glasses that provide an augmented reality function.
- Conventionally, the lesion is checked visually through an incision of the affected area; in addition to the damage to muscles and tissues caused while securing a field of vision through the incision, the patient suffers considerably and needs a considerable recovery time after surgery.
- In MIS (Minimally Invasive Surgery), a C-arm (portable X-ray) is used so that the implant and surgical instruments can be inserted in the desired direction and position; however, because X-rays must be irradiated several times during surgery, there is a problem that the surgeon and operating room personnel are exposed to radiation.
- There is also a 3D surgical navigation system, in which a 3D CT (Computed Tomography) image is output to an operating room monitor and surgery is performed while viewing it.
- However, the surgeon must keep referring to the 3D CT image on the monitor during the operation, the configuration of the operating room equipment becomes complicated, and the cost of the surgical equipment is considerable.
- Augmented Reality (AR) is a technology that lets computer graphics (CG) coexist with the real world so that the graphics appear to exist in the real environment.
- Augmented reality complements the real world with the virtual world: it uses a virtual environment made of computer graphics, but the real environment plays the main role. That is, an HMD is mainly used to provide additional information needed for the real environment by overlaying a 3D virtual image on the live scene the user is viewing.
- An HMD (Head Mounted Display) literally refers to a display worn on the head, and can be classified into optical-based and video-based HMDs.
- An optical-based HMD passes the real-world view directly to the user without modification, and a computer-generated virtual image is combined with it.
- The optical-based HMD generates additional information using head position and orientation information transmitted to the computer system, and this additional information is combined with the real environment image in the semi-transparent HMD.
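- As an illustrative aside (not part of the original disclosure): the sketch below shows one conventional way such head position and orientation data could be used to place computer-generated information so that it lines up with the real scene in a semi-transparent display. The pinhole projection model, the intrinsics matrix K, and all function names are assumptions for this example.

```python
# Hedged sketch: projecting virtual 3D content into display pixel coordinates
# from a tracked head pose. head_R/head_t give the head pose in world coordinates;
# K is an assumed pinhole intrinsics matrix for one eye of the display.
import numpy as np

def project_points(points_world: np.ndarray, head_R: np.ndarray,
                   head_t: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project Nx3 world-space points into pixel coordinates for one eye."""
    pts_head = (points_world - head_t) @ head_R   # world -> head/display frame (inverse rotation)
    pix_h = pts_head @ K.T                        # apply pinhole intrinsics
    return pix_h[:, :2] / pix_h[:, 2:3]           # perspective divide
```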
- The biggest difference of a video-based HMD from an optical-based HMD is the presence of a camera.
- The HMD acquires the real-world image through the camera.
- The composite image obtained by combining the acquired image with the computer-generated virtual image is then displayed to the user through the HMD.
- Korean Patent Application Publication No. 2014-0112207 (Augmented Reality Image Display System and Surgical Robot System Including the Same) relates to an augmented reality image display system comprising: a slave system that performs a surgical operation on a patient; a master system that controls the surgical operation of the slave system; an imaging system that generates a virtual image of the inside of the patient's body; a camera that acquires a real image containing a plurality of markers attached to the patient's body or to a human body model; an augmented reality image generating unit that detects the plurality of markers in the real image, estimates the camera position and gaze direction using the detected markers, and generates an augmented reality image by overlaying, on the real image, the region of the virtual image corresponding to the estimated camera position and gaze direction; and a display that displays the augmented reality image.
- Korean Patent Laid-Open Publication No. 2014-0112207 thus uses a camera to obtain a real image containing the plurality of markers attached to the patient's body or to a human body model. With this approach, the HMD mixes a real-world image with a virtual image, which causes problems such as a delay in which the mixed image cannot follow the user's movement, and a calibration problem introduced by the camera also remains to be solved.
- Accordingly, an object of the present invention is to provide a surgical 3D glasses system using augmented reality that enables MIS in which the incision of the affected area is minimized.
- Another object is to provide a 3D glasses system in which 3D modeling of the inside of the human body, 3D modeling of the implants inserted into the human body, and 3D modeling of the surgical instruments used to place the implants can be provided to the surgical 3D glasses as augmented reality images.
- A further object is to provide a surgical 3D glasses system using augmented reality in which the 3D imaging glasses receive position information from three or more sensors mounted inside the human body or on the skin surface, match each of the three or more reference points set in the 3D modeling of the body to the corresponding sensor position, and thereby display the 3D modeling accurately superimposed on the actual image of the human body projected through the glasses.
- The present invention solves the delay caused by mixing real-world and virtual images, and the calibration problem caused by the camera, both of which arise when augmented reality is implemented with a conventional video-based HMD that acquires the real-world image through a camera.
- To this end, an optical-based HMD is used so that the patient's actual image is projected as it is, and the position and size of markers attached to the patient's body are not extracted from image information captured by a camera.
- Instead, using the position information received from the sensors mounted on or inside the patient's body, the 3D modeling of the inside of the body is accurately superimposed on the actual image of the patient projected through the 3D image glasses, and the result is expressed as an augmented reality image; it is an object of the present invention to provide such a surgical 3D glasses system.
- To achieve the above objects, the present invention includes: a first sensor unit attached to a surgical object inserted into the human body, for transmitting first position information;
- a second sensor unit attached to a surgical instrument used to apply the surgical object to the human body, for transmitting second position information;
- a 3D modeling storage and transmission unit for transmitting, to the 3D image glasses, 3D modeling of the inside of the body in which three or more reference points are set, 3D modeling of the surgical object at the point corresponding to the first position information, and 3D modeling of the surgical instrument at the point corresponding to the second position information;
- an affected part reference point sensor unit consisting of the same number of sensors as the three or more reference points set in the 3D modeling of the body, mounted on the human body and transmitting the respective sensor position information;
- and 3D image glasses that superimpose the 3D modeling of the inside of the body on the basis of the three or more sensor positions mounted on the human body, generate an augmented reality image, and display it.
- Three or more sensors may be attached to each of the surgical object and the surgical instrument to transmit sensor position information, and the 3D modeling storage and transmission unit may transmit, to the 3D image glasses, the 3D modeling of the surgical object and the surgical instrument at the points corresponding to the received sensor position information.
- The 3D modeling storage and transmission unit pre-stores the 3D modeling of the surgical object and the surgical instrument to be used, and may transmit to the 3D image glasses the 3D modeling of the surgical object at the position corresponding to changes in the first position information and the 3D modeling of the surgical instrument at the position corresponding to changes in the second position information.
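- As an illustrative aside (not part of the original disclosure): the sketch below shows one possible shape for the position updates transmitted by the first and second sensor units, and how the 3D modeling storage and transmission unit could pair an update with the pre-stored 3D modeling before forwarding it to the 3D image glasses. All class and field names, and the simple dictionary of pre-stored models, are assumptions.

```python
# Hedged sketch: pairing incoming sensor position updates with pre-stored 3D models.
from dataclasses import dataclass
from typing import Dict, List, Tuple

Point3 = Tuple[float, float, float]

@dataclass
class PositionUpdate:
    source: str                       # "surgical_object" (first) or "surgical_instrument" (second)
    sensor_positions: List[Point3]    # three or more (x, y, z) sensor readings

@dataclass
class ModelAtPosition:
    model_id: str                     # key of the pre-stored 3D model (implant or instrument)
    sensor_positions: List[Point3]    # where the glasses should place that model

def forward_update(prestored_models: Dict[str, str], update: PositionUpdate) -> ModelAtPosition:
    """Attach the pre-stored model identifier to the latest positions and forward it."""
    return ModelAtPosition(model_id=prestored_models[update.source],
                           sensor_positions=update.sensor_positions)
```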
- the affected part reference point sensor may be attached to the outer surface of the skin or inserted into the human body.
- the 3D image glasses may be an optical-based head mounted display (HMD).
- The 3D image glasses may include: a receiving unit for receiving the 3D modeling of the inside of the body, the 3D modeling of the surgical object, and the 3D modeling of the surgical instrument transmitted from the 3D modeling storage and transmission unit, as well as the three or more pieces of sensor position information transmitted from the reference point sensor unit;
- a storage unit storing the received 3D modeling of the body and the three or more pieces of sensor position information from the sensors mounted on the human body;
- an augmented reality image generation unit for generating an augmented reality image by matching the three or more reference points set in the 3D modeling of the body to the three or more sensor positions stored in the storage unit, and additionally placing the 3D modeling of the surgical object at the first position and the 3D modeling of the surgical instrument at the second position;
- an augmented reality image display unit for displaying the generated augmented reality image superimposed on the actual image of the human body projected through the glasses;
- a camera for photographing an actual image of the human body;
- a transmitter configured to synthesize an augmented reality image displayed on the augmented reality image display unit and an actual image of the human body photographed by the camera, and transmit the synthesized image to a display device.
- The 3D modeling storage and transmission unit transmits the first position information together with the 3D modeling of the surgical object at the corresponding point, and the second position information together with the 3D modeling of the surgical instrument at the corresponding point; after storing them, the 3D image glasses may overlay and display the 3D modeling of the surgical object at the first position and the 3D modeling of the surgical instrument at the second position, based on the three or more sensor positions mounted on the human body.
- The system may further include a display device that receives the actual image of the human body projected onto the 3D image glasses and the 3D modeling displayed on the 3D image glasses, and displays the same image as the 3D image glasses.
- 3D modeling of the inside of the body may be generated using 3D CT (Computed Tomography) data.
- the three or more reference points may be set near the point where the surgical object is inserted in the 3D modeling of the inside of the body generated by the 3D modeling transformation program.
- the surgical object may be an implant.
- According to the present invention, the 3D modeling of the inside of the human body and the 3D modeling of the surgical object and the surgical instrument are superimposed on the 3D image glasses and displayed as augmented reality images, so that surgery is possible while minimizing the incision of the affected area and accurately identifying the locations of important structures such as arteries and nerve lines.
- By matching the three or more reference points set when generating the 3D modeling of the inside of the human body to the respective sensor positions, the 3D modeling of the inside of the body can be accurately superimposed on the actual image of the human body projected through the 3D image glasses.
- The 3D modeling of the surgical objects and surgical instruments inserted into the human body is generated in advance, and, using the position signals transmitted from the surgical object (implant) and the surgical instrument, this 3D modeling can also be displayed overlaid on the image.
- FIG. 1 is a block diagram of a surgical 3D glasses system using augmented reality according to a preferred embodiment of the present invention.
- FIG. 2 is a block diagram of glasses for 3D imaging.
- Figure 3 is an exemplary view of the surgical 3D glasses system using augmented reality of the present invention in use.
- Figure 4a is an exemplary view of the affected area reference point sensor attached to the outer surface of the human body.
- Figure 4b is an exemplary view of the affected area reference point sensor mounted inside the human body.
- FIG. 5 is an exemplary view showing the 3D modeling of the vertebrae inside the human body and the 3D modeling of a surgical instrument displayed as an augmented reality image through the 3D image glasses.
- first and second may be used to describe various components, but the components are not limited by the terms. The terms are used only for the purpose of distinguishing one component from another.
- the term “and / or” includes any combination of a plurality of related items or any of a plurality of related items.
- Referring to FIG. 1, the surgical 3D glasses system 100 using augmented reality according to the present invention may include a first sensor unit 10, a second sensor unit 20, a 3D modeling storage and transmission unit 30, an affected part reference point sensor unit 40, and 3D image glasses 50, and may further include a display device 60.
- the first sensor unit 10 is attached to a surgical object inserted into the human body to transmit the first position information.
- the surgical object refers to an implant used in spinal surgery.
- the second sensor unit 20 is attached to a surgical instrument used to operate the surgical object on the human body and transmits second position information.
- The 3D modeling storage and transmission unit 30 transmits the 3D modeling of the inside of the body, in which three or more reference points are set, to the 3D image glasses 50, and also transmits the 3D modeling of the surgical object at the point corresponding to the first position information and the 3D modeling of the surgical instrument at the point corresponding to the second position information to the 3D image glasses 50 worn by the surgeon.
- The first sensor unit 10 and the second sensor unit 20 may each include three or more sensors. That is, three or more sensors are attached to each of the surgical object and the surgical instrument and periodically transmit position information as the surgical object and the surgical instrument move within the human body, and the 3D modeling storage and transmission unit 30 transmits the 3D modeling of the surgical object and the surgical instrument at the points corresponding to the received sensor position information to the 3D image glasses 50.
- The 3D modeling storage and transmission unit 30 stores in advance the 3D modeling of the surgical object and the surgical instrument to be used.
- That is, 3D modeling of the surgical object and the surgical instrument is generated and stored in a storage unit, and the 3D modeling of the surgical object at the position corresponding to changes in the first position information, and the 3D modeling of the surgical instrument at the position corresponding to changes in the second position information, can be transmitted to the 3D image glasses 50.
- the first location information refers to a location transmitted by three or more sensors attached to a surgical object
- the second location information refers to a location transmitted by three or more sensors attached to a surgical instrument.
- The affected part reference point sensor unit 40 is mounted on the human body at positions corresponding to the three or more reference points set in the 3D modeling of the inside of the body, and transmits the respective sensor position information.
- Referring to Figure 4a, the affected part reference point sensor unit 40 is composed of three sensors attached to the skin surface; referring to Figure 4b, the affected part reference point sensor unit 40 is composed of three sensors inserted and installed inside the human body.
- Although the affected part reference point sensor unit 40 is shown as consisting of three sensors, the present invention is not limited thereto, and three or more sensors may be used.
- Three or more reference points set in 3D modeling of the inside of the human body must be configured to match each sensor position so that 3D modeling of the inside of the human body can be expressed at the correct position.
- the position information transmitted by three or more sensors inserted into the human body may refer to three-dimensional position coordinates of x, y, and z.
- The sensor (affected part reference point sensor unit) mounted on the patient's skin surface or inside the human body to transmit position information is an important feature that differentiates the present invention from the prior art.
- The position-transmitting sensor of the present invention is different from a marker mounted on the outer surface of the patient's body: unlike the prior art, the marker is not photographed with a camera and then extracted from the captured image.
- Rather, the present invention wirelessly receives three or more pieces of position information from the sensors mounted on the patient and, based on them, matches the three or more reference points set in advance in the 3D modeling of the inside of the human body to those positions.
- The key is that the 3D modeling of the inside of the human body is thereby accurately registered to the actual image of the patient projected through the 3D image glasses 50. In this way, the problems of the prior art, namely the delay in which the mixed image cannot follow the user's movement and the calibration problem caused by the camera, can be solved.
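- As an illustrative aside (not part of the original disclosure): the matching of three or more reference points to the received sensor positions could be computed as a rigid transform, for example with the least-squares (Kabsch) method sketched below; the patent only states that the points are matched, so the specific algorithm and the function names are assumptions.

```python
# Hedged sketch: rigid registration of 3+ model reference points to sensor positions.
import numpy as np

def rigid_transform(model_refs: np.ndarray, sensor_pos: np.ndarray):
    """Find rotation R and translation t with R @ model_refs[i] + t ~= sensor_pos[i] (Nx3, N >= 3)."""
    cm, cs = model_refs.mean(axis=0), sensor_pos.mean(axis=0)
    H = (model_refs - cm).T @ (sensor_pos - cs)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cs - R @ cm
    return R, t

def apply_transform(R: np.ndarray, t: np.ndarray, vertices: np.ndarray) -> np.ndarray:
    """Move model vertices into the patient (sensor) coordinate frame."""
    return vertices @ R.T + t
```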
- 3D modeling of the inside of the human body is generated using a separate 3D modeling transformation program with 3D CT (Computed Tomography) data taken before surgery.
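- As an illustrative aside (not part of the original disclosure): one common way such a 3D modeling transformation could turn preoperative CT data into a surface model is isosurface extraction, sketched below with scikit-image. The Hounsfield threshold, voxel spacing, and library choice are assumptions and are not the program actually referenced in the text.

```python
# Hedged sketch: extracting a bone-like surface mesh from a CT volume.
import numpy as np
from skimage import measure

def ct_volume_to_mesh(ct_volume: np.ndarray,
                      spacing=(1.0, 1.0, 1.0),    # voxel size in millimetres (assumed)
                      hu_threshold: float = 300.0):
    """Run marching cubes on a CT volume (Hounsfield units); return vertices, faces, normals."""
    verts, faces, normals, _ = measure.marching_cubes(ct_volume,
                                                      level=hu_threshold,
                                                      spacing=spacing)
    return verts, faces, normals
```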
- The 3D image glasses 50 superimpose and display the 3D modeling of the inside of the body received from the 3D modeling storage and transmission unit 30 on the actual image of the human body projected through the glasses, and additionally display the 3D modeling of the surgical object and the surgical instrument at their corresponding positions. By matching the three or more reference points set in the 3D modeling of the inside of the body to the same number of sensor positions, the 3D modeling of the inside of the body is superimposed, on the basis of the three or more sensors mounted on the human body, and displayed as an augmented reality image.
- Three or more reference points are preferably set near the point where the surgical object is inserted in the 3D modeling of the inside of the body generated by a separate 3D modeling transformation program.
- The 3D image glasses 50 of the present invention preferably use an optical-based HMD through which the wearer can see whatever the eyes are directed at.
- The use of an optical-based HMD is another point that differentiates the present invention from the prior art.
- By using an optical-based HMD, the actual image of the patient viewed by the doctor is projected through the 3D image glasses 50 as it is.
- Not capturing the patient's image with a camera is likewise a point that differs from the conventional inventions.
- Referring to FIG. 2, the 3D image glasses 50 include a receiver 51, a storage unit 52, an augmented reality image generator 53, an augmented reality image display unit 54, a camera 55, and a transmitter 56.
- The receiver 51 receives the 3D modeling of the inside of the body, the 3D modeling of the surgical object, and the 3D modeling of the surgical instrument transmitted from the 3D modeling storage and transmission unit 30, as well as the three or more pieces of sensor position information transmitted from the reference point sensor unit 40.
- The storage unit 52 stores the received 3D modeling of the body and the three or more pieces of position information from the sensors mounted on the human body.
- The augmented reality image generating unit 53 matches the three or more reference points set in the 3D modeling of the body to the three or more sensor positions stored in the storage unit 52, additionally places the 3D modeling of the surgical object at the first position and the 3D modeling of the surgical instrument at the second position, and thereby generates the augmented reality image. Whenever the first position of the surgical object or the second position of the surgical instrument changes, the positions of the surgical object and the surgical instrument displayed in the augmented reality image change accordingly.
- As the gaze direction of the user wearing the 3D image glasses 50 and the distance to the patient change, the augmented reality image generating unit 53 keeps the three or more reference points set in the 3D modeling of the inside of the human body exactly matched to the three or more positions wirelessly received from the affected part reference point sensor unit 40, and generates the image by adjusting the orientation and size (zoom in/out) of the 3D modeling of the inside of the human body accordingly.
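- As an illustrative aside (not part of the original disclosure): the orientation can be kept correct simply by re-solving the reference-point registration each frame against the latest wirelessly received sensor positions, and the size adjustment (zoom in/out) can be derived from the wearer's distance to the reference sensors, as in the sketch below. The reference distance, the simple perspective zoom model, and all names are assumptions.

```python
# Hedged sketch: deriving an overlay zoom factor from the wearer's distance to the
# affected-part reference sensors (closer means a larger, i.e. zoomed-in, overlay).
import numpy as np

def overlay_zoom_factor(head_position, reference_sensor_positions,
                        reference_distance: float = 0.5) -> float:
    """Return >1.0 when the wearer is closer than reference_distance (in metres)."""
    centroid = np.mean(np.asarray(reference_sensor_positions, dtype=float), axis=0)
    distance = float(np.linalg.norm(np.asarray(head_position, dtype=float) - centroid))
    return reference_distance / max(distance, 1e-6)
```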
- the augmented reality image display unit 54 displays the generated augmented reality image superimposed on the actual image of the human body projected through the glasses.
- the camera 55 captures an actual image of the human body.
- The transmitting unit 56 synthesizes the actual image of the human body photographed by the camera 55 with the augmented reality image displayed on the augmented reality image display unit 54, and transmits the synthesized image to the display device 60, so that people who are not wearing the 3D image glasses 50 can also see an image equivalent to the augmented reality image appearing on the 3D image glasses 50.
- The camera 55 is not used to capture the actual image of the human body for displaying the 3D modeling of the inside of the human body on the 3D image glasses 50; it is used only for synthesizing the actual image of the human body with the augmented reality image and transmitting the result to the display device 60. As a result, the image taken by the camera 55 is not used at all during surgery by the doctor using the 3D image glasses 50.
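- As an illustrative aside (not part of the original disclosure): the synthesis of the camera image with the augmented reality image for the display device 60 could be a simple alpha blend, as sketched below; the RGBA overlay format and the function names are assumptions, since the patent only states that the two images are synthesized and transmitted.

```python
# Hedged sketch: compositing the camera frame with a rendered AR layer for the
# external display device; the glasses wearer never uses this composite image.
import numpy as np

def compose_for_external_display(camera_frame: np.ndarray,
                                 ar_overlay_rgba: np.ndarray) -> np.ndarray:
    """camera_frame: HxWx3 uint8; ar_overlay_rgba: HxWx4 uint8 rendered AR layer."""
    alpha = ar_overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (alpha * ar_overlay_rgba[..., :3].astype(np.float32)
               + (1.0 - alpha) * camera_frame.astype(np.float32))
    return blended.astype(np.uint8)   # frame sent on to the display device
```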
- The display device 60 receives, from the 3D image glasses 50, the actual image of the human body projected onto the 3D image glasses 50 together with the 3D modeling displayed on them, and displays the same image as the 3D image glasses 50.
- Referring to FIG. 5, the 3D modeling of the vertebral bone V inside the human body and the 3D modeling of a surgical instrument inserted into the human body are represented as an augmented reality image through the augmented reality image display unit 54 of the 3D image glasses 50.
- Referring to FIG. 3, position information is wirelessly transmitted from the sensor units 10 and 20 attached to the surgical object and the surgical instrument to the 3D modeling storage and transmission unit 30 (1).
- the 3D modeling storage and transmission unit 30 may be built using a computer or a medical server.
- the 3D modeling storage and transmission unit 30 wirelessly transmits 3D modeling of the inside of the body in which three or more reference points are set, 3D modeling of a surgical object and a surgical instrument corresponding to the received position, to the 3D image glasses 50.
- the affected part reference point sensor unit 40 mounted inside the human body or attached to the outer surface of the skin wirelessly transmits three or more sensor position information to the 3D image glasses 50 (3).
- the augmented reality image displayed on the 3D image glasses 50 may also be transmitted to the display apparatus 60 (4).
- The present invention is not a method in which the doctor operates while watching a monitor showing an image taken by a separate camera, nor is it a method in which the camera in the 3D image glasses 50 worn by the doctor films the patient. That is, the present invention uses a method in which the patient's figure is projected as it is through the lenses of the 3D image glasses 50, and the augmented reality image is displayed by superimposing the 3D modeling of the inside of the body onto the patient's actual figure as it is being projected.
- The 3D modeling storage and transmission unit 30 transmits the first position information together with the 3D modeling of the surgical object at the corresponding point, and the second position information together with the 3D modeling of the surgical instrument at the corresponding point, to the 3D image glasses 50.
- The 3D image glasses 50 store, in the storage unit 52, the first position information with the 3D modeling of the surgical object at the corresponding point and the second position information with the 3D modeling of the surgical instrument at the corresponding point.
- The 3D modeling of the surgical object may then be overlaid and displayed at the first position, and the 3D modeling of the surgical instrument at the second position, based on the positions of the three or more sensors 40 mounted on the human body.
- That is, when the 3D modeling and position of the surgical object and the 3D modeling and position of the surgical instrument are received from the 3D modeling storage and transmission unit 30, the 3D modeling of the surgical object and the surgical instrument is displayed overlaid on the basis of the positions of the three or more sensors mounted on the human body, that is, the affected part reference point sensor unit 40.
- The 3D modeling of the inside of the human body is displayed by matching the three or more reference points, set when the 3D modeling was generated, to those sensor positions, so that the 3D modeling of the inside of the body can be accurately superimposed on the actual image of the human body projected through the 3D image glasses.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Robotics (AREA)
- Processing Or Creating Images (AREA)
Abstract
According to the present invention, in order to solve the delay phenomenon that can occur while mixing a real-world image and a virtual image, which arises because the real-world image is acquired by a camera when augmented reality is implemented with a conventional video-based head mounted display (HMD), and in order to solve the calibration problem caused by the camera, an optical-based HMD is used through which the actual image of the patient is projected without distortion, and a wireless communication method is used to receive location information from at least three sensors mounted on the surface of or inside the human body, rather than a method of extracting the location, size, etc. of a marker attached to the patient's body from image information captured by a camera. The result is a 3D glasses system for a surgical operation that expresses an augmented reality image by precisely superimposing the 3D modeling of the inside of the human body on the actual image of the patient projected through the 3D glasses lens without distortion, using the location information received from the sensors mounted on the surface of or inside the patient's body.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2015-0079598 | 2015-06-05 | ||
| KR1020150079598A KR101647467B1 (ko) | 2015-06-05 | 2015-06-05 | 증강현실을 이용한 외과 수술용 3d 안경 시스템 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016195401A1 true WO2016195401A1 (fr) | 2016-12-08 |
Family
ID=56714292
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2016/005868 Ceased WO2016195401A1 (fr) | 2015-06-05 | 2016-06-02 | Système de lunettes 3d pour opération chirurgicale au moyen de la réalité augmentée |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR101647467B1 (fr) |
| WO (1) | WO2016195401A1 (fr) |
Cited By (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9861446B2 (en) | 2016-03-12 | 2018-01-09 | Philipp K. Lang | Devices and methods for surgery |
| WO2018134140A1 (fr) * | 2017-01-17 | 2018-07-26 | Koninklijke Philips N.V. | Système d'intervention à réalité augmentée fournissant des superpositions contextuelles |
| US10194131B2 (en) | 2014-12-30 | 2019-01-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
| WO2019164093A1 (fr) * | 2018-02-23 | 2019-08-29 | 서울대학교산학협력단 | Procédé d'amélioration des performances de mise en correspondance de données de tomodensitométrie et de données optiques et dispositif associé |
| CN110708530A (zh) * | 2019-09-11 | 2020-01-17 | 青岛小鸟看看科技有限公司 | 一种使用增强现实设备透视封闭空间的方法和系统 |
| CN111465912A (zh) * | 2017-10-11 | 2020-07-28 | 凯菲森有限公司 | 具有运动检测功能的增强现实眼镜 |
| US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
| US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
| US11751944B2 (en) | 2017-01-16 | 2023-09-12 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
| US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
| US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
| US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
| US12053247B1 (en) | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
| US12211151B1 (en) | 2019-07-30 | 2025-01-28 | Onpoint Medical, Inc. | Systems for optimizing augmented reality displays for surgical procedures |
| US12433761B1 (en) | 2022-01-20 | 2025-10-07 | Onpoint Medical, Inc. | Systems and methods for determining the shape of spinal rods and spinal interbody devices for use with augmented reality displays, navigation systems and robots in minimally invasive spine procedures |
| US12453600B2 (en) | 2013-09-18 | 2025-10-28 | iMIRGE Medical INC. | Anatomical scanning, targeting, and visualization |
| US12488480B2 (en) | 2024-08-12 | 2025-12-02 | Philipp K. Lang | Augmented reality guidance for surgical procedures |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101844175B1 (ko) | 2017-01-04 | 2018-03-30 | 건양대학교산학협력단 | 증강현실을 이용한 운동치료 보조시스템 |
| KR102024988B1 (ko) * | 2017-08-11 | 2019-09-24 | 서울대학교병원 | 치아교정용 가이드 시스템 및 이를 이용한 치아교정 가이드 방법 |
| KR102133219B1 (ko) * | 2017-10-11 | 2020-07-14 | 주식회사 카이비전 | 움직임감지 증강현실 글라스 |
| KR102056930B1 (ko) * | 2017-11-21 | 2019-12-17 | 경희대학교 산학협력단 | 증강현실 기술을 이용한 척추 수술 네비게이션 시스템 및 방법 |
| KR101898088B1 (ko) * | 2017-12-27 | 2018-09-12 | 주식회사 버넥트 | 객체 추적기반의 프레임 영역 녹화 및 재생기술이 적용된 증강현실 시스템 |
| KR102186551B1 (ko) * | 2018-12-05 | 2020-12-03 | 주식회사 피치랩 | 증강현실장치를 이용하여 비침습뇌자극용 자극기를 위치시키는 방법 |
| KR102313319B1 (ko) * | 2019-05-16 | 2021-10-15 | 서울대학교병원 | 증강현실 대장 내시경 시스템 및 이를 이용한 모니터링 방법 |
| KR102175066B1 (ko) * | 2019-10-15 | 2020-11-05 | 주식회사 메디씽큐 | 머리에 착용하는 의료용 3d 디스플레이 장치 및 이를 이용한 3d 디스플레이 방법 |
| KR102362149B1 (ko) * | 2019-12-06 | 2022-02-10 | 서울대학교산학협력단 | 임플란트 수술을 위한 증강현실 도구 및 임플란트 수술정보 가시화 방법 |
| KR102458276B1 (ko) * | 2020-06-18 | 2022-10-25 | 주식회사 디엠에프 | 3차원 안면 스캔데이터 및 ar 글래스를 이용한 실시간 가시화 서비스 제공 방법 |
| KR102460821B1 (ko) | 2020-10-28 | 2022-10-28 | 재단법인대구경북과학기술원 | 증강 현실 장치 및 증강 현실 장치의 동작 방법 |
| KR102539312B1 (ko) * | 2021-03-25 | 2023-06-01 | 한상범 | 모발 이식용 모발 추출기 |
| WO2024089564A1 (fr) * | 2022-10-28 | 2024-05-02 | Covidien Lp | Chirurgie robotique guidée par capteur |
-
2015
- 2015-06-05 KR KR1020150079598A patent/KR101647467B1/ko active Active
-
2016
- 2016-06-02 WO PCT/KR2016/005868 patent/WO2016195401A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20080027256A (ko) * | 2005-05-16 | 2008-03-26 | 인튜어티브 서지컬 인코포레이티드 | 최소침습 로봇수술 동안 센서 및/또는 카메라로부터 도출된데이터와의 융합에 의한 3차원 툴 추적을 수행하기 위한방법 및 시스템 |
| KR20090093877A (ko) * | 2008-02-29 | 2009-09-02 | 바이오센스 웹스터 인코포레이티드 | 가상 터치 스크린을 갖는 위치추적 시스템 |
| KR20110036453A (ko) * | 2009-10-01 | 2011-04-07 | 주식회사 이턴 | 수술용 영상 처리 장치 및 그 방법 |
| KR20130135476A (ko) * | 2012-06-01 | 2013-12-11 | 의료법인 우리들의료재단 | 수술 유도영상 시스템 및 그 방법 |
| KR20140112207A (ko) * | 2013-03-13 | 2014-09-23 | 삼성전자주식회사 | 증강현실 영상 표시 시스템 및 이를 포함하는 수술 로봇 시스템 |
Cited By (61)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12453600B2 (en) | 2013-09-18 | 2025-10-28 | iMIRGE Medical INC. | Anatomical scanning, targeting, and visualization |
| US11350072B1 (en) | 2014-12-30 | 2022-05-31 | Onpoint Medical, Inc. | Augmented reality guidance for bone removal and osteotomies in spinal surgery including deformity correction |
| US10742949B2 (en) | 2014-12-30 | 2020-08-11 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and tracking of instruments and devices |
| US12063338B2 (en) | 2014-12-30 | 2024-08-13 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic displays and magnified views |
| US10194131B2 (en) | 2014-12-30 | 2019-01-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
| US12010285B2 (en) | 2014-12-30 | 2024-06-11 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic displays |
| US11652971B2 (en) | 2014-12-30 | 2023-05-16 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
| US10326975B2 (en) | 2014-12-30 | 2019-06-18 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
| US11483532B2 (en) | 2014-12-30 | 2022-10-25 | Onpoint Medical, Inc. | Augmented reality guidance system for spinal surgery using inertial measurement units |
| US11272151B2 (en) | 2014-12-30 | 2022-03-08 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with display of structures at risk for lesion or damage by penetrating instruments or devices |
| US11750788B1 (en) | 2014-12-30 | 2023-09-05 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic display of images and tracked instruments |
| US11153549B2 (en) | 2014-12-30 | 2021-10-19 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery |
| US10511822B2 (en) | 2014-12-30 | 2019-12-17 | Onpoint Medical, Inc. | Augmented reality visualization and guidance for spinal procedures |
| US11050990B2 (en) | 2014-12-30 | 2021-06-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with cameras and 3D scanners |
| US10594998B1 (en) | 2014-12-30 | 2020-03-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and surface representations |
| US10602114B2 (en) | 2014-12-30 | 2020-03-24 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures using stereoscopic optical see-through head mounted displays and inertial measurement units |
| US10951872B2 (en) | 2014-12-30 | 2021-03-16 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with real time visualization of tracked instruments |
| US10841556B2 (en) | 2014-12-30 | 2020-11-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with display of virtual surgical guides |
| US10278777B1 (en) | 2016-03-12 | 2019-05-07 | Philipp K. Lang | Augmented reality visualization for guiding bone cuts including robotics |
| US12127795B2 (en) | 2016-03-12 | 2024-10-29 | Philipp K. Lang | Augmented reality display for spinal rod shaping and placement |
| US10743939B1 (en) | 2016-03-12 | 2020-08-18 | Philipp K. Lang | Systems for augmented reality visualization for bone cuts and bone resections including robotics |
| US10799296B2 (en) | 2016-03-12 | 2020-10-13 | Philipp K. Lang | Augmented reality system configured for coordinate correction or re-registration responsive to spinal movement for spinal procedures, including intraoperative imaging, CT scan or robotics |
| US12472000B2 (en) | 2016-03-12 | 2025-11-18 | Philipp K. Lang | Augmented reality system for monitoring size and laterality of physical implants during surgery and for billing and invoicing |
| US10849693B2 (en) | 2016-03-12 | 2020-12-01 | Philipp K. Lang | Systems for augmented reality guidance for bone resections including robotics |
| US10603113B2 (en) | 2016-03-12 | 2020-03-31 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
| US11013560B2 (en) | 2016-03-12 | 2021-05-25 | Philipp K. Lang | Systems for augmented reality guidance for pinning, drilling, reaming, milling, bone cuts or bone resections including robotics |
| US9980780B2 (en) | 2016-03-12 | 2018-05-29 | Philipp K. Lang | Guidance for surgical procedures |
| US10405927B1 (en) | 2016-03-12 | 2019-09-10 | Philipp K. Lang | Augmented reality visualization for guiding physical surgical tools and instruments including robotics |
| US11172990B2 (en) | 2016-03-12 | 2021-11-16 | Philipp K. Lang | Systems for augmented reality guidance for aligning physical tools and instruments for arthroplasty component placement, including robotics |
| US11311341B2 (en) | 2016-03-12 | 2022-04-26 | Philipp K. Lang | Augmented reality guided fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
| US10159530B2 (en) | 2016-03-12 | 2018-12-25 | Philipp K. Lang | Guidance for surgical interventions |
| US9861446B2 (en) | 2016-03-12 | 2018-01-09 | Philipp K. Lang | Devices and methods for surgery |
| US11957420B2 (en) | 2016-03-12 | 2024-04-16 | Philipp K. Lang | Augmented reality display for spinal rod placement related applications |
| US11452568B2 (en) | 2016-03-12 | 2022-09-27 | Philipp K. Lang | Augmented reality display for fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
| US10368947B2 (en) | 2016-03-12 | 2019-08-06 | Philipp K. Lang | Augmented reality guidance systems for superimposing virtual implant components onto the physical joint of a patient |
| US11850003B2 (en) | 2016-03-12 | 2023-12-26 | Philipp K Lang | Augmented reality system for monitoring size and laterality of physical implants during surgery and for billing and invoicing |
| US10292768B2 (en) | 2016-03-12 | 2019-05-21 | Philipp K. Lang | Augmented reality guidance for articular procedures |
| US11602395B2 (en) | 2016-03-12 | 2023-03-14 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
| US11751944B2 (en) | 2017-01-16 | 2023-09-12 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
| WO2018134140A1 (fr) * | 2017-01-17 | 2018-07-26 | Koninklijke Philips N.V. | Système d'intervention à réalité augmentée fournissant des superpositions contextuelles |
| US12169881B2 (en) | 2017-01-17 | 2024-12-17 | Koninklijke Philips N.V. | Augmented reality interventional system providing contextual overylays |
| US11551380B2 (en) | 2017-01-17 | 2023-01-10 | Koninklijke Philips N.V. | Augmented reality interventional system providing contextual overlays |
| US12290414B2 (en) | 2017-09-11 | 2025-05-06 | Philipp K. Lang | Augmented reality guidance for vascular procedures |
| US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
| CN111465912A (zh) * | 2017-10-11 | 2020-07-28 | 凯菲森有限公司 | 具有运动检测功能的增强现实眼镜 |
| US12086998B2 (en) | 2018-01-29 | 2024-09-10 | Philipp K. Lang | Augmented reality guidance for surgical procedures |
| US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
| US11727581B2 (en) | 2018-01-29 | 2023-08-15 | Philipp K. Lang | Augmented reality guidance for dental procedures |
| WO2019164093A1 (fr) * | 2018-02-23 | 2019-08-29 | 서울대학교산학협력단 | Procédé d'amélioration des performances de mise en correspondance de données de tomodensitométrie et de données optiques et dispositif associé |
| KR102099415B1 (ko) | 2018-02-23 | 2020-04-09 | 서울대학교산학협력단 | Ct 데이터와 광학 데이터의 정합성능 향상 방법 및 그 장치 |
| KR20190101694A (ko) * | 2018-02-23 | 2019-09-02 | 서울대학교산학협력단 | Ct 데이터와 광학 데이터의 정합성능 향상 방법 및 그 장치 |
| US12161428B1 (en) | 2019-02-14 | 2024-12-10 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures including interpolation of vertebral position and orientation |
| US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
| US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
| US12364570B1 (en) | 2019-02-14 | 2025-07-22 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
| US12211151B1 (en) | 2019-07-30 | 2025-01-28 | Onpoint Medical, Inc. | Systems for optimizing augmented reality displays for surgical procedures |
| CN110708530A (zh) * | 2019-09-11 | 2020-01-17 | 青岛小鸟看看科技有限公司 | 一种使用增强现实设备透视封闭空间的方法和系统 |
| US12053247B1 (en) | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
| US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
| US12433761B1 (en) | 2022-01-20 | 2025-10-07 | Onpoint Medical, Inc. | Systems and methods for determining the shape of spinal rods and spinal interbody devices for use with augmented reality displays, navigation systems and robots in minimally invasive spine procedures |
| US12488480B2 (en) | 2024-08-12 | 2025-12-02 | Philipp K. Lang | Augmented reality guidance for surgical procedures |
Also Published As
| Publication number | Publication date |
|---|---|
| KR101647467B1 (ko) | 2016-08-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2016195401A1 (fr) | Système de lunettes 3d pour opération chirurgicale au moyen de la réalité augmentée | |
| US8504136B1 (en) | See-through abdomen display for minimally invasive surgery | |
| US11275249B2 (en) | Augmented visualization during surgery | |
| Hu et al. | Head-mounted augmented reality platform for markerless orthopaedic navigation | |
| Chen et al. | Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display | |
| US7774044B2 (en) | System and method for augmented reality navigation in a medical intervention procedure | |
| CN107374729B (zh) | 基于ar技术的手术导航系统及方法 | |
| WO2016144005A1 (fr) | Système de projection d'image de réalité augmentée | |
| Wisotzky et al. | Interactive and multimodal-based augmented reality for remote assistance using a digital surgical microscope | |
| CN103238339A (zh) | 查看和跟踪立体视频图像的系统和方法 | |
| CN101904770A (zh) | 一种基于光学增强现实技术的手术导航系统及方法 | |
| US20240293096A1 (en) | Patient viewing system | |
| WO2022035110A1 (fr) | Terminal utilisateur pour fournir une image médicale à réalité augmentée et procédé pour fournir une image médicale à réalité augmentée | |
| CN109730771A (zh) | 一种基于ar技术的手术导航系统 | |
| WO2021045546A2 (fr) | Dispositif de guidage de position de robot, procédé associé et système le comprenant | |
| KR101667152B1 (ko) | 수술 지원 영상을 제공하는 스마트 글라스 시스템 및 스마트 글라스를 이용한 수술 지원 영상 제공 방법 | |
| EP1705513A1 (fr) | Systeme de vision stereoscopique d'images en temps reel ou statiques | |
| WO2015186930A1 (fr) | Appareil de transmission interactive en temps réel d'une image et d'informations médicales et d'assistance à distance | |
| Jiang et al. | User's image perception improved strategy and application of augmented reality systems in smart medical care: A review | |
| Harders et al. | Multimodal augmented reality in medicine | |
| CN209358681U (zh) | 一种应用于手术室内的裸眼3d人眼追踪设备 | |
| EP3595299A1 (fr) | Dispositif de commande d'affichage d'image médicale, dispositif d'affichage d'image médicale, système de traitement d'informations médicales et procédé de commande d'affichage d'image médicale | |
| CN111193830B (zh) | 一种基于智能手机的便携式增强现实医学图像观察辅助设备 | |
| Sun et al. | Virtually transparent epidermal imagery for laparo-endoscopic single-site surgery | |
| CN209358682U (zh) | 一种用于外科手术的3d/2d可自由切换的显示设备 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16803756; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16803756; Country of ref document: EP; Kind code of ref document: A1 |