
WO2020087141A1 - Augmented reality guidance system - Google Patents

Info

Publication number
WO2020087141A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
marker
patient
viewing
guide
Prior art date
Legal status
Ceased
Application number
PCT/BR2019/000038
Other languages
English (en)
Portuguese (pt)
Inventor
João Alfredo BORGES
Elias Cantarelli HOFFMANN
Current Assignee
Prototipos Industria E Comercio De Produtos Prototipos Ltda - Me
Original Assignee
Prototipos Industria E Comercio De Produtos Prototipos Ltda - Me
Priority date: 2018-10-31
Filing date: 2019-10-30
Publication date: 2020-05-07
Application filed by Prototipos Industria E Comercio De Produtos Prototipos Ltda - Me
Publication of WO2020087141A1
Anticipated expiration
Legal status: Ceased

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/111 Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking

Definitions

  • The present invention relates to a guiding system to be used as a support tool when planning or performing a medical procedure on a patient, for example, a surgical intervention.
  • Imaging examination techniques, including computed tomography, magnetic resonance imaging and ultrasonography, have been widely used for the purpose of medical diagnosis.
  • Surgical procedures have evolved with the adoption of minimally invasive techniques, robotically assisted surgery and endoscopy, among others.
  • However, these conventional techniques and endoscopic tools tend to limit the surgeon's view, especially with respect to anatomical structures outside the endoscope's field of view.
  • Imaging exams previously performed on the patient have also been used to provide assistance to the surgeon during the performance of a surgical intervention.
  • The purpose of the present invention is to provide a guide system to be used as a support tool during the planning or performance of a medical procedure, for example, a surgical intervention, which overcomes the limitations of the state of the art.
  • The present invention proposes an augmented reality guide system comprising a computing unit configured to generate a virtual model from a patient's image data, obtained through at least one imaging examination previously performed on the patient, and to process the virtual model in order to define a virtual region of interest as a viewing region.
  • The system also comprises a camera configured to acquire a live image of the patient, a processing unit configured to receive the live image of the patient and generate a video signal, and a screen configured to receive the video signal.
  • The system also comprises means for correlating the viewing region of the virtual model with a real region of interest of the patient and representing the viewing region in correlation with the patient's live image, with the viewing region superimposed on the patient's real region of interest.
  • In this way, a user sees on the screen the viewing region correlated with the patient's live image, with the viewing region superimposed on the patient's real region of interest. If the patient moves, for example, the viewing region moves accordingly, since the position and orientation of the viewing region are linked to the patient's live image.
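  • By way of illustration only, this capture-process-display pipeline can be sketched in a few lines of Python with OpenCV. This is a minimal sketch under assumed conventions, not the patent's implementation; render_viewing_region is a hypothetical placeholder for the augmented reality rendering step described in the embodiments below.

      import cv2

      def render_viewing_region(frame):
          """Hypothetical stand-in for the AR step: draws a placeholder
          'viewing region' where the real system would render the virtual model."""
          overlay = frame.copy()
          h, w = frame.shape[:2]
          cv2.ellipse(overlay, (w // 2, h // 2), (80, 50), 0, 0, 360, (0, 0, 255), -1)
          return overlay

      cap = cv2.VideoCapture(0)                   # camera acquiring the live image
      while cap.isOpened():
          ok, frame = cap.read()                  # live image of the patient
          if not ok:
              break
          # Blend the rendered region over the live frame (transparent overlay).
          blended = cv2.addWeighted(render_viewing_region(frame), 0.4, frame, 0.6, 0)
          cv2.imshow("screen", blended)           # video signal shown on the screen
          if cv2.waitKey(1) == 27:                # Esc to quit
              break
      cap.release()
      cv2.destroyAllWindows()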
  • The viewing region can be configured with an appropriate level of transparency, so that its external contour remains visible at the same time as the parts of the patient that would otherwise be hidden by the superimposed viewing region.
  • This system can be used as a support tool during a surgical intervention, allowing the user, for example a surgeon, to see parts of the patient represented by the viewing region that would be hidden from their normal field of view, for example, the patient's organs covered by the patient's skin. Based on this information, the user is better able to evaluate their actions during surgery. For example, based on the visualization of an internal organ represented by the viewing region, the user can evaluate the most appropriate place to make an incision in the patient in order to gain access to that organ.
  • Figure 1 shows a flowchart of the guide system with augmented reality according to a first embodiment of the invention.
  • Figure 2 presents a perspective view of a virtual model.
  • Figure 3 shows a view of a guide for printing.
  • Figure 4 shows a view of a region for viewing.
  • Figure 5 shows a view of a patient wearing a printed guide.
  • Figure 6 shows a view corresponding to the image visible on the screen when using the system according to the first embodiment of the invention.
  • Figure 7 shows a view of a variation of the printed guide.
  • Figure 8 shows a flowchart of the guide system with augmented reality according to a second embodiment of the invention.
  • Figure 9 shows a view of a guide printed with a marker corresponding to a mapped region, according to the second embodiment of the invention.
  • Figure 10 shows a flowchart of the guide system with augmented reality according to a third embodiment of the invention.
  • Figure 11 shows a flowchart of the guide system with augmented reality according to a fourth embodiment of the invention.
  • Figure 12 shows a view of a patient with a contrasting region.
  • Figure 13 presents a perspective view of a virtual model.
  • Figure 14 shows a view of a region for viewing.
  • Figure 15 shows a view corresponding to the image visible on the screen when using the system according to the fourth embodiment of the invention.
  • Figure 16 shows a flowchart of the guide system with augmented reality according to a fifth embodiment of the invention.
  • The augmented reality guide system comprises a computing unit (6) configured to generate a virtual model (10) from image data (4) of a patient obtained by means of at least one imaging examination previously performed on the patient.
  • The imaging examination can be computed tomography, positron emission tomography, single photon emission computed tomography, magnetic resonance imaging, optical scanning with a three-dimensional scanner and/or ultrasound.
  • The image data (4), usually in DICOM format, are imported into the computing unit (6) and processed, with the aid of a CAD computer program, to generate the three-dimensional virtual model (10).
  • The CAD computer program can be a program developed specifically for this purpose, for example, with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • One computer program capable of performing this processing and reconstruction of the three-dimensional virtual model (10) is OsiriX®.
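  • As a concrete illustration of this reconstruction step, the sketch below uses the VTK library in Python to build a surface model from a folder of DICOM slices via marching cubes. The folder name, iso-value and output file are illustrative assumptions, not values taken from the patent.

      import vtk

      # Read a series of DICOM slices from a folder (path is illustrative).
      reader = vtk.vtkDICOMImageReader()
      reader.SetDirectoryName("patient_ct/")
      reader.Update()

      # Extract an iso-surface, e.g. bone from a CT series; the iso-value
      # (in Hounsfield units) is an assumption and depends on the exam.
      surface = vtk.vtkMarchingCubes()
      surface.SetInputConnection(reader.GetOutputPort())
      surface.SetValue(0, 400)

      # Smooth the mesh and export the virtual model for the CAD stage.
      smoother = vtk.vtkWindowedSincPolyDataFilter()
      smoother.SetInputConnection(surface.GetOutputPort())
      smoother.SetNumberOfIterations(20)

      writer = vtk.vtkSTLWriter()
      writer.SetFileName("virtual_model.stl")
      writer.SetInputConnection(smoother.GetOutputPort())
      writer.Write()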
  • The computing unit (6) is further configured to process the virtual model (10) in order to define a virtual region of interest as a viewing region (30).
  • This processing is performed with the aid of a CAD computer program developed for this purpose.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • The augmented reality guide system further comprises a camera (50) configured to acquire a live image of the patient, a processing unit (60) configured to receive the live image of the patient and generate a video signal, and a screen (70) configured to receive the video signal.
  • The system also comprises means for correlating the viewing region (30) of the virtual model (10) with a real region of interest of the patient and representing the viewing region (30) in correlation with the patient's live image, with the viewing region (30) superimposed on the patient's real region of interest.
  • Figure 1 illustrates a flowchart of the guide system with augmented reality according to a first embodiment of the invention.
  • The correlation means comprise a guide for printing (20) designed to be fixable, preferably by fitting, to an anatomical structure of the virtual model (10), the guide for printing (20) being designed with a marker (22), according to a system of spatial coordinates that associates the viewing region (30) with the marker (22).
  • The guide for printing (20) is designed in the CAD environment, and the position of the marker (22) defines an origin for the spatial coordinate system.
  • The image data of the viewing region (30) are associated with the marker (22) with the aid of the CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the viewing region (30) thus has spatial coordinates related to the spatial coordinates of the marker (22).
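  • This association can be pictured as storing every point of the viewing region (30) in the marker's reference frame, so that one detected marker pose places all points in space. The NumPy sketch below illustrates the idea; the example pose and point coordinates are assumptions for illustration only.

      import numpy as np

      def to_camera_frame(points_marker, T_cam_marker):
          """Map Nx3 points given in marker coordinates into camera coordinates
          using a 4x4 homogeneous transform (the marker pose in the camera frame)."""
          homog = np.hstack([points_marker, np.ones((len(points_marker), 1))])
          return (T_cam_marker @ homog.T).T[:, :3]

      # Example: marker detected 30 cm in front of the camera, without rotation.
      T = np.eye(4)
      T[2, 3] = 0.30
      region_points = np.array([[0.01, 0.02, 0.000],
                                [0.03, -0.01, 0.005]])   # metres, illustrative
      print(to_camera_frame(region_points, T))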
  • The correlation means also comprise a 3D printer (35) configured to print the guide for printing (20), generating a printed guide (40) with a marker (42), the printed guide (40) being fixed, preferably by fitting, to an anatomical structure of the patient corresponding to the anatomical structure of the virtual model (10) that served as the basis for the design of the guide for printing (20).
  • The correlation means further comprise the processing unit (60), configured to detect the marker (42) present on the printed guide (40), identify the viewing region (30) associated with the marker (42) and represent the viewing region (30) in correlation with the patient's live image, according to the position of the marker (42).
  • Processing can be performed in an augmented reality program developed from an augmented reality software development kit (SDK), such as ARToolKit® or Vuforia™.
  • The marker (22, 42) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
  • The marker (42) present on the printed guide (40) must be positioned at least once within the field of view (CV) of the camera (50) to allow detection of the marker (42) and the establishment of a correlation between the live image and the viewing region (30).
  • An augmented reality marker can be of the square marker type, which has a background, usually white, a square border, usually black, and an image forming a pattern positioned inside the square, as is the case with the marker (42) illustrated in Figure 5.
  • ARToolKit makes use of a square marker to determine which mathematical transformation should be applied to the viewing region (30) in order to represent it in correlation with the patient's live image. This is possible because such a transformation can be defined from only four coplanar, non-collinear points, which correspond to the vertices of the square marker detected when processing the live image received from the camera (50).
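  • The pose computation attributed above to ARToolKit can be illustrated with OpenCV's solvePnP, an analogous and publicly documented routine for recovering a pose from the four corners of a square marker. Marker size, corner pixels and camera intrinsics below are illustrative assumptions.

      import cv2
      import numpy as np

      side = 0.05  # marker side length in metres (assumed)
      # Corners of the square marker in its own frame, in the order expected by
      # SOLVEPNP_IPPE_SQUARE (top-left, top-right, bottom-right, bottom-left).
      object_pts = np.array([[-side / 2,  side / 2, 0],
                             [ side / 2,  side / 2, 0],
                             [ side / 2, -side / 2, 0],
                             [-side / 2, -side / 2, 0]], dtype=np.float32)
      # Corner positions detected in the live image (pixels, illustrative).
      image_pts = np.array([[310, 210], [410, 215],
                            [405, 318], [305, 312]], dtype=np.float32)
      K = np.array([[800, 0, 320],
                    [0, 800, 240],
                    [0,   0,   1]], dtype=np.float32)  # camera intrinsics (assumed)

      ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None,
                                    flags=cv2.SOLVEPNP_IPPE_SQUARE)
      # rvec/tvec define the transformation applied to the viewing region (30)
      # before it is drawn over the live image.
      print(ok, rvec.ravel(), tvec.ravel())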
  • Figures 2 to 6 illustrate the guide system with augmented reality, according to the first embodiment of the invention, applied to assist dental implant surgery.
  • Figure 2 represents the virtual model (10) generated from image data (4) of a patient.
  • The virtual model (10) consists of a mandible (a) with teeth (b) and alveolar nerves (c).
  • The virtual model (10) was processed in the CAD computer program in order to include dental implants (d) in edentulous spaces of the mandible (a), according to a spatial configuration to be reproduced later during the dental implant surgery.
  • Figure 3 illustrates a guide for printing (20) designed in the CAD environment to be fitted over the mandible (a) of the virtual model (10) and designed with a marker (22).
  • Figure 4 illustrates a virtual region of interest of the virtual model (10), corresponding to the teeth (b), alveolar nerves (c) and dental implants (d), which was defined as the viewing region (30).
  • The position of the marker (22) defines an origin for the spatial coordinate system, and the image data of the viewing region (30) are associated with the marker (22), so that each point in the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (22).
  • Figure 5 illustrates the printed guide (40) and the marker (42), with the printed guide (40) positioned over the patient's mandible (a) and teeth (b), for example, at an early stage of the dental implant surgery.
  • Figure 6 illustrates an image corresponding to what is visible on the screen (70) to a user when using the system.
  • The viewing region (30), corresponding to the teeth (b), alveolar nerves (c) and dental implants (d), is correlated with the live image, which includes the printed guide (40), the mandible (a) and the part of the teeth (b) not covered by the printed guide (40).
  • If the printed guide (40) moves, the viewing region (30) moves accordingly, since the position and orientation of the viewing region (30) are linked to the marker (42).
  • The user can use this system during surgery to assess the most accurate position in the edentulous spaces for drilling the sites that will receive the dental implants, according to the position of the dental implants (d) in the viewing region (30) visible on the screen (70).
  • The augmented reality marker can be printed together with the guide for printing (20), generating the printed guide (40) with the marker (42), as shown in Figure 5.
  • A color 3D printer (35), capable of printing the augmented reality marker, can be used.
  • Alternatively, the augmented reality marker is attached to the printed guide (40) in a position defined when designing the guide for printing (20).
  • In this case, the guide for printing (20) can be configured with a seating space, such as a flat surface, which will later receive the augmented reality marker.
  • Figure 7 illustrates a printed guide (40) produced with a seating space (44).
  • The augmented reality marker can be made in the form of an adhesive, which is glued onto the seating space (44) present on the printed guide (40).
  • Figure 8 illustrates a flowchart of the guide system with augmented reality according to a second embodiment of the invention.
  • The second embodiment of the invention is identical to the first, except that, according to the second embodiment, the marker (22) corresponds to a mapped region defined during the design of the guide for printing (20), the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the printed guide (40), whose marker (42) corresponds to the mapped region.
  • The marker (22) corresponding to the mapped region of the guide for printing (20) is defined during the processing of the guide for printing in the CAD environment.
  • Figure 9 illustrates a printed guide (40) with the marker (42) corresponding to the mapped region, which is represented in grayscale in the figure merely to aid understanding of the invention.
  • The marker (42) corresponding to the mapped region of the printed guide (40) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the printed guide (40).
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (42) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (42) and represents the viewing region (30) in correlation with the live image, according to the position of the marker (42).
  • Processing can be performed in a program developed from a software development kit (SDK), such as the Bridge Engine Framework by Occipital.
  • The SDK establishes the correlation by finding the transformation matrix that superimposes the points of the viewing region (30) onto the points of the mapped region.
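  • The registration step such an SDK performs can be illustrated with the Kabsch algorithm, which finds the least-squares rigid transformation taking one point set onto another. This generic NumPy sketch assumes point correspondences are already known (a real pipeline would obtain them from the scan, e.g. via ICP); it is not Occipital's actual implementation.

      import numpy as np

      def rigid_transform(src, dst):
          """Kabsch: least-squares rotation + translation taking Nx3 src onto dst."""
          c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
          H = (src - c_src).T @ (dst - c_dst)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:        # guard against a reflection
              Vt[-1] *= -1
              R = Vt.T @ U.T
          T = np.eye(4)
          T[:3, :3], T[:3, 3] = R, c_dst - R @ c_src
          return T                         # 4x4 homogeneous transform

      # Illustrative check: recover a pure translation of 1 cm along x.
      src = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
      print(rigid_transform(src, src + [0.01, 0.0, 0.0]))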
  • The printed guide (40) can be configured to physically guide a surgical tool.
  • For example, the printed guide (40) may have guide holes matching the positions of the sites that will receive the dental implants, each guide hole then being used to guide a handpiece drill during the drilling of each site.
  • Figure 10 illustrates a flowchart of the guide system with augmented reality according to a third embodiment of the invention.
  • The correlation means comprise a marker (46) made of contrasting material and fixed to the patient by fixing means (48), with at least one imaging examination being carried out while the patient wears the marker (46).
  • The contrasting material of the marker (46) is a material that allows the image data (4) to be obtained in such a way that the marker (46) is perceptible and distinguishable from the anatomical structures captured in the imaging examination.
  • Different contrasting materials can be used, according to the type of imaging examination employed.
  • For example, for imaging examinations using X-rays, such as computed tomography, the marker (46) must be made of radiocontrast material, such as materials based on iodine, barium sulfate, hydroxyapatite or gutta-percha, among others. Alternatively, if the examination is magnetic resonance imaging, the marker (46) must be made of a material that contrasts in this type of examination, such as gadolinium-based materials.
  • The correlation means comprise the virtual model (10) generated with the marker (46), according to a system of spatial coordinates that associates the viewing region (30) with the marker (46).
  • The image data of the viewing region (30) are associated with the marker (46) with the aid of a CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (46).
  • The CAD program itself can be configured to automatically identify the marker (46) and automatically define the marker (46) as the origin of the spatial coordinate system.
  • The correlation means comprise the marker (46) fixed to the patient in a position identical to that used during the performance of the at least one imaging examination.
  • The correlation means further comprise the processing unit (60), configured to detect the marker (46) attached to the patient, identify the viewing region (30) associated with the marker (46) and represent the viewing region (30) in correlation with the patient's live image, according to the position of the marker (46).
  • Processing can be performed in an augmented reality program developed from an augmented reality software development kit (SDK), such as ARToolKit® or Vuforia™.
  • The fixing means (48) of the marker (46) on the patient can be of the permanent type, that is, with fixing elements that create a fixation that cannot be undone without damaging them.
  • In this case, the fixing means (48) can be configured in the form of an adhesive, glue, tattoo or thermoformable plate of closed section, for example, installed around the patient's forearm, among others.
  • Alternatively, the fixing means (48) of the marker (46) on the patient can be of the removable type, that is, with fixing elements that allow the fixation to be undone without damage and later reattached to the patient.
  • In this case, the fixing means (48) can be configured in the form of a bracelet, elastic strap or thermoformable plate of open section, for example, installed by fitting over part of the forearm and the back of the hand, among others.
  • The marker (46) corresponds to an augmented reality marker, detectable via image recognition by the processing unit (60) when processing the live image received from the camera (50).
  • The marker (46) attached to the patient must be positioned at least once within the field of view (CV) of the camera (50) to allow detection of the marker (46) and the establishment of a correlation between the live image and the viewing region (30).
  • As in the first embodiment, ARToolKit makes use of a square marker to determine which mathematical transformation should be applied to the viewing region (30) in order to represent it in correlation with the patient's live image, since such a transformation can be defined from only four coplanar, non-collinear points, corresponding to the vertices of the square marker detected when processing the live image received from the camera (50).
  • Figure 11 illustrates a flowchart of the guide system with augmented reality according to a fourth embodiment of the invention.
  • The fourth embodiment of the invention is identical to the third, except that, according to the fourth embodiment, the marker (46) corresponds to a mapped region (46b) of a contrasting region (46a) acquired in the imaging examination, the mapped region (46b) being defined during the processing of the virtual model (10) and being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient.
  • The mapped region (46b) of the marker (46) must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs the live scan.
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (46) and represents the viewing region (30) in correlation with the live image, according to the position of the marker (46).
  • This processing can be performed in a program developed from a software development kit (SDK), such as the Bridge Engine Framework by Occipital.
  • Figures 12 to 15 illustrate the guide system with augmented reality, according to the fourth embodiment of the invention, applied to assist cardiac surgery, for example, robotically assisted surgery.
  • Figure 12 illustrates a contrasting region (46a) made of contrasting material, which is fixed by fixing means (48) to the patient's chest region in order to perform at least one imaging examination.
  • The fixing means (48) are formed by a base adhered to the patient's chest, which contains the contrasting region (46a).
  • Figure 13 represents the virtual model (10) generated from the image data (4) obtained in the imaging examination.
  • The virtual model (10) consists of the patient's thoracic region, including the spine (e), ribs (f), and heart and vessels (h).
  • The virtual model (10) is generated with the contrasting region (46a), given that it was acquired in the imaging examination.
  • The marker (46) is defined as a mapped region (46b) of the contrasting region (46a).
  • In this example, the mapped region (46b) corresponds to a portion of the contrasting region (46a).
  • Alternatively, the mapped region can be defined as the entire contrasting region.
  • Figure 14 illustrates a virtual region of interest of the virtual model (10), corresponding to the heart and vessels (h), which was defined as the viewing region (30).
  • The position of the marker (46) corresponding to the mapped region (46b) defines an origin for the spatial coordinate system, and the image data of the viewing region (30) are associated with the mapped region (46b), so that each point in the viewing region (30) has spatial coordinates related to the coordinates of the mapped region (46b).
  • Figure 15 illustrates an image corresponding to what is visible on the screen (70) to a user during use of the system, for example, at an early stage of the cardiac surgery.
  • The contrasting region (46a) is fixed by the fixing means (48) to the patient's pectoral region.
  • The 3D scanner (80) performs a live scan of the patient.
  • The camera (50) acquires a live image of the patient.
  • The processing unit (60) detects the marker (46) corresponding to the mapped region (46b) of the contrasting region (46a) when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (46) and represents the viewing region (30) in correlation with the live image, according to the position of the marker (46).
  • The viewing region (30) corresponding to the heart and vessels (h) is visible on the screen (70), correlated with the patient's live image.
  • The user can use this system to identify the position of the heart and vessels (h) in the patient and thus better decide on the most appropriate places to make incisions in the patient's chest for the cardiac surgery.
  • Figure 16 illustrates a flowchart of the guide system with augmented reality according to a fifth embodiment of the invention.
  • The correlation means comprise a marker (49) corresponding to a mapped region of an anatomical structure of the patient, which is defined during the processing of the virtual model (10), according to a system of spatial coordinates that associates the viewing region (30) with the mapped region.
  • The marker (49) corresponding to the mapped region of an anatomical structure of the patient is defined during the processing of the virtual model (10) in the CAD environment, thus defining the origin of the spatial coordinates.
  • The image data of the viewing region (30) are associated with the marker (49) with the aid of the CAD computer program.
  • The CAD computer program can be developed with the aid of the ITK (Insight Segmentation and Registration Toolkit) and VTK (The Visualization Toolkit) libraries.
  • Each point in the viewing region (30) has spatial coordinates related to the spatial coordinates of the marker (49) corresponding to the mapped region.
  • The correlation means comprise the processing unit (60), configured to detect the mapped region of the patient, identify the viewing region (30) associated with the mapped region, the mapped region being detectable by the processing unit (60) when processing data received from a 3D scanner (80) configured to perform a live scan of the patient, and represent the viewing region (30) in correlation with the patient's live image, according to the position of the marker (49) corresponding to the mapped region.
  • Processing can be performed in a program developed from a software development kit (SDK), such as the Bridge Engine Framework by Occipital.
  • The SDK establishes the correlation by finding the transformation matrix that superimposes the points of the viewing region (30) onto the points of the mapped region.
  • The marker (49) corresponding to the mapped region of an anatomical structure of the patient must be positioned at least once within the scanning field (CS) of the 3D scanner (80), which performs a live scan of the patient.
  • The 3D scanner (80) can be of the laser type.
  • The processing unit (60) detects the marker (49) corresponding to the mapped region when processing the data received from the 3D scanner (80), identifies the viewing region (30) associated with the marker (49) and represents the viewing region (30) in correlation with the patient's live image on the screen (70), according to the position of the marker (49) corresponding to the mapped region.
  • The viewing region (30) can be configured with an appropriate level of transparency, so that its external contour remains visible while, for example, the contours of the patient's anatomical structures present within the field of view (CV) of the camera (50) also remain visible.
  • The viewing region (30) can be configured in different layers, each layer being associated with a respective color and a respective level of transparency. Each layer can correspond to a particular anatomical structure, so that bones, muscles, organs and vessels can be distinguished from one another.
  • The system is configured so that the user can select, in real time, the layers of the viewing region (30) to be shown on the screen (70), and can also change the color and the level of transparency of each layer.
  • In this way, the presentation of the viewing region (30) is adaptable to different levels of anatomical structures, it being up to the user to select the visualization most convenient for their purposes.
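  • A minimal sketch of such layered rendering, using the VTK library already mentioned for the reconstruction stage: one actor per anatomical structure, each with its own color and opacity. File names, colors and opacity values are illustrative assumptions.

      import vtk

      def make_layer(stl_path, rgb, opacity):
          """Build one renderable layer from a mesh file with its own color/opacity."""
          reader = vtk.vtkSTLReader()
          reader.SetFileName(stl_path)
          mapper = vtk.vtkPolyDataMapper()
          mapper.SetInputConnection(reader.GetOutputPort())
          actor = vtk.vtkActor()
          actor.SetMapper(mapper)
          actor.GetProperty().SetColor(*rgb)
          actor.GetProperty().SetOpacity(opacity)
          return actor

      renderer = vtk.vtkRenderer()
      renderer.AddActor(make_layer("bones.stl",   (0.9, 0.9, 0.8), 0.6))
      renderer.AddActor(make_layer("vessels.stl", (0.8, 0.1, 0.1), 0.9))
      # Selecting layers in real time amounts to actor.SetVisibility(0 or 1),
      # matching the run-time layer selection described above.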
  • The camera (50), the processing unit (60), the screen (70) and, in particular, the 3D scanner (80) according to the second, fourth and fifth embodiments of the invention can be integrated into a smartphone or tablet device.
  • Alternatively, these devices can be integrated into a head-mounted display (HMD) device, such as the Epson® Moverio or the Microsoft HoloLens.
  • In this case, the user views the patient's live image through the transparent screen (70) of the HMD device, together with the viewing region (30) represented on the screen (70) in correlation with the patient's live image.
  • The computing unit (6) can be configured as a desktop computer.
  • Alternatively, the computing unit (6) can be configured as a mobile electronic device, such as a tablet, smartphone or head-mounted display (HMD) device.
  • The computing unit (6) and the processing unit (60) can coincide on the same device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a guidance system to be used as a support tool during the planning or performance of a medical procedure on a patient, for example, a surgical intervention. According to the invention, the guidance system comprises a computing unit (6) configured to generate a virtual model (10) from image data (4) of a patient obtained by means of at least one imaging examination previously performed on the patient, and to process the virtual model (10) in order to define a virtual region of interest as a viewing region (30). The system further comprises a camera (50) configured to acquire a live image of the patient, a processing unit (60) configured to receive the live image of the patient and generate a video signal, and a screen (70) configured to receive the video signal. The system also comprises means for correlating the viewing region (30) of the virtual model (10) with a real region of interest of the patient and representing the viewing region (30) in correlation with the patient's live image, with the viewing region (30) superimposed on the patient's real region of interest.
PCT/BR2019/000038 (priority date 2018-10-31, filing date 2019-10-30): Augmented reality guidance system, WO2020087141A1. Status: Ceased.

Applications Claiming Priority (2)

Application Number: BR102018072428-2 (published as BR102018072428A2, pt). Priority date: 2018-10-31. Filing date: 2018-10-31. Title: Sistema guia com realidade aumentada (Guide system with augmented reality).

Publications (1)

Publication Number: WO2020087141A1. Publication date: 2020-05-07.

Family

ID=70461749

Family Applications (1)

Application Number: PCT/BR2019/000038. Priority date: 2018-10-31. Filing date: 2019-10-30. Title: Augmented reality guidance system (WO2020087141A1). Status: Ceased.

Country Status (2)

Country Link
BR (1): BR102018072428A2
WO (1): WO2020087141A1

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160225192A1 (en) * 2015-02-03 2016-08-04 Thales USA, Inc. Surgeon head-mounted display apparatuses
US20170367771A1 (en) * 2015-10-14 2017-12-28 Surgical Theater LLC Surgical Navigation Inside A Body
US20170312032A1 (en) * 2016-04-27 2017-11-02 Arthrology Consulting, Llc Method for augmenting a surgical field with virtual guidance content

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11439469B2 (en) 2018-06-19 2022-09-13 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US11478310B2 (en) 2018-06-19 2022-10-25 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures
US11571263B2 (en) 2018-06-19 2023-02-07 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11645531B2 (en) 2018-06-19 2023-05-09 Howmedica Osteonics Corp. Mixed-reality surgical system with physical markers for registration of virtual models
US11657287B2 (en) 2018-06-19 2023-05-23 Howmedica Osteonics Corp. Virtual guidance for ankle surgery procedures
US12020801B2 (en) 2018-06-19 2024-06-25 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12046349B2 (en) 2018-06-19 2024-07-23 Howmedica Osteonics Corp. Visualization of intraoperatively modified surgical plans
US12050999B2 (en) 2018-06-19 2024-07-30 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12112269B2 (en) 2018-06-19 2024-10-08 Howmedica Osteonics Corp. Mixed reality-aided surgical assistance in orthopedic surgical procedures
US12112843B2 (en) 2018-06-19 2024-10-08 Howmedica Osteonics Corp. Mixed reality-aided education related to orthopedic surgical procedures
US12125577B2 (en) 2018-06-19 2024-10-22 Howmedica Osteonics Corp. Mixed reality-aided education using virtual models or virtual representations for orthopedic surgical procedures
US12148518B2 (en) 2018-06-19 2024-11-19 Howmedica Osteonics Corp. Neural network for recommendation of shoulder surgery type
US12170139B2 (en) 2018-06-19 2024-12-17 Howmedica Osteonics Corp. Virtual checklists for orthopedic surgery
US12237066B2 (en) 2018-06-19 2025-02-25 Howmedica Osteonics Corp. Multi-user collaboration and workflow techniques for orthopedic surgical procedures using mixed reality
US12266440B2 (en) 2018-06-19 2025-04-01 Howmedica Osteonics Corp. Automated instrument or component assistance using mixed reality in orthopedic surgical procedures
US12347545B2 (en) 2018-06-19 2025-07-01 Howmedica Osteonics Corp. Automated instrument or component assistance using externally controlled light sources in orthopedic surgical procedures
US12362057B2 (en) 2018-06-19 2025-07-15 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12380986B2 (en) 2018-06-19 2025-08-05 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12465374B2 (en) 2019-12-18 2025-11-11 Howmedica Osteonics Corp. Surgical guidance for surgical tools

Also Published As

Publication number Publication date
BR102018072428A2 (pt) 2020-05-26

Similar Documents

Publication Title
US11883118B2 Using augmented reality in surgical navigation
CN110494921B Augmenting a patient's real-time view with three-dimensional data
JP3367663B2 System for visualizing an internal region of an anatomical object
US11426241B2 Device for intraoperative image-controlled navigation during surgical procedures in the region of the spinal column and in the adjacent regions of the thorax, pelvis or head
US9498132B2 Visualization of anatomical data by augmented reality
ES2228043T3 Computer-assisted interactive surgical system
US20200113636A1 Robotically-assisted surgical device, robotically-assisted surgery method, and system
TWI396523B System and method for accelerating dental diagnosis and surgical planning
JP2021523784A Registration of patient image data with the patient's real scene using optical codes affixed to the patient
US20140234804A1 Assisted Guidance and Navigation Method in Intraoral Surgery
US11344180B2 System, apparatus, and method for calibrating oblique-viewing rigid endoscope
US20160100773A1 Patient-specific guides to improve point registration accuracy in surgical navigation
JP5961504B2 Virtual endoscopic image generation device, operating method therefor, and program
JP5934070B2 Virtual endoscopic image generation device, operating method therefor, and program
Mewes et al. Projector-based augmented reality system for interventional visualization inside MRI scanners
Condino et al. Registration sanity check for AR-guided surgical interventions: experience from head and face surgery
WO2020087141A1 Augmented reality guidance system
Bichlmeier et al. Laparoscopic virtual mirror for understanding vessel structure: evaluation study by twelve surgeons
CN101467890B System for accelerating dental diagnosis and surgical planning, and method for viewing stereoscopic images
Nowatschin et al. A system for analyzing intraoperative B-mode ultrasound scans of the liver
Bichlmeier et al. The Visible Korean Human phantom: realistic test & development environments for medical augmented reality
WO2018204999A1 Augmented reality biomodel system
CN118369732A Anatomical scanning, targeting, and visualization
De Paolis et al. An augmented reality application for the enhancement of surgical decisions
García Mato: Optimization of craniosynostosis surgery: virtual planning, intraoperative 3D photography and surgical navigation

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19878087

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19878087

Country of ref document: EP

Kind code of ref document: A1
