
WO2025194021A1 - Maxillary prosthesis comprising fiducial markers for stereotactic navigation - Google Patents

Maxillary prosthesis comprising fiducial markers for stereotactic navigation

Info

Publication number
WO2025194021A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
imaging
fiducial markers
dental
stereotactic navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/019909
Other languages
English (en)
Inventor
Kendall H. Lee
Basel SHARAF
Jonathan M. Morris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mayo Foundation for Medical Education and Research
Mayo Clinic in Florida
Original Assignee
Mayo Foundation for Medical Education and Research
Mayo Clinic in Florida
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mayo Foundation for Medical Education and Research, Mayo Clinic in Florida filed Critical Mayo Foundation for Medical Education and Research
Publication of WO2025194021A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/14 Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
    • A61B90/16 Bite blocks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3954 Markers, e.g. radio-opaque or breast lesions markers magnetic, e.g. NMR or MRI
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3966 Radiopaque markers visible in an X-ray image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/397 Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave

Definitions

  • This document relates to devices and methods for providing stereotactic navigation during surgical procedures such as neurosurgical procedures and craniomaxillofacial procedures.
  • this document relates to devices and methods for providing stereotactic navigation using fiducial markers to align medical imaging data with a patient’s anatomy in the head and neck regions.
  • Stereotactic navigation is a medical imaging technology used to assist clinicians in targeting areas of the body during surgery.
  • stereotactic navigation involves aligning preoperative or intraoperative medical imaging data to actual anatomy of the patient. This alignment is sometimes called registration. Once registration is complete, stereotactic navigation systems can provide real-time guidance to clinicians by overlaying the imaging data onto anatomy of the patient based on the intraoperative alignment.
  • intraoperative computed tomography refers to the use of CT imaging technology during surgery.
  • Intraoperative CT provides real-time imaging of the patient’s anatomy in the operating room
  • Stereotactic navigation can involve using real-time intraoperative CT data to map onto the patient’s anatomy based on stereotactic navigation registration.
  • This document describes devices and methods for providing stereotactic navigation during surgical procedures such as neurosurgical procedures and craniomaxillofacial procedures.
  • this document describes devices and methods for providing stereotactic navigation using medical imaging data (e.g., preoperative images, intraoperative images, intraoral scanning) and a dental prosthesis (e.g., a patient-matched maxillary and/or mandibular prosthesis) including fiducial markers to align the medical imaging data with a patient’s anatomy.
  • this disclosure is directed to a stereotactic navigation system
  • a stereotactic navigation system comprising a dental prosthesis sized to fit a maxillary dental pattern of a patient.
  • the stereotactic navigation system also includes a set of fiducial markers located on the dental prosthesis.
  • the dental prosthesis is configured to be attached to maxillary teeth of the patient so that a position of each fiducial marker of the set of fiducial markers is fixed relative to the maxillary teeth and a skull of the patient.
  • a stereotactic navigation system comprises a first image capture system associated with a first imaging modality, wherein the first image capture system is configured to capture first imaging of a patient’s anatomy and a reference device proximate to the patient’s anatomy, and wherein the first imaging comprises a point cloud of the patient’s anatomy and the reference device.
  • the stereotactic navigation system also includes a second image capture system associated with a second imaging modality, wherein the second image capture system is configured to capture second imaging of the patient’s anatomy, and wherein the second imaging comprises a surface geometry mesh of the patient’s anatomy.
  • the stereotactic navigation system includes one or more storage devices configured to store a model indicating dimensions of the reference device; and processing circuitry in communication with the one or more storage devices, wherein the processing circuitry is configured to: fuse the first imaging with the second imaging to create fused imaging of the patient’s anatomy and the reference device; retrieve, from the one or more storage devices, the model indicating the dimensions of the reference device; and generate, based on the fused imaging of the patient’s anatomy and the reference device and the model indicating dimensions of the reference device, stereotactic navigation data indicating a position of a surgical instrument relative to the patient’s anatomy.
  • stereotactic navigation can be provided using the devices and methods described herein.
  • the dental prosthesis described herein attaches fiducial markers to a patient in a way that is less invasive than attaching fiducial markers directly to the skull and does not require fixed placement of the patient’s head.
  • the dental prosthesis described herein also provides a greater degree of precision of stereotactic navigation as compared with systems that attach fiducial markers to the skin of the patient. This is because the dental prosthesis can be secured tightly to the teeth of the patient.
  • FIG. 1 is a perspective view of a dental prosthesis attached to teeth of a patient.
  • FIG. 2 is a perspective view of fixation pins inserted into tissue of a patient for securing a dental prosthesis.
  • FIG. 3 illustrates an example dental prosthesis including passive fiducial markers and active fiducial markers.
  • FIGS. 4A and 4B illustrate an example of a dental prosthesis comprising a dental portion and a removable shield that can be attached to the dental portion via a connector piece.
  • FIG. 5 illustrates an example of the removable shield that can be attached to the dental prosthesis.
  • FIGS. 6A and 6B illustrate a dental prosthesis comprising another example connector piece.
  • FIG. 7 illustrates an example dental prosthesis sized to fit a patient.
  • FIG. 8 illustrates example image segmentation of computed tomography (CT) data.
  • FIG. 9 illustrates an example fusion of CT data and intraoral scan data.
  • FIG. 10 illustrates an example registration of an intraoral scan on CT images.
  • FIG. 11 illustrates another example dental prosthesis including passive fiducial markers and active fiducial markers.
  • FIGS. 12A-12F illustrate examples of light detection and ranging (LiDAR) imaging (FIGS. 12A-12C) and computed tomography (CT) (FIGS. 12D-12F) imaging.
  • FIGS. 13A-13D illustrate aspects of fusion between CT imaging and LiDAR imaging for registration in a stereotactic navigation system.
  • FIG. 14 illustrates a registration of a LiDAR image of an N-bar to a stereotactic N-bar and a CT isolated N-bar.
  • FIG. 15 includes a flow diagram illustrating an example fusion between CT and LiDAR imaging for stereotactic navigation.
  • FIGS. 16-18 illustrate an example of a dental tray that is custom fit for a patient’s maxillary teeth using a dental mold.
  • FIG. 19 is a flow diagram illustrating an example technique for generating information for creating a dental prosthesis.
  • FIG. 20 is a flow diagram illustrating an example technique for generating information to fuse imaging from multiple modalities for stereotactic navigation.
  • This document describes devices and methods for providing stereotactic navigation during surgical procedures involving the head and neck of a patient as shown in FIG. 1, such as neurosurgery procedures, maxillofacial surgery procedures, ear, nose, and throat (ENT) surgery procedures, and other kinds of procedures.
  • this document describes devices and methods for providing stereotactic navigation using fiducial markers to align medical imaging data with a patient’s anatomy.
  • a dental prosthesis 10 can be used to fix a set of fiducial markers to a patient 12 to provide stereotactic navigation for a surgical procedure.
  • Dental prosthesis 10 can include a dental portion 22 configured to fit with maxillary teeth of patient 12.
  • prosthesis 10 can also include fixators 24, 28 for securing dental prosthesis 10 to the maxillary teeth and the skull of patient 12.
  • prosthesis 10 can include one or more fiducial markers that allow a stereotactic navigation system to align medical imaging data with anatomical features of patient 12 in real time, allowing clinicians to access parts of the body of patient 12 during surgical procedures.
  • fiducial markers have at least two characteristics that facilitate stereotactic navigation.
  • fiducial markers are visible in medical imaging data focused on the fiducial markers. This means that when a CT scanner focuses on a patient’s head, for example, fiducial markers are visible alongside features of the patient’s head.
  • fiducial markers can be placed in known locations such that a system can determine distances between fiducial markers and other dimensions relating to the fiducial markers. This allows the system to use these known dimensions to determine a location of anatomical features relative to the fiducial markers, meaning that the system can map medical imaging data onto anatomical features.
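  • As an illustrative sketch only (not part of this disclosure), the mapping described above can be computed as a rigid transform from fiducial correspondences using a least-squares (Kabsch/SVD) fit; the marker coordinates and names below are hypothetical.

```python
import numpy as np

def rigid_transform(known_pts, imaged_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping known
    fiducial coordinates onto the same fiducials located in medical imaging."""
    known = np.asarray(known_pts, float)
    imaged = np.asarray(imaged_pts, float)
    c_known, c_imaged = known.mean(axis=0), imaged.mean(axis=0)
    H = (known - c_known).T @ (imaged - c_imaged)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = c_imaged - R @ c_known
    return R, t

# Hypothetical fiducial coordinates (mm): design positions on the prosthesis
# versus the same markers segmented from a CT volume.
design = [[0, 0, 0], [20, 0, 0], [0, 15, 0], [0, 0, 10]]
in_ct = [[5, 2, 1], [25, 2, 1], [5, 17, 1], [5, 2, 11]]
R, t = rigid_transform(design, in_ct)
print("rotation:\n", R)
print("translation (mm):", t)
```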
  • the fiducial markers can include active fiducial markers, passive fiducial markers, or a combination of active fiducial markers and passive fiducial markers.
  • Active fiducial markers can include wireless electromagnetic (EM) tracking fiducials.
  • Stereotactic navigation can be used in medical procedures such as neurosurgery, craniomaxillofacial procedures in otolaryngology, plastic surgery, oral maxillofacial surgery, and radiation therapy.
  • Stereotactic navigation systems can provide accurate three-dimensional (3D) guidance for clinicians to precisely target specific areas and anatomy within the body.
  • stereotactic navigation systems can receive high-resolution medical imaging data (e.g., magnetic resonance imaging (MRI) data or computed tomography (CT) data). This medical imaging data can provide information about anatomy of the patient’s body including regions of interest that are not visible to the naked eye.
  • stereotactic navigation systems can use imaging modalities such as CT and MRI to perform high level tracking and provide real time feedback in sensitive anatomic areas.
  • Dental prosthesis 10 can attach fiducial markers to patient 12 in a way that improves stereotactic navigation and the surgical procedure for both clinician and patient.
  • dental prosthesis 10 can be sized to fit the maxillary dental pattern of patient 12 in a way that does not require cutting into bone of patient 12. This results in a stereotactic navigation system that is less invasive to patient 12 as compared with systems that attach fiducial markers by cutting into the bone.
  • dental prosthesis 10 can attach fiducial markers to patient 12 in a way that does not require securing a position of the head of patient 12 for any part of the procedure.
  • Some stereotactic navigation systems that do not include a dental prosthesis can be cumbersome and labor intensive. For example, some stereotactic navigation systems use fiducial markers that are attached to the skin, which can be prone to error. Other stereotactic navigation systems use fiducial markers that are secured to the patient via devices that are attached to the bone of the patient. As described above, attaching fiducial markers to the bone can involve cutting the bone, which is invasive to the patient. In some embodiments, dental prosthesis 10 can fix fiducial markers to patient 12 by engaging with the maxillary teeth and without cutting into the teeth or other bone of patient 12. This can provide greater accuracy as compared with systems that use fiducial markers attached to the skin while being less invasive as compared with systems that cut bone to attach fiducial markers.
  • Some surgical procedures require stereotactic navigation to a high level of precision.
  • a surgical procedure to remove a tumor in an orbit in proximity to the optic nerve involves stereotactic navigation that is precise within one millimeter (mm).
  • Precision of stereotactic navigation can depend on registration that maps medical imaging data to anatomy of patient 12.
  • Dental prosthesis 10 can be specific to patient 12 so that dental prosthesis 10 fits tightly and securely with the maxillary teeth of patient 12. This can ensure that fiducial markers of dental prosthesis 10 do not move relative to patient 12, resulting in precise registration.
  • the maxilla, the skull base, and the cranium represent anatomical structures in the human head.
  • the maxilla is a paired bone that forms the upper jaw and a central portion of the facial skeleton.
  • the maxilla houses the upper teeth and provides structural support for facial bones.
  • the maxilla also contributes to forming a part of the orbits (eye sockets) and the nasal cavity.
  • the skull base represents a bottom part of the skull, which forms the floor of the cranial cavity and provides support for the brain.
  • the skull base includes several bones, including the frontal, ethmoid, sphenoid, and occipital bones.
  • the skull base also forms boundaries of the nasal cavity, the oral cavity, and the orbits.
  • the cranium comprises a bony structure that encloses and protects the brain.
  • the cranium includes several bones, including the frontal, parietal, temporal, occipital, sphenoid, and ethmoid bones.
  • dental prosthesis 10 provides a solid platform that improves accuracy of registration as compared with systems that use fiducial markers attached to the skin of a patient.
  • dental prosthesis 10 represents a patient specific, low-cost, rapid design tool that can be deployed with active and passive fiducial markers for stereotactic navigation.
  • Dental prosthesis 10 represents an out-of-the-box design that is individualized and patient-centered, engineered for patient comfort and efficient workflow, and provides stereotactic navigation with high precision.
  • a computing system is configured to receive medical imaging data indicating the maxillary dental pattern of patient 12.
  • the computing system can generate information for creating dental prosthesis 10 to fit the maxillary dental pattern of patient 12.
  • the medical imaging data can include information concerning a size, shape, and contours of the maxillary teeth of patient 12. This allows the computing system to design dental prosthesis 10 to fit the maxillary dental pattern of patient 12.
  • the computing system can output the information to a device to cause the device to create the dental prosthesis.
  • the device can, in some embodiments, comprise a three-dimensional (3D) printer.
  • the 3D printer can process the instructions to create the dental prosthesis 10.
  • the medical imaging data received by the computing system can include computed tomography (CT) medical imaging data and intraoral scan medical imaging data in some embodiments.
  • the medical imaging data can also include O-arm™ imaging data.
  • the computing system can fuse the CT medical imaging data and the intraoral scan medical imaging data to create combined medical imaging data. Any combination of CT medical imaging data, intraoral scan imaging data, and O-arm™ imaging data can be fused to generate combined medical imaging data. Using this combined medical imaging data, the computing system can generate the information for creating the dental prosthesis 10 to fit the maxillary dental pattern of patient 12.
  • Fusing CT medical imaging data and the intraoral scan medical imaging data can result in a dental prosthesis 10 that fits the maxillary dental pattern of patient 12 more accurately as compared with a prosthesis that is generated using only one kind of imaging data (e.g., only CT data or only intraoral scan data).
  • fusing CT medical imaging data and the intraoral scan medical imaging data combines information present in CT data and information present in intraoral scan data to provide a detailed and comprehensive view of the maxillary dental pattern of patient 12.
  • the computing system can import Digital Imaging and Communications in Medicine (DICOM) files of patient 12 to segment a 3D skull.
  • An intraoral scanner (e.g., TRIOS 3) can then scan the same dentition, eliminating any artifacts.
  • Biocompatible materials can be designed and then 3D printed, incorporating configured fiducial markers.
  • the computing system is configured to create a 3D mesh for the skull of patient 12 based on CT data indicating the maxilla, the skull base, and the cranium.
  • the computing system can apply a machine learning algorithm to segment areas of the patient’s skull in some embodiments.
  • This 3D mesh can indicate surfaces of anatomical features (e.g., surfaces of the skull) within a 3D environment.
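  • A minimal sketch of that pipeline, assuming the DICOM series has already been stacked into a volume in Hounsfield units; a simple bone threshold stands in for the machine learning segmentation described above, and the threshold value and synthetic volume are illustrative only.

```python
import numpy as np
from skimage import measure  # scikit-image

# Stand-in for a CT volume in Hounsfield units (HU); in practice this would be
# assembled from the patient's DICOM series (e.g., read with pydicom and scaled
# by RescaleSlope/RescaleIntercept).
ct_hu = np.full((64, 64, 64), -1000.0)      # air
ct_hu[20:44, 20:44, 20:44] = 1200.0         # a block standing in for bone

BONE_HU = 300.0                              # assumed bone threshold
bone_mask = ct_hu > BONE_HU

# Marching cubes turns the binary segmentation into a triangle surface mesh,
# analogous to the 3D skull mesh described in the text.
verts, faces, normals, values = measure.marching_cubes(
    bone_mask.astype(np.float32), level=0.5, spacing=(1.0, 1.0, 1.0))
print(f"mesh: {len(verts)} vertices, {len(faces)} triangles")
```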
  • the computing system can use intraoral scan data to create a high-fidelity dental cast.
  • the computing system can fuse the CT data and the intraoral scan data, in some embodiments, to circumvent artifacts corresponding to dental fillings and crowns.
  • materials such as dental fillings and crowns can create artifacts in CT data such that spatial surfaces of these materials are not discernable in CT data.
  • the intraoral scan captures these features, meaning that it can be beneficial to fuse CT data and intraoral data in some embodiments.
  • the computing system can use this fused data to generate instructions for creating a patient-specific dental prosthesis 10 that fits the maxillary teeth of patient 12.
  • Fusing two kinds of image data can involve mapping features of one kind of data to features of another kind of data.
  • the computing system can receive CT data and intraoral data corresponding to a patient.
  • CT scans can provide detailed information about the bony anatomy of patient 12.
  • Intraoral scans can capture the soft tissue structures within the oral cavity and other materials, such as dental fillings and crowns.
  • the computing system can align the CT data with the intraoral data to ensure that corresponding anatomical structures are accurately matched. This can involve using anatomical landmarks visible in both sets of images (e.g., a tooth).
  • the computing system can apply a coordinate transformation to ensure that the CT data and the intraoral data are in the same spatial reference frame. This can allow for accurate spatial alignment between the CT data and the intraoral scan data.
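  • For example, once a rotation and translation between the intraoral-scan frame and the CT frame have been estimated from shared landmarks, the alignment amounts to applying a homogeneous transform to every scan point; the values below are illustrative only.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation and a translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_transform(T, points):
    """Map Nx3 points from one spatial reference frame into another."""
    pts = np.hstack([np.asarray(points, float), np.ones((len(points), 1))])
    return (T @ pts.T).T[:, :3]

# Illustrative transform: rotate the intraoral scan 10 degrees about z and
# shift it 3 mm along x so that it lands in the CT coordinate frame.
theta = np.radians(10)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
T_scan_to_ct = to_homogeneous(R, np.array([3.0, 0.0, 0.0]))
scan_points = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 5.0]])
print(apply_transform(T_scan_to_ct, scan_points))
```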
  • the computing system can use a fusion algorithm to combine the CT data and the intraoral scan data into a single integrated dataset.
  • This algorithm may involve any one or combination of blending the two kinds of data, overlaying the two kinds of data, and combining specific features from each modality to enhance the overall visualization.
  • CT data can include some information that is not present in intraoral scan data
  • intraoral scan data can include some information that is not present in CT data
  • fusing the two datasets can provide a greater amount of information for generating dental prosthesis 10. That is, fusing CT data and intraoral scan image data can provide a more complete understanding of the patient’s oral anatomy.
  • the computing system can fuse O-arm™ imaging data with CT data and/or fuse O-arm™ imaging data with intraoral scan data.
  • Dental prosthesis 10 can provide a platform for integrating virtual reality (VR) and/or augmented reality (AR) applications in intracranial and craniofacial surgery procedures. This is because dental prosthesis 10 can be fixed to patient 12 in a way that allows a clinician to move the head of patient 12 during the procedure. As described above, since the maxilla, the skull base, and the cranium form a continuous mass that moves as one, fiducial markers located on dental prosthesis 10 remain fixed relative to the maxilla, the skull base, and the cranium. This allows a range of head mobility without interfering with intraoperative imaging. By not restricting head movement, dental prosthesis 10 can increase patient comfort during surgical procedures as compared with systems that restrict head movement. Augmented reality headsets can display the fiducials to focus on movement or receive active information from the fiducials through short range communication such as Bluetooth® or wireless electromagnetic sensors. Mixed reality devices can combine virtual and augmented reality.
  • dental prosthesis 10 can be attached to patient 12 without cutting into bone of patient 12.
  • dental prosthesis 10 can include a dental portion 22 that defines a trough or a space to fit the maxillary dental pattern of patient 12.
  • dental prosthesis 10 can also include a first fixator 24 and a second fixator 28 configured to removably attach dental prosthesis 10 to the maxillary teeth of the patient 12 so that dental prosthesis 10 does not move relative to the maxillary teeth of patient 12.
  • first fixator 24 and the second fixator 28 can each include a fixator body portion connected to the dental portion 22 of the dental prosthesis 10 and a fixator head portion connected to the fixator body portion.
  • first fixator 24 includes fixator body portion 25 and fixator head portion 26.
  • Second fixator 28 can include fixator body portion 29 and fixator head portion 30. The fixator body portion corresponding to each of the first fixator 24 and the second fixator 28 can extend upwards away from dental portion 22.
  • the fixator head portion corresponding to each of first fixator 24 and second fixator 28 can define a lumen configured to receive a proximal portion of a fixation pin as a distal portion of the fixation pin is inserted into gum tissue of patient 12.
  • a lumen of fixator head portion 26 can receive fixation pin 34 and a lumen of fixator head portion 30 can receive fixation pin 36.
  • fixation pin 34 can extend through the lumen of fixator head portion 26 and into gum tissue of patient 12.
  • Fixation pin 36 can extend through the lumen of fixator head portion 30 and into gum tissue of patient 12.
  • Fixation pin 34 and fixation pin 36 can secure dental prosthesis 10 to patient 12 so that dental prosthesis 10 does not move relative to patient 12.
  • stereotactic technologies can be effective in helping clinicians to access deep brain structures safely and accurately (e.g., in deep brain stimulation (DBS) implant procedures).
  • Certain neuroimaging advancements can provide anatomical localization of the brain at high levels of accuracy, and this can be beneficial for registration in stereotactic navigation.
  • Image fusion techniques that merge different imaging modalities, such as CT and MRI can permit patient-specific volume alignment with available stereotactic atlases for precise targeting. This multimodal imaging can minimize or eliminate a need for manual reference points and expedite a surgical planning process.
  • Some stereotactic techniques rely on rigid frame systems to establish fixed coordinates for brain nuclei targeting. This frame-based stereotaxis can result in longer procedural times because, following stereotactic localizer box placement, an additional CT scan can be necessary for registration.
  • Stereotactic CT scans provide additional radiation exposure and extra operative time, which adds to the workflow complexity.
  • a stereotactic registration system can include minimally invasive or non-invasive surface-based registration techniques that can be photo-documented in real-time, retaining high accuracy associated with frame-based localization or rigid fiducials.
  • LiDAR is a three-dimensional imaging technology which measures distances and analyzes reflection from surfaces. While some imaging modalities have moved stereotactic surgery to become less invasive and more efficient, some of these modalities have also introduced challenges in maintaining precise alignment and registration.
  • LiDAR is a remote sensing technology that can use laser pulses to measure distances with high precision.
  • a LiDAR system can emit rapid laser beams, typically in the form of infrared light, towards a target area. These pulses can reflect from surfaces and return to the sensor of the LiDAR system. In turn, the sensor can use a time taken for each light pulse to return to calculate the distance between the sensor and the object that the light pulse reflected from. By emitting millions of pulses per second and measuring their reflection times, LiDAR can create highly detailed three-dimensional maps of landscapes, buildings, and other objects.
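  • As a minimal illustration of the time-of-flight principle just described (the pulse timing is hypothetical):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting surface from a laser pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A pulse returning after about 6.67 nanoseconds reflected from a surface ~1 m away.
print(lidar_range_m(6.67e-9))  # approximately 1.0
```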
  • LiDAR systems can include three main components: a laser scanner, a global positioning system (GPS) receiver, and an inertial measurement unit (IMU).
  • the laser scanner can emit and detect the pulses, the GPS can provide precise location data of the sensor, and the IMU can record the sensor’s orientation and movement. These components can work together to ensure that collected data is georeferenced accurately.
  • LiDAR systems can be mounted on various platforms to map a variety of environments.
  • LiDAR technology can be useful for registration by offering high-resolution 3D surface mapping and depth perception for generating a stereotactic spatial reference system.
  • LiDAR systems can also use artificial intelligence (AI) to automate multimodal 3D imaging fusion with radiological datasets, which could serve as a radiation-free alternative for a real-time stereotactic localization space with unique 3D Cartesian coordinates, offering both patient safety and logistical advantages.
  • fixation pin 34 can be inserted into gum tissue of patient 12 at a first location between tooth 44 and tooth 45.
  • fixation pin 36 can be inserted into gum tissue of patient 12 at a second location between tooth 46 and tooth 47.
  • fixation pin 34 and fixation pin 36 are inserted into gum tissue without cutting or piercing any bones or teeth of patient 12. This can result in a fixation of dental prosthesis 10 being less invasive as compared with other stereotactic navigation systems that directly attach devices to bone.
  • dental prosthesis 10 being sized to fit the maxillary dental pattern of patient 12 and fixation pin 34 and fixation pin 36 being inserted into gum tissue of patient 12 secures dental prosthesis 10 to patient 12 in a way that prevents dental prosthesis 10 from moving relative to patient 12. This means that a stereotactic navigation system including dental prosthesis 10 can result in more precise stereotactic navigation as compared with systems that use skin-attached fiducial markers that are not as securely attached.
  • dental prosthesis 10 can include a set of passive fiducial markers 52A-52F (collectively, “passive fiducial markers 52”) and a set of active fiducial markers 54A-54F (collectively, “active fiducial markers 54”). As shown in the embodiment of FIG. 3, dental prosthesis 10 includes six passive fiducial markers 52 and six active fiducial markers 54. Passive fiducial markers 52 can be located on dental prosthesis 10 spaced apart from each other so that each of passive fiducial markers 52 occupies a different location. Similarly, active fiducial markers 54 can be located on dental prosthesis 10 spaced apart from each other so that each of active fiducial markers 54 occupies a different location.
  • a stereotactic navigation system can determine the location of fiducial markers relative to each other. Using these relative locations, the stereotactic navigation system can determine a location of anatomy of the patient for registration. For example, a stereotactic navigation system can determine a location of any combination of passive fiducial markers 52A-52F and active fiducial markers 54A-54F. Passive fiducial marker 52A can have first spatial coordinates, passive fiducial marker 52B can have second spatial coordinates, passive fiducial marker 52C can have third spatial coordinates, and so on. Based on the spatial coordinates of passive fiducial markers 52 and active fiducial markers 54, the stereotactic navigation system can determine a 3D environment within which passive fiducial markers 52 and active fiducial markers 54 are located.
  • Dental prosthesis 10 is not limited to including six passive fiducial markers and six active fiducial markers. In some embodiments, dental prosthesis 10 includes more than six or fewer than six passive fiducial markers. In some embodiments, dental prosthesis 10 includes more than six or fewer than six active fiducial markers. In some embodiments, dental prosthesis 10 includes passive fiducial markers without including active fiducial markers. In some embodiments, dental prosthesis 10 includes active fiducial markers without including passive fiducial markers.
  • Passive fiducial markers 52 and active fiducial markers 54 represent two kinds of fiducial markers that can be used to assist with registration and localization for stereotactic navigation.
  • passive fiducial markers 52 comprise inert markers that appear in medical imaging data without emitting any signals.
  • Passive fiducial markers 52 can include materials such as metal, plastic, ceramics, or any combination thereof. Some example metals that can serve as passive fiducial markers include aluminum, brass, stainless steel, titanium, and copper. Radiopaque properties can ensure that passive fiducial markers 52 are visible in medical imaging data corresponding to imaging modalities such as X-ray, CT, or MRI.
  • a stereotactic navigation system locates passive fiducial markers 52 in medical imaging data without receiving any information from passive fiducial markers 52 indicating their position; instead, the stereotactic navigation system relies on the imaging system to detect and localize passive fiducial markers 52 in the images.
  • Active fiducial markers 54 can output signals that identify a location of active fiducial markers 54 in real time.
  • Examples of active fiducial markers 54 include electromagnetic tracking sensors or wireless localization devices that emit radiofrequency signals. Active fiducial markers 54 can be beneficial for registration, because a stereotactic navigation system can receive information from active fiducial markers 54 indicating a location of active fiducial markers 54 without independently determining a location of active fiducial markers 54. Using the determined location of active fiducial markers 54, the stereotactic navigation system can in some embodiments align the medical imaging data with anatomy of patient 12.
  • dental prosthesis 10 includes a shield 53 and a connector element 55.
  • Connector element 55 can be attached to dental portion 22 of dental prosthesis 10.
  • Connector element 55 can define a lumen for receiving a shaft 56 of shield 53. In some examples, when the lumen of connector element 55 receives shaft 56, shield 53 is attached to dental portion 22 so that a position of shield 53 is fixed relative to dental portion 22.
  • FIG. 4A shows shield 53 being detached from the rest of dental prosthesis 10 such that shaft 56 is fully outside of connector element 55.
  • Dental prosthesis 10 is not limited to examples where dental portion 22 and shield 53 are separate pieces. In some embodiments, dental portion 22, shield 53, and connector element 55 represent a single piece that does not separate.
  • patient 12 wears dental prosthesis 10 so that dental portion 22 engages with the maxillary teeth while connector element 55 receives shaft 56 of shield 53.
  • Patient 12 can bite upwards with the mandibular teeth such that connector element 55 is wedged between the maxillary teeth and the mandibular teeth, as shown in FIGS. 4A and 4B. This means that a position of dental portion 22 and shield 53 can be fixed relative to patient 12 when shield 53 is attached to dental portion 22 via connector element 55.
  • one or more objects other than shield 53 can connect to connector element 55.
  • a surgical device such as a robotic arm can connect to connector element 55 in some embodiments.
  • the surgical arm can perform at least a portion of a surgical procedure while attached to connector element 55.
  • One example of a process where an object other than shield 53 could connect to connector element 55 is a process where a stereotactic navigation system receives medical imaging data including fiducials when shield 53 is attached to connector element 55 and performs registration based on this medical imaging data. Following registration, shield 53 can be detached from connector element 55 and a surgical device can be attached to connector element 55. The surgical device can subsequently perform at least part of a surgical procedure based on the mapping of medical imaging data to anatomical features corresponding to registration.
  • shield 53 includes one or more fiducial markers in addition to or alternative to the fiducial markers located on dental portion 22 of dental prosthesis 10.
  • the face of shield 53 is oriented towards a side of the page.
  • a fiducial marker is located at a distal end 58 of an outwardly extending boss of shield 53.
  • shield 53 includes one or more fiducial markers located on a surface in some embodiments. Each of the one or more fiducial markers located on the surface of shield 53 can be either active or passive fiducial markers. As seen in FIG. 5, shield 53 can include fiducial markers 62A-62C. As shown in FIG. 5, fiducial marker 62A is located on a first extending portion of shield 53, fiducial marker 62B is located on a second extending portion of shield 53, and fiducial marker 62C is located on a third extending portion of shield 53. Shield 53 can include fiducials other than fiducial markers 62A-62C in some embodiments. For example, as seen in FIG. 5, shield 53 can include one or more concentric rings and one or more other lines that serve as fiducials.
  • one or more angles, distances, and other dimensions relating to the fiducials of shield 53 are defined so that the locations of the fiducials are known relative to each other. For example, an angular displacement between fiducial marker 62A and fiducial marker 62C is 120 degrees. Based on these known dimensions, a stereotactic navigation system can generate a 3D space where the coordinates of one or more fiducial markers are known. A stereotactic navigation system can map anatomical features of the patient and other information to coordinates within this 3D space.
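  • For illustration only, given hypothetical coordinates for the shield center and two fiducials, the angular displacement between fiducials can be recovered directly from their positions:

```python
import numpy as np

def angle_between_fiducials(center, f1, f2):
    """Angle in degrees between two fiducials as seen from the shield center."""
    v1 = np.asarray(f1, float) - np.asarray(center, float)
    v2 = np.asarray(f2, float) - np.asarray(center, float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Hypothetical layout: two fiducials 120 degrees apart on a 30 mm radius shield.
center = [0.0, 0.0, 0.0]
f_a = [30.0, 0.0, 0.0]
f_c = [30.0 * np.cos(np.radians(120.0)), 30.0 * np.sin(np.radians(120.0)), 0.0]
print(angle_between_fiducials(center, f_a, f_c))  # 120.0
```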
  • FIGS. 6A and 6B illustrate another embodiment of dental prosthesis 10 including a connector piece 64.
  • Connector piece 64 can be located on a front of dental portion 22. As seen in FIGS. 6A and 6B, connector piece 64 can form a lumen. This lumen can receive one or more objects, such as a shaft of shield 53 or a connector shaft of a surgical device such as a robotic arm.
  • dental prosthesis 10 is the same as dental prosthesis 10 of FIGS. 4A and 4B except that dental prosthesis 10 includes connector piece 64 instead of connector element 55.
  • dental prosthesis 10 includes a dental portion 22 that defines a space 72 that is sized to fit a maxillary dental pattern of patient 12 in some embodiments. Each patient can have a unique dental pattern.
  • a computing system can use medical imaging data to generate instructions for controlling a 3D printing device to create dental prosthesis 10 to fit the maxillary dental pattern of patient 12.
  • dental prosthesis 10 can secure tightly to the maxillary teeth of patient 12 so that dental prosthesis 10 does not move. This means that fiducial markers can be stationary relative to the skull of patient 12.
  • Space 72 can include a gap for each of the maxillary teeth of patient 12.
  • the size, shape, and contours of the maxillary teeth of patient 12 can be visible in the medical imaging data (e.g., CT data, intraoral scan data) of the maxillary dental pattern of the patient 12.
  • patient 12 can place dental prosthesis 10 over the maxillary teeth such that outer surfaces of the maxillary teeth are flush with inner surfaces of dental prosthesis 10 within space 72.
  • shield 53 is attached to dental portion 22 of dental prosthesis 10, but this is not required. In some embodiments, there is not a shield attached to dental portion 22 of dental prosthesis 10.
  • the system can segment imaging data and fuse imaging data to create instructions for generating a maxillary prosthesis.
  • the system can segment CT data to identify patient anatomy (e.g., a maxillary dental pattern).
  • the system can use a machine learning model to segment the CT data.
  • the system can fuse the segmented CT data and intraoral scan data to generate a fused set of data as seen in FIG. 9.
  • the system can coregister the intraoral scan on CT images using a machine learning model or another kind of model as seen in FIG. 10.
  • FIG. 11 illustrates another example dental prosthesis including passive fiducial markers and active fiducial markers.
  • the dental prosthesis can include a maxillary prosthesis and a registration attachment attached to a front of the maxillary prosthesis using a rigid connector.
  • the maxillary prosthesis can include one or more fiducial markers in some examples.
  • the fiducial markers, in some examples, can wrap around the maxillary prosthesis as bands.
  • Fiducial markers can be located on one or more surfaces of the registration attachment. These markers can be shaped as straight lines, circles, or other shapes. Some fiducial markers can extend radially outward from the registration attachment.
  • FIGS. 12A-12F illustrate examples of LiDAR imaging and CT imaging.
  • FIGS. 12A-12C illustrate LiDAR imaging and FIGS. 12D-12F illustrate CT imaging.
  • FIGS. 12A and 12D illustrate baseline head scans without any additional frame components.
  • FIGS. 12B and 12E illustrate head scans with a skull anchor key fixed to the cranium.
  • FIGS. 12C and 12F illustrate head scans with stereotactic N-bar localizer boxes attached to the key.
  • CT results in images having similar quality to LiDAR images. This means that LiDAR can potentially serve as a less invasive alternative to CT in medical imaging contexts.
  • LiDAR and CT can be fused together to combine information from both of these modalities.
  • one or more light imaging modalities can be used in addition to or alternatively to LiDAR.
  • Such imaging modalities include laser scanning, structured light, or any other type of surface scanning technology.
  • FIGS. 13A-13D illustrate aspects of fusion between CT imaging and LiDAR imaging for registration in a stereotactic navigation system.
  • registration can integrate three distinct 3D models with corresponding Cartesian (x, y, z) coordinates.
  • a CT-derived surface geometry (S_CT) can comprise a surface mesh of the patient’s anatomy derived from CT imaging.
  • a LiDAR-captured surgical environment (S_LiDAR) can produce a point cloud representing the surgical environment, including the facial surface and an N-bar localizer.
  • a stereotactic CAD model (S_stereotactic) can include a computer-aided design (CAD) model representing the ground truth of the stereotactic localizer box.
  • FIGS. 13A-13D include a schematic representation of the multimodal registration workflow.
  • FIG. 13A can include the CT scan 3D render (S_CT).
  • FIG. 13B can include an overlap of the LiDAR point cloud data with a baseline CT head.
  • FIG. 13C includes a heatmap analysis between the LiDAR and CT face scans with global image registration error of 0.4, representing a high degree of spatial concordance between these imaging modalities.
  • FIG. 13D includes a fusion of the LiDAR-based N-bar localizer scan to the CAD model of the N-bar localizer.
  • the first step is CT-to-LiDAR fusion, which aligns the CT-derived surface geometry mesh to the LiDAR point cloud (S_CT -> S_LiDAR).
  • the second step is LiDAR-to-stereotactic registration, which aligns the LiDAR-captured point cloud of the N-bar localizer to a CAD reference model, finalizing the transformation to stereotactic space ({S_CT, S_LiDAR} -> S_stereotactic).
  • the CT-derived surface mesh can be converted into a point cloud.
  • This point cloud can be centered and filtered to remove a posterior region to ensure alignment with a visible facial surface in the LiDAR scan.
  • Geometric features can be extracted from the CT and LiDAR point clouds using Fast Point Feature Histograms (FPFH).
  • Global fusion of CT and LiDAR clouds can be performed using RANdom SAmple Consensus (RANSAC) based on the extracted FPFH features.
  • The globally fused alignment can then be refined using Iterative Closest Point (ICP) registration.
  • This step can result in improving localization accuracy, particularly where traditional fiducial markers or frame-based localization may be limited.
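  • The refinement stage can be illustrated with a minimal point-to-point Iterative Closest Point (ICP) loop built from a k-d tree and an SVD-based rigid fit; this is a simplified sketch rather than the implementation of the described system, and libraries such as Open3D provide production versions of FPFH, RANSAC, and ICP. The clouds and offset below are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Rigid transform (R, t) that best maps src points onto paired dst points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(source, target, iterations=30):
    """Point-to-point ICP refining the alignment of a CT-derived cloud to a LiDAR cloud."""
    src = np.asarray(source, float).copy()
    tgt = np.asarray(target, float)
    tree = cKDTree(tgt)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)                 # closest LiDAR point for each CT point
        R, t = best_fit_transform(src, tgt[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Synthetic example: the "CT" cloud is the LiDAR cloud shifted by a known offset,
# so ICP should recover approximately the inverse of that offset.
rng = np.random.default_rng(0)
lidar_cloud = rng.normal(size=(500, 3))
ct_cloud = lidar_cloud + np.array([0.5, -0.2, 0.1])
R, t = icp(ct_cloud, lidar_cloud)
print("recovered translation:", np.round(t, 2))  # close to [-0.5, 0.2, -0.1]
```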
  • FIG. 14 illustrates a registration of a LiDAR image of an N-bar to a stereotactic N-bar and a CT isolated N-bar.
  • an N-bar localizer can be isolated from the remaining surgical environment.
  • a k-dimensional tree can be used to find the nearest neighbor for each LiDAR point in a CT cloud. LiDAR points whose distance to the CT cloud fall below a predefined threshold can be removed, leaving points corresponding to the N-bar structure.
  • This filtering step can ensure that a localizer can be individually aligned to its CAD model without interference from facial geometry or other elements in the LiDAR scan.
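  • A minimal sketch of that filtering step, assuming both clouds are already in a common coordinate frame; the 5 mm threshold and the synthetic points are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def isolate_nbar(lidar_points, ct_points, threshold_mm=5.0):
    """Keep LiDAR points far from the CT cloud; points near the CT surface (the face)
    are removed, leaving the N-bar localizer, which is absent from the CT scan."""
    lidar = np.asarray(lidar_points, float)
    tree = cKDTree(np.asarray(ct_points, float))
    dists, _ = tree.query(lidar)
    return lidar[dists > threshold_mm]

# Synthetic clouds: "face" points shared by CT and LiDAR, plus an offset "N-bar".
rng = np.random.default_rng(1)
face = rng.normal(scale=20.0, size=(1000, 3))
nbar = rng.normal(scale=5.0, size=(200, 3)) + np.array([0.0, 0.0, 120.0])
lidar = np.vstack([face, nbar])
print(isolate_nbar(lidar, face).shape)  # roughly (200, 3): only N-bar points remain
```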
  • FIG. 15 includes a flow diagram illustrating an example fusion between CT and LiDAR imaging for stereotactic navigation.
  • a series of pre-operative CT and MR images can be acquired for a surgical planning phase in which the optimal stereotactic brain targets can be chosen to treat specific neurological disorders.
  • These diagnostic radiological images can serve as the baseline scans for merging the stereotactic CT scan on the day of surgery. Consequently, the patients can undergo additional CT imaging during a DBS operation after mounting DI frame components onto the patient’s head. This step can involve connecting the patient’s stereotactic plan and position during surgery to the patient’s acquired baseline CT and MR images (diagnostic scans).
  • a stereotactic navigation system can implement deep learning models for point cloud registration.
  • Deep learning can, in some examples, be successfully implemented for point cloud registration in various domains, including CT to LiDAR fusion. Deep learning approaches can show improved accuracy and can be more robust in cases of low or partial point cloud overlap which may be encountered in real-world surgical settings.
  • applications in CT to LiDAR fusion can be limited to simulated data, small sets of curated samples, or non-stereotactic surgeries with greater room for error.
  • Future work can incorporate a large sample of facial LiDAR with corresponding CT imaging to train and evaluate deep learning models specific to one or more tasks, thereby facilitating accurate and robust LiDAR fusion.
  • stereotactic CT coordinates can be computed directly without a need for StealthStation’s detection of an N-bar localizer. This can streamline a software workflow and eliminate an existing (albeit small, ~0.1 mm) error of detecting an N-bar localizer.
  • creating a more “LiDAR friendly” localizer box can dramatically improve an accuracy of LiDAR to CAD registration.
  • FIGS. 16-18 illustrate an example of a dental tray 210 that is custom fit for a patient’s maxillary teeth using a dental mold 230.
  • dental tray 210 can be custom fit for the patient’s maxillary teeth based on imaging data of the patient’s maxillary dental pattern.
  • a medical imaging system can collect medical imaging data that indicates one or more dimensions of the patient’s maxillary dental pattern.
  • This medical imaging system, in some embodiments, can include an intraoral scanner, a CT system (e.g., a cone beam computed tomography (CBCT) system), an MRI system, or any other kind of medical imaging system that can generate imaging data providing detail concerning dimensions of the patient’s maxillary dental pattern such that dental tray 210 can be formed to fit the patient’s maxillary dental pattern.
  • Creating a custom-fit dental tray 210 for a patient’s maxillary teeth using medical imaging can begin, in some examples, by collecting a precise digital scan of the patient’s upper jaw using a medical imaging system. This can be done using CBCT or intraoral scanning which captures the shape, size, and alignment of the maxillary teeth and surrounding structures.
  • a computing system can process the imaging data to generate a 3D digital model of the patient’s upper dental arch including the maxillary teeth. This model can serve as the blueprint for creating a physical mold that accurately replicates the patient’s oral anatomy.
  • a 3D digital model can be used to fabricate a physical mold 230 of the patient’s maxillary teeth. This can be achieved through one or more processes such as 3D printing or milling, where scanned data is converted into a solid replica of the patient’s upper jaw.
  • a tray material (e.g., thermoplastic, silicone, or acrylic) can be carefully poured or vacuum-formed over the mold to fabricate the tray 210.
  • the tray material can be shaped to conform precisely to the contours of the maxillary teeth and palate, ensuring a snug, custom fit.
  • the dental tray 210 can then be allowed to harden, preserving the detailed structure needed for proper function.
  • the dental tray 210 can then be removed from the dental mold 230 to undergo finishing and adjustments. Excess material can be trimmed, and the edges can be smoothed to enhance comfort. A final fitting can be performed in the patient’s mouth to verify proper retention, seal, and overall comfort. If necessary, minor modifications can be made to improve the fit.
  • This method of creating a dental tray 210 using a dental mold 230 derived from imaging data can ensure a highly precise, patient-specific fit, which enhances an effectiveness of one or more fiducial markers 212A-212D (collectively, “fiducial markers 212”) placed on the dental tray 210.
  • Fiducial markers 212 can each include one or more passive elements, one or more active elements, or a combination of passive and active elements.
  • the fiducial markers 212 can be placed, as depicted in FIG. 16, on a surface of the dental mold 230 so that each of the fiducial markers 212 is aligned with one of the patient’s teeth.
  • Fiducial markers 212 can each include a first fiducial and a second fiducial.
  • fiducial marker 212A includes a first fiducial 214A and a second fiducial 216A
  • fiducial marker 212B includes a first fiducial 214B and a second fiducial 216B, and so on.
  • the second fiducials 216 can each be placed in the center of a respective one of first fiducials 214.
  • the first fiducials 214A- 214D can be a first color and the second fiducials 216A-216D can be a second color.
  • the first color is green and the second color is white, but this is not required.
  • One or both of the first color and the second color can comprise a color other than green and white, respectively.
  • Each first fiducial 214 can be either passive or active.
  • Each second fiducial 216 can be either passive or active.
  • Fiducial markers 212 can, in some cases, be visible in one or more kinds of medical imaging, with first fiducials 214 being more visible in a first kind of medical imaging and second fiducials 216 being more visible in a second kind of medical imaging.
  • first fiducials 214 can be more visible in LiDAR imaging due to a green color of the first fiducials 214.
  • second fiducials 216 can be more visible in CT and/or MRI imaging as compared with first fiducials 214 due to the fact that second fiducials 216 include a radiopaque material and are white.
  • Because second fiducials 216 are placed in the center of respective ones of first fiducials 214, LiDAR imaging prominently featuring first fiducials 214 can be fused with CT and/or MRI imaging prominently featuring second fiducials 216.
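  • As an illustrative sketch (OpenCV 4 API, with assumed HSV bounds), the green first fiducials could be located in a color image accompanying the LiDAR capture by simple color thresholding; the described system is not limited to this approach.

```python
import cv2
import numpy as np

def find_green_fiducials(bgr_image, min_area_px=20.0):
    """Return (x, y) centroids of green blobs, standing in for first fiducials 214."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (40, 80, 80), (85, 255, 255))   # assumed green range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] >= min_area_px:
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers

# Synthetic test image: two green disks on a gray background.
img = np.full((200, 200, 3), 128, np.uint8)
cv2.circle(img, (60, 60), 10, (0, 255, 0), -1)
cv2.circle(img, (140, 120), 10, (0, 255, 0), -1)
print(find_green_fiducials(img))  # two centroids near (60, 60) and (140, 120)
```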
  • the dental tray can serve as a reference device during a medical procedure involving stereotactic navigation. Because dimensions of the dental tray 210 are known and relative positions of fiducial markers 212 on the dental tray 210 are also known, a computing system can use the locations of the dental tray 210 and fiducial markers 212 in medical imaging (e.g., LiDAR, CT/MRI) relative to the patient’s anatomy and the known dimensions of dental tray 210 and fiducial markers 212 to generate stereotactic navigation data for placing a surgical instrument relative to the patient’s anatomy. In some cases, an N-bar can serve as a reference device in addition to or instead of the dental tray 210, but this is not required. Dental tray 210 including fiducials 212 can be sufficient for providing reference during stereotactic navigation such that clinicians can view a position of one or more surgical tools relative to the patient’s anatomy in real time.
  • FIG. 19 is a flow diagram illustrating an example technique for generating information for creating a dental prosthesis.
  • a method can include receiving medical imaging data indicating a maxillary dental pattern of a patient (1902).
  • the medical imaging data comprises computed tomography (CT) medical imaging data and intraoral scan medical imaging data.
  • the method can also include, in some embodiments, fusing the CT medical imaging data and the intraoral scan medical imaging data to create combined medical imaging data and generating information based on the combined medical imaging data.
  • the method comprises receiving medical imaging data indicating the position of each fiducial marker of a set of fiducial markers relative to the maxillary teeth and the skull of the patient and generating, based on the medical imaging data, stereotactic navigation data indicating a position of a surgical instrument relative to the maxillary teeth and the skull of the patient.
  • the method further includes generating information for creating a dental prosthesis to fit the maxillary dental pattern of a patient, the information indicating a location of each fiducial marker of a set of fiducial markers (1904).
  • the dental prosthesis comprises a dental portion that defines a space to fit the maxillary dental pattern of a patient and one or more fixators for securing the dental prosthesis to maxillary teeth of the patient so that the dental prosthesis does not move relative to the maxillary teeth of the patient.
  • each fixator of the one or more fixators comprises a fixator body portion connected to the dental portion of the dental prosthesis and a fixator head portion connected to the corresponding fixator body portion.
  • the fixator head portion can define a lumen configured to receive a proximal portion of a fixation pin as a distal portion of the fixation pin is inserted into gum tissue of the patient.
  • the fixation pin is inserted into the gum tissue of the patient between two side-by-side maxillary teeth of the patient.
  • the set of fiducial markers, in some cases, can include one or more passive fiducial markers.
  • the set of fiducial markers can comprise one or more active fiducial markers.
  • the set of fiducial markers comprise one or more passive fiducial markers and one or more active fiducial markers.
  • the method comprises outputting the information to a device to cause the device to create a dental prosthesis (1906).
  • FIG. 20 is a flow diagram illustrating an example technique for generating information to fuse imaging from multiple modalities for stereotactic navigation.
  • a method includes fusing first imaging with second imaging to create fused imaging of a patient’s anatomy and a reference device, the first imaging comprising a point cloud of the patient’s anatomy and the reference device, and the second imaging comprising a surface geometry mesh of the patient’s anatomy (2002) (see the alignment sketch following this list).
  • the first imaging is captured by a first image capture system associated with a first imaging modality and the second imaging is captured by a second image capture system associated with a second imaging modality.
  • in some examples, the first image capture system comprises a LiDAR system and the first imaging comprises LiDAR imaging, and the second image capture system comprises a CT system and the second imaging comprises CT imaging.
  • in other examples, the first image capture system comprises a LiDAR system and the first imaging comprises LiDAR imaging, and the second image capture system comprises an MRI system and the second imaging comprises MRI imaging.
  • the processing circuitry is configured to align the surface geometry mesh with the point cloud.
  • one or more light-based imaging modalities can be used in addition to, or as an alternative to, LiDAR. Such imaging modalities include laser scanning, structured light, or any other type of surface scanning technology.
  • the method involves, in some examples, retrieving a model indicating dimensions of a reference device (2004).
  • the model is a computer-aided design (CAD) model, but this is not required.
  • the model can be any variety of model that indicates one or more dimensions of the reference device.
  • the method involves, in some examples, generating stereotactic navigation data indicating a position of a surgical instrument relative to the patient’s anatomy (2006).
  • generating the stereotactic navigation data comprises aligning a spatial representation of the reference device from the fused imaging with the reference device indicated by the model to determine the size of the patient’s anatomy based on the dimensions of the reference device and relative locations of the reference device and the patient’s anatomy in the fused imaging.
  • generating the stereotactic navigation data indicating a position of a surgical instrument relative to the patient’s anatomy comprises generating information for creating a dental prosthesis to fit the patient’s anatomy comprising a maxillary dental pattern.
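
The dental-tray and fiducial-marker bullets above (and block 2006 of FIG. 20) describe generating stereotactic navigation data from fiducial markers whose relative positions are known from the reference device's design. The sketch below is a minimal, hypothetical illustration of the underlying point-based rigid registration (the Kabsch/SVD method): it maps known fiducial coordinates from a tray model into image coordinates and then expresses a tracked instrument tip in that frame. All coordinates, names, and values are made-up examples, not taken from the disclosure.

```python
import numpy as np

def rigid_registration(model_pts, image_pts):
    """Return rotation R and translation t mapping model_pts -> image_pts
    (Kabsch/SVD method). Both arrays are (N, 3) with matched rows."""
    model_c = model_pts.mean(axis=0)
    image_c = image_pts.mean(axis=0)
    H = (model_pts - model_c).T @ (image_pts - image_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = image_c - R @ model_c
    return R, t

# Fiducial centers on the tray, in tray (model) coordinates, in millimetres.
tray_fiducials = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0],
                           [40.0, 25.0, 5.0], [0.0, 25.0, 5.0]])
# The same fiducials as localized in the medical imaging volume (illustrative values).
imaged_fiducials = np.array([[12.1, 8.0, 30.2], [51.9, 9.1, 29.8],
                             [52.3, 33.9, 34.6], [12.4, 33.0, 35.1]])

R, t = rigid_registration(tray_fiducials, imaged_fiducials)

# With the tray rigidly fixed to the maxillary teeth, an instrument tip known in
# tray coordinates can be reported in image/anatomy coordinates for navigation.
tip_in_tray = np.array([20.0, 12.5, -15.0])
tip_in_image = R @ tip_in_tray + t

# Fiducial registration error: how well the fitted transform reproduces the
# imaged fiducial locations; a common sanity check before trusting navigation.
fre = np.linalg.norm((tray_fiducials @ R.T + t) - imaged_fiducials, axis=1).mean()
print(tip_in_image, fre)
```

A navigation system would of course update the instrument position continuously, but the same fixed-reference registration would underlie each update under these assumptions.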

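The fusion described for blocks 2002-2006 of FIG. 20 aligns a surface geometry mesh from CT or MRI with a LiDAR (or other surface-scan) point cloud, with the known dimensions of the reference device anchoring real-world scale. The sketch below shows one simplified way such an alignment could be performed: a plain point-to-point ICP loop, followed by a commented scale check against an assumed 40 mm fiducial spacing. The function names, synthetic data, and thresholds are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform mapping src -> dst (matched rows)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dc - R @ sc

def icp(mesh_vertices, lidar_points, iterations=50, tol=1e-6):
    """Iteratively align CT/MRI mesh vertices to the LiDAR point cloud."""
    tree = cKDTree(lidar_points)
    src = mesh_vertices.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err, err = np.inf, np.inf
    for _ in range(iterations):
        dist, idx = tree.query(src)                    # closest LiDAR point per vertex
        R, t = best_fit_transform(src, lidar_points[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lidar_points = rng.uniform(-50.0, 50.0, size=(400, 3))   # synthetic surface scan
    theta = np.deg2rad(5.0)                                   # small known misalignment
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
    mesh_vertices = lidar_points @ Rz.T + np.array([2.0, -1.0, 3.0])
    R, t, err = icp(mesh_vertices, lidar_points)
    print("mean closest-point distance after alignment:", err)

    # Scale sanity check (illustrative): the spacing of two tray fiducials in the
    # fused data should match the spacing recorded in the tray's CAD model.
    # assert abs(np.linalg.norm(fid_a - fid_b) - 40.0) < 1.0
```
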
Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Neurosurgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present disclosure describes devices and methods for providing stereotactic navigation during surgical procedures. For example, the present disclosure describes devices and methods for providing stereotactic navigation using fiducial markers to align medical imaging data with a patient's anatomy. In addition, the present disclosure describes a stereotactic navigation system comprising a dental prosthesis sized to fit a maxillary dental pattern of a patient and a set of fiducial markers located on the dental prosthesis, the dental prosthesis being configured to be secured to the maxillary teeth of the patient.
PCT/US2025/019909 2024-03-15 2025-03-14 Prothèse maxillaire comprenant des repères de cadre pour navigation stéréotaxique Pending WO2025194021A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463565985P 2024-03-15 2024-03-15
US63/565,985 2024-03-15

Publications (1)

Publication Number Publication Date
WO2025194021A1 true WO2025194021A1 (fr) 2025-09-18

Family

ID=97064577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/019909 Pending WO2025194021A1 (fr) 2024-03-15 2025-03-14 Prothèse maxillaire comprenant des repères de cadre pour navigation stéréotaxique

Country Status (1)

Country Link
WO (1) WO2025194021A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040015176A1 (en) * 1994-06-20 2004-01-22 Cosman Eric R. Stereotactic localizer system with dental impression
US20090209852A1 (en) * 2005-03-02 2009-08-20 Calypso Medical Technologies, Inc. Systems and Methods for Treating a Patient Using Guided Radiation Therapy or Surgery
US20110060558A1 (en) * 2008-03-19 2011-03-10 Nobel Biocare Services, Ag Repositioning of components related to cranial surgical procedures in a patient
US20160030132A1 (en) * 2010-08-20 2016-02-04 Manhattan Technologies, Llc Surgical component navigation systems and methods
US20180279975A1 (en) * 2014-12-08 2018-10-04 Claronav Inc. Appliance for dental navigation
US20200146790A1 (en) * 2017-04-28 2020-05-14 Visionx, Llc Determining and tracking movement
US20230131343A1 (en) * 2020-03-30 2023-04-27 Uday N. REEBYE Splint device for guided surgical robot

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 25771871

Country of ref document: EP

Kind code of ref document: A1