
US20230165639A1 - Extended reality systems with three-dimensional visualizations of medical image scan slices - Google Patents


Info

Publication number
US20230165639A1
Authority
US
United States
Prior art keywords
anatomical structure
image slice
medical image
patient
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/539,796
Inventor
Isaac Dulin
Tom Calloway
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Globus Medical Inc
Original Assignee
Globus Medical Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Globus Medical Inc filed Critical Globus Medical Inc
Priority to US17/539,796 priority Critical patent/US20230165639A1/en
Priority to US17/540,319 priority patent/US12232820B2/en
Assigned to GLOBUS MEDICAL, INC. reassignment GLOBUS MEDICAL, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DULIN, ISAAC, CALLOWAY, TOM
Publication of US20230165639A1 publication Critical patent/US20230165639A1/en
Priority to US19/041,011 priority patent/US20250228624A1/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00203Electrical control of surgical instruments with speech control or speech recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00207Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/102Modelling of surgical devices, implants or prosthesis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/367Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/368Correlation of different images or relation of image positions in respect to the body changing the image on a display according to the operator's position
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/372Details of monitor hardware
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/374NMR or MRI
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/397Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave
    • A61B2090/3975Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active
    • A61B2090/3979Markers, e.g. radio-opaque or breast lesions markers electromagnetic other than visible, e.g. microwave active infrared
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983Reference marker arrangements for use with image guided surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502Headgear, e.g. helmet, spectacles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/32Surgical robots operating autonomously
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30008Bone
    • G06T2207/30012Spine; Backbone
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30052Implant; Prosthesis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/41Medical
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/008Cut plane or projection plane definition
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2004Aligning objects, relative positioning of parts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2016Rotation, translation, scaling

Definitions

  • the present disclosure relates to computer systems for planning surgical operations and computer assisted navigation of equipment and operators during surgery.
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, surgical robot systems, medical imaging devices (e.g., computerized tomography (CT) scanners, magnetic resonance imaging scanners, fluoroscopy imaging, etc.), neuromonitoring equipment, patient monitors, microscopes, anesthesia equipment, etc.
  • a computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient's anatomy.
  • Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track a tool reference array on a surgical tool which is being positioned by a surgeon during surgery relative to a patient reference array attached to a patient.
  • the patient reference array is also referred to as a dynamic reference array (DRA) or dynamic reference base (DRB).
  • the surgeon can thereby use real-time visual feedback of the determined pose(s) to navigate the surgical tool during a surgical procedure on the patient.
  • Perpendicular scan slices are used to enable operators to visualize the patient's anatomy alongside the relative poses of surgical instruments. Projections of the three-dimensional (3D) scan can also be shown.
  • When 3D models of a patient's anatomy are displayed alongside two-dimensional (2D) slices, it can be challenging for an operator to understand how the 3D model and the patient anatomy relate geometrically to the 2D slices.
  • Some embodiments of the present disclosure are directed to providing a navigated surgery system that enables a user wearing an extended reality (XR) headset to visualize how a displayed 2D medical image slice of anatomical structure of a patient relates geometrically to a displayed 3D graphical model of anatomical structure.
  • a navigated surgery system includes at least one processor that is operative to obtain a first 2D medical image slice of anatomical structure of a patient.
  • the operations obtain a 3D graphical model of anatomical structure.
  • the operations determine a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice.
  • the operations control the XR headset to display the first 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the anatomical structure.
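The operations above (obtain a 2D slice, obtain a 3D model, determine the pose of a virtual cross-sectional plane, display a graphical object at that pose) can be sketched as follows. This is a minimal illustration only: the axis conventions, the slice spacing, and the function name are assumptions for the sketch, not details taken from the disclosure.

```python
import numpy as np

def plane_pose_for_slice(slice_index, slice_spacing_mm, axis="axial"):
    """Pose (4x4 homogeneous transform) of a virtual cross-sectional
    plane in the 3D model's coordinate frame for a given scan slice.
    Axis conventions here are illustrative assumptions."""
    # Plane normal along the scan axis of the slice stack.
    normals = {
        "axial":    np.array([0.0, 0.0, 1.0]),   # superior-inferior
        "sagittal": np.array([1.0, 0.0, 0.0]),   # left-right
        "coronal":  np.array([0.0, 1.0, 0.0]),   # anterior-posterior
    }
    n = normals[axis]
    # Build an orthonormal basis with n as the plane normal (third column).
    u = np.array([0.0, 1.0, 0.0]) if abs(n[1]) < 0.9 else np.array([1.0, 0.0, 0.0])
    x = np.cross(u, n); x /= np.linalg.norm(x)
    y = np.cross(n, x)
    pose = np.eye(4)
    pose[:3, 0], pose[:3, 1], pose[:3, 2] = x, y, n
    # Offset the plane along its normal by slice index * spacing.
    pose[:3, 3] = n * (slice_index * slice_spacing_mm)
    return pose

pose = plane_pose_for_slice(slice_index=40, slice_spacing_mm=0.625, axis="axial")
```

The resulting transform could then be used to place the graphical object (the visualized cut plane) relative to the displayed 3D model.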
  • FIG. 1 is an overhead view of a personnel wearing extended reality (XR) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and a surgical robot system for robotic assistance, in accordance with some embodiments;
  • FIG. 2 illustrates the navigated surgery camera tracking system and the surgical robot system positioned relative to a patient, according to some embodiments
  • FIG. 3 illustrates a navigated surgery camera tracking system and a surgical robot system configured according to some embodiments
  • FIGS. 4 A- 4 B respectively illustrate a C-arm imaging device and an O-arm imaging device in accordance with some embodiments
  • FIG. 5 illustrates an XR headset view of an axial 2D medical image slice of anatomical structure of a patient, a sagittal 2D medical image slice of the anatomical structure of the patient, and a 3D graphical model of anatomical structure, in accordance with some embodiments;
  • FIG. 6 illustrates an XR headset view of two graphical objects which are displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model of FIG. 5 that correspond to the anatomical structure of the axial and sagittal 2D medical image slices of FIG. 5 , in accordance with some embodiments;
  • FIG. 7 illustrates another XR headset view that adds to the display of FIG. 6 a coronal 2D medical image slice of the anatomical structure of the patient and further adds a corresponding graphical object which is displayed with a pose that is defined to visually illustrate to the user a virtual cross-sectional plane extending through the 3D graphical model of FIG. 5 that corresponds to the anatomical structure of the coronal 2D medical image slice, in accordance with some embodiments;
  • FIG. 8 illustrates another XR headset view of a 3D graphical model of anatomical structure with axial and sagittal 2D medical image slices of the anatomical structure of the patient being dynamically selected and posed responsive to the tracked pose of a tip of a tool, in accordance with some embodiments;
  • FIGS. 9 A and 9 B illustrate two alternative views displayed through the XR headset of a starting orientation and 90 degree rotated orientation, respectively, of the axial and sagittal 2D medical image slices of FIG. 5 and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model that correspond to the anatomical structure of the viewed axial and sagittal 2D medical image slices, in accordance with some embodiments;
  • FIGS. 10 A, 10 B, 10 C, and 10 D illustrate four alternative views displayed through the XR headset of an axial unmirrored and sagittal unmirrored view, an axial unmirrored and sagittal mirrored view, an axial mirrored and sagittal unmirrored view, and an axial mirrored and sagittal mirrored view, respectively, and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model, in accordance with some embodiments;
  • FIG. 11 illustrates the XR headset view of a 3D graphical object posed and extending to overlie a region of the 3D graphical model corresponding to where a surgical procedure is to be performed on the anatomical structure of the patient, in accordance with some embodiments;
  • FIGS. 12 - 15 illustrate flowcharts of operations by a navigated surgery system in accordance with some embodiments.
  • FIG. 16 illustrates a block diagram of a navigated surgery system that includes an XR headset, a computer platform, and a camera tracking system component which are operative in accordance with some embodiments.
  • FIG. 1 is an overhead view of personnel wearing extended reality (XR) headsets 150 a and 150 b during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery and a surgical robot system 100 for robotic assistance, in accordance with some embodiments.
  • FIG. 2 illustrates the navigated surgery camera tracking system 202 and the surgical robot system 100 positioned relative to a patient, according to some embodiments.
  • FIG. 3 illustrates the navigated surgery camera tracking system 202 and the surgical robot system 100 configured according to some embodiments.
  • An XR headset may be configured to augment a real-world scene with computer generated XR images.
  • the XR headset may be configured to provide an augmented reality (AR) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user.
  • the XR headset may be configured to provide a virtual reality (VR) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer generated AR images on a display screen.
  • An XR headset can be configured to provide both AR and VR viewing environments.
  • the term XR headset may refer to an AR headset or a VR headset.
  • the surgical robot system 100 may include, for example, a surgical robot 102 , one or more robot arms 104 , a display 110 , an end-effector 112 , for example, including a guide tube 114 , and an end effector reference array which can include one or more tracking markers.
  • the surgical robot system 100 may include a patient reference array 116 with a plurality of tracking markers, which is adapted to be secured directly to the patient 210 (e.g., to a bone of the patient 210 ).
  • Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.
  • the surgical robot system 100 may also utilize a tracking camera 200 , for example, positioned on the camera tracking system 202 .
  • the camera tracking system 202 can have any suitable configuration to move, orient, and support the tracking camera 200 in a desired position, and may contain a computer operable to track pose of reference arrays.
  • the tracking camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers for various reference arrays attached to the patient 210 (patient reference array), end effector 112 (end effector reference array), extended reality (XR) headset(s) 150 a - 150 b worn by a surgeon 120 and/or a surgical assistant 126 , etc. in a given measurement volume viewable from the perspective of the tracking camera 200 .
  • the tracking camera 200 may track markers 170 attached to a surgical tool, implant, or instrument manipulated by a user.
  • the tracking camera 200 may scan the given measurement volume and detect the light that is emitted or reflected from the reference arrays in order to identify and determine poses of the reference arrays in three dimensions.
  • active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking camera 200 or other suitable device.
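The pose-determination step described above can be illustrated with a standard least-squares rigid alignment: given a reference array's known marker layout and the 3D marker positions reconstructed from the camera views, the Kabsch/SVD method recovers the array's rotation and translation. This is a textbook sketch, not necessarily the algorithm used by the disclosed system; the function name and the sample marker geometry are assumptions.

```python
import numpy as np

def fit_array_pose(model_pts, observed_pts):
    """Least-squares rigid transform (R, t) mapping the reference array's
    known marker layout (model_pts, Nx3) onto the detected marker
    positions (observed_pts, Nx3). Standard Kabsch/SVD solution."""
    cm = model_pts.mean(axis=0)
    co = observed_pts.mean(axis=0)
    H = (model_pts - cm).T @ (observed_pts - co)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t

# Example: a hypothetical 4-marker array rotated 90 degrees about z and shifted.
model = np.array([[0, 0, 0], [50, 0, 0], [0, 30, 0], [50, 30, 10]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
observed = (Rz @ model.T).T + np.array([100.0, 20.0, 5.0])
R, t = fit_array_pose(model, observed)
```

With at least three non-collinear markers per array, the fit is unique, which is why tracked reference arrays carry multiple markers in a known rigid arrangement.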
  • the XR headsets 150 a and 150 b may each include tracking cameras that can track poses of reference arrays within their camera fields-of-view (FOVs) 152 and 154 , respectively. Accordingly, as illustrated in FIG. 1 , the poses of reference arrays attached to various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 a and 150 b and/or a FOV 600 of the tracking cameras 200 .
  • FIGS. 1 and 2 illustrate a potential configuration for the placement of the camera tracking system 202 and the surgical robot system 100 in an operating room environment.
  • Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 a and 150 b to display surgical procedure navigation information.
  • the surgical robot system 100 is optional during computer-aided navigated surgery.
  • the camera tracking system 202 may use tracking information and other information from multiple XR headsets 150 a and 150 b such as inertial tracking information and optical tracking information as well as (optional) microphone information.
  • the XR headsets 150 a and 150 b operate to display visual information and play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 102 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment.
  • the XR headsets 150 a and 150 b track apparatus such as instruments, patient references and end effectors in 6 degrees-of-freedom (6DOF), and may track the hands of the wearer.
  • the XR headsets 150 a and 150 b may also operate to track hand poses and gestures to enable gesture based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 a and 150 b and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 a and 150 b may have a 1-10× magnification digital color camera sensor called a digital loupe.
  • An “outside-in” machine vision navigation bar tracks instruments and may include a color camera.
  • the machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 a and 150 b tend to move while positioned on wearers' heads.
  • the patient reference array 116 is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112 , instrument reference array 170 , and reference arrays on the XR headsets 150 a and 150 b.
  • one or more of the XR headsets 150 a and 150 b are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • the surgical robot system (also “robot”) 102 may be positioned near or next to patient 210 . Although depicted near the head of the patient 210 , it will be appreciated that the robot 102 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing a surgical procedure.
  • the tracking camera 200 may be separated from the robot system 100 and positioned at the foot of patient 210 . This location allows the tracking camera 200 to have a direct visual line of sight to the surgical field 208 . Again, it is contemplated that the tracking camera 200 may be located at any suitable position having line of sight to the surgical field 208 .
  • the surgeon 120 may be positioned across from the robot 102 , but is still able to manipulate the end-effector 112 and the display 110 .
  • a surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110 . If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. The traditional areas for the anesthesiologist 122 and the nurse or scrub tech 124 remain unimpeded by the locations of the robot 102 and camera 200 .
  • the anesthesiologist 122 can operate anesthesia equipment which can include a display 34 .
  • the display 110 can be attached to the surgical robot 102 and in other example embodiments, display 110 can be detached from surgical robot 102 , either within a surgical room with the surgical robot 102 , or in a remote location.
  • End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor.
  • end-effector 112 can comprise a guide tube 114 , which is able to receive and orient a surgical instrument, tool, or implant 608 used to perform a surgical procedure on the patient 210 .
  • the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.”
  • the term “instrument” is used in a non-limiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein.
  • Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc.
  • end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery.
  • end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument 608 in a desired manner.
  • the surgical robot 102 is operable to control the translation and orientation of the end-effector 112 .
  • the robot 102 is operable to move end-effector 112 under computer control along x-, y-, and z-axes, for example.
  • the end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axes, and a Z Frame axis (such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled).
  • selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a six degree of freedom robot arm comprising only rotational axes.
  • the surgical robot system 100 may be used to operate on patient 210 , and robot arm 104 can be positioned above the body of patient 210 , with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210 .
  • the XR headsets 150 a and 150 b can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • the term “pose” refers to the position and/or the rotational angle of one object (e.g., dynamic reference array, end-effector, surgical instrument, anatomical structure, etc.) relative to another object and/or to a defined coordinate system.
  • a pose may therefore be defined based on only the multidimensional position of one object relative to another object and/or relative to a defined coordinate system, based on only the multidimensional rotational angles of the object relative to another object and/or to a defined coordinate system, or based on a combination of the multidimensional position and the multidimensional rotational angles.
  • the term “pose” therefore is used to refer to position, rotational angle, or combination thereof.
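As an illustrative sketch (the class and field names are assumptions for illustration, not terms from the specification), a pose combining a multidimensional position and multidimensional rotational angles might be represented as:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    # position along three orthogonal axes (e.g., millimeters)
    x: float
    y: float
    z: float
    # rotational angles about those axes (e.g., roll, pitch, yaw in degrees)
    roll: float
    pitch: float
    yaw: float


def relative_position(a: Pose, b: Pose) -> tuple:
    """Translation of pose b relative to pose a (rotation composition omitted
    for brevity)."""
    return (b.x - a.x, b.y - a.y, b.z - a.z)
```

A pose of one tracked object relative to another can then be expressed by differencing positions in a common coordinate system, with rotations composed separately.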
  • surgical robot 102 can be configured to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory.
  • surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument.
  • a surgeon or other user can operate the system 100 , and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or the surgical instrument.
  • Reference arrays can be formed on or connected to robot arm 104 , end-effector 112 , patient 210 , and/or the surgical instrument to track poses in six degrees-of-freedom (e.g., position along 3 orthogonal axes and rotation about the axes).
  • a reference array including a plurality of tracking markers can be provided (e.g., formed on or connected to) an outer surface of the robot 102 , such as on the robot arm 104 and/or on the end-effector 112 .
  • a patient reference array including one or more tracking markers can further be provided on the patient 210 (e.g., formed-on or connected-to).
  • An instrument reference array including one or more tracking markers can be provided on surgical instruments (e.g., a screwdriver, dilator, implant inserter, or the like).
  • the reference arrays enable each of the marked objects (e.g., the end-effector 112 , the patient 210 , and the surgical instruments) to be tracked by the tracking camera 200 , and the tracked poses can be used to provide navigation guidance to a surgical procedure and/or used to control movement of the surgical robot 102 for guiding the end-effector 112 and/or an instrument.
  • the surgical robot system 100 includes the surgical robot 102 including a display 110 , upper arm 306 , lower arm 308 , end-effector 112 , vertical column 312 , casters 314 , tablet drawer 318 , and ring 324 which uses lights to indicate statuses and other information.
  • Cabinet 106 may house certain components of surgical robot system 100 including but not limited to a battery, a power distribution module, a platform interface board module, and a computer.
  • the tracking camera 200 is supported by the camera tracking system 202 .
  • FIGS. 4 A and 4 B illustrate medical imaging systems 1304 that may be used in conjunction with the camera tracking system 202 for navigated surgery, to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210 .
  • Any necessary anatomical structure may be imaged for any appropriate procedure using the imaging system 1304 .
  • the imaging system 1304 may be any imaging device such as a C-arm computerized tomography (CT) scan device 1308 , an O-arm CT scan device 1306 , a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for frequent manual repositioning of patient 210 which may be required in an x-ray system.
  • the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape.
  • C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316 .
  • the space within C-arm 1308 of the arm may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318 .
  • the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328 , such as a wheeled mobile cart 1330 with wheels 1332 , which may enclose an image capturing portion, not illustrated.
  • the image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion.
  • the image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition.
  • the image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes.
  • although imaging systems 1304 are exemplified herein, it will be appreciated that any suitable imaging system may be selected by one of ordinary skill in the art.
  • an XR headset is controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy.
  • the 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which is not necessarily formed from a scan of the patient.
  • Various embodiments of the present disclosure are directed to providing a navigated surgery system that enables a user wearing the XR headset to visualize how the displayed 2D medical image slice of anatomical structure of a patient relates geometrically to a displayed 3D graphical model of anatomical structure.
  • a navigated surgery system displays a graphical object through the XR headset that visually indicates a virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice. Additional information, such as the orientation of the 2D scan slice and, e.g., a current vertebral level, may be displayed relative to the 3D graphical model.
  • Various embodiments display graphical objects that enable a user to visualize the pose of the cross-sectional plane(s) where the one or more 2D scan slice(s) geometrically correspond to visual “slice(s)” through the 3D graphical model.
  • although various embodiments are described in the context of orthopedic surgery, they are not limited to any particular type of surgery.
  • a navigation “plan” for navigated implanting of screws and/or other devices may be viewed based on navigation guidance information that is provided to the XR headsets 150 a and 150 b and/or 2D monitor for display.
  • FIG. 5 illustrates an XR headset view of an axial 2D medical image slice 500 of anatomical structure of a patient, a sagittal 2D medical image slice 510 of the anatomical structure of the patient, and a 3D graphical model 520 of anatomical structure, in accordance with some embodiments.
  • the illustration of FIG. 5 does not include a computer-generated graphical object which is configured to visually assist the user (wearer of the XR headset) with determining how the 2D medical image slices 500 and 510 geometrically relate to the 3D graphical model 520 .
  • the 3D graphical model 520 may be registered to be displayed at or above the patient anatomy, when the patient is viewed through the XR headset, i.e., patient stabilized display.
  • the axial 2D medical image slice 500 and the sagittal 2D medical image slice 510 may be registered to the user's head, i.e., head stabilized, so that they remain visible as the user looks around the surgical room.
  • the XR headset may be further controlled to display other navigated surgery information, such as graphical representations of planned screw and interbody placement poses relative to the patient viewed through the XR headset and/or CAD graphical models which are displayed with poses that are updated to dynamically track sensed instrument poses.
  • FIG. 6 illustrates an XR headset view of two graphical objects 600 and 610 which are displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 of FIG. 5 that correspond to the anatomical structure of the axial and sagittal 2D medical image slices 500 and 510 , respectively, of FIG. 5 , in accordance with some embodiments.
  • the graphical object 600 is displayed as a cross-sectional plane that extends through the 3D graphical model 520 with a pose that corresponds to where the 2D axial medical image slice 500 slices through the anatomical structure of the 3D graphical model 520 .
  • the other graphical object 610 is displayed as another cross-sectional plane that extends through the 3D graphical model 520 with a pose that corresponds to where the 2D sagittal medical image slice 510 slices through the anatomical structure of the 3D graphical model 520 .
  • a user wearing the XR headset is able to intuitively visualize the geometric relationship between the axial and sagittal 2D medical image slices 500 and 510 , respectively, and the anatomical structure of the 3D graphical model 520 .
  • FIG. 7 illustrates another XR headset view that adds to the display of FIG. 6 .
  • a coronal 2D medical image slice 700 of the anatomical structure of the patient is displayed through the XR headset along with the axial medical image slice 500 and the sagittal 2D medical image slice 510 .
  • the XR headset is also controlled to display a graphical object 710 with a pose that is defined to visually illustrate to the user a virtual cross-sectional plane extending through the 3D graphical model 520 that corresponds to the anatomical structure of the coronal 2D medical image slice, in accordance with some embodiments.
  • FIG. 12 illustrates a flowchart of operations by a navigated surgery system in accordance with some embodiments. Embodiments are not limited to the order of operations shown in FIG. 12 or to including all illustrated operations. For example, at least operations 1208 through 1212 are optional.
  • the navigated surgery system operates to obtain 1200 a first 2D medical image slice of anatomical structure of a patient, such as one of the slices 500 , 510 , and 700 , from a medical image scanner or image database.
  • the system obtains 1202 a 3D graphical model of anatomical structure, such as the model 520 , from a medical image scanner, image database, or model database or generator.
  • the system determines 1204 a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice.
  • the system controls an XR headset to display the first 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the anatomical structure.
  • the navigated surgery system may control the XR headset to display a graphical representation of a plane overlaid with the first pose on the 3D graphical model of the anatomical structure, such as the plane 610 overlaid on the model 520 .
  • the graphical representation of the plane may be provided to the XR headset for display as a shaded and/or colored box overlaid with the first pose on the 3D graphical model of the anatomical structure, such as the shaded plane 610 overlaid on the model 520 .
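The pose of the virtual cross-sectional plane can be derived from where the 2D slice sits within the imaged volume. As a minimal sketch (the function name, the centered-volume convention, and the parameter names are assumptions, not from the specification), an axial slice's cutting-plane offset along the volume axis might be computed as:

```python
def axial_plane_offset(slice_index: int, num_slices: int, slice_spacing_mm: float) -> float:
    """Offset of an axial slice's cutting plane along the volume's axis,
    measured from the volume center (assumed convention).

    The plane's pose relative to the 3D graphical model is then the model's
    pose translated by this offset along the model's axial direction.
    """
    center_index = (num_slices - 1) / 2.0
    return (slice_index - center_index) * slice_spacing_mm
```

For example, the middle slice of an 11-slice volume maps to a plane through the model's center (offset 0), while the last slice maps to a plane near the top of the model.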
  • the navigated surgery system may display more than one 2D medical image slice such as illustrated in FIGS. 5 through 7 . Accordingly, the system may obtain 1208 a second 2D medical image slice of the anatomical structure of the patient, wherein the first 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the second 2D medical image slice, and determine 1210 a second pose of a second virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the second 2D medical image slice. The system can then control 1212 the XR headset to display the second 2D medical image slice of the anatomical structure of the patient and to display a second graphical object oriented with the second pose relative to the 3D graphical model of the anatomical structure.
  • the first 2D medical image slice may be an axial image slice of the anatomical structure of the patient and the second 2D medical image slice may be a sagittal image slice of the anatomical structure of the patient, such as those illustrated in FIGS. 5 through 7 .
  • the navigated surgery system may further operate to control the XR headset to display a graphical representation of a first plane, e.g., 600 in FIG. 7 , overlaid with the first pose on the 3D graphical model of the anatomical structure, and control the XR headset to display a second graphical representation of a second plane, e.g., 610 in FIG. 7 , overlaid with the second pose on the 3D graphical model of the anatomical structure.
  • the navigated surgery system may further operate to control the XR headset to use a first color and/or shading to render at least part of the first 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the first plane for display, and control the XR headset to use a second color and/or shading, which is different from the first color and/or shading, to render at least part of the second 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the second plane for display.
  • the navigated surgery system may further operate to obtain a third 2D medical image slice of the anatomical structure of the patient, where the third 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the first and second 2D medical image slices.
  • the third 2D medical image slice may be a coronal image slice of the anatomical structure of the patient.
  • the system determines a third pose of a third virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the third 2D medical image slice.
  • the system controls the XR headset to display the third 2D medical image slice, e.g., 700 in FIG. 7 , of the anatomical structure of the patient and to display a third graphical object, e.g., 710 in FIG. 7 , oriented with the third pose relative to the 3D graphical model of the anatomical structure.
  • the standard axial 2D image slice and/or sagittal 2D image slice visualization may be swapped for a tool-centric visualization whereby perpendicular image slice(s) are selected among image slices forming an image volume based on the tip of a tracked surgical instrument (tool).
  • the image slice(s) can be displayed as overlay(s) on the 3D graphical model of the anatomical structure and/or on the patient viewed through a see-through screen of the XR headset.
  • FIG. 8 illustrates an XR headset view of axial and sagittal 2D image slices 800 and 810 , respectively, which are displayed with poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 of FIG. 5 , and where the 2D image slices are dynamically selected and posed responsive to tracking pose of a tip of a tool 820 (instrument, etc.), in accordance with some embodiments.
  • FIG. 13 illustrates a flowchart of corresponding operations by a navigated surgery system in accordance with some embodiments.
  • the navigated surgery system is operative to obtain 1300 a tracked pose of a tip of a tool 820 being tracked by a camera tracking system relative to the anatomical structure of the patient.
  • the system operates to select 1302 a 2D medical image slice 810 from among a set of 2D medical image slices forming an imaged volume of the anatomical structure of the patient, based on the tracked pose of the tip of the tool 820 relative to the anatomical structure of the patient.
  • the navigated surgery system responds to the updated pose locations of the tool tip by selecting, from among the set of 2D image slices, and displaying through the XR headset corresponding 2D image slices. In this manner, the surgeon can dynamically reposition the tool 820 to see corresponding 2D image slices of the patient's anatomy, i.e., spine illustrated in FIG. 8 .
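The tool-centric selection described above amounts to mapping the tracked tip position into the volume's slice indexing. As a hedged sketch (the helper name, the single-axis simplification, and the parameter names are assumptions for illustration), the nearest axial slice might be selected as:

```python
def select_slice_index(tip_axial_mm: float, volume_origin_mm: float,
                       slice_spacing_mm: float, num_slices: int) -> int:
    """Pick the axial slice nearest the tracked tool tip, clamped to the
    imaged volume's bounds (tip position assumed already transformed into
    the volume's coordinate system)."""
    index = round((tip_axial_mm - volume_origin_mm) / slice_spacing_mm)
    return max(0, min(num_slices - 1, index))
```

As the camera tracking system reports updated tip poses, re-running this selection yields the corresponding 2D image slice to display, so the displayed slice follows the tool.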
  • Another operational embodiment aids a surgeon with visualization by keeping the slice planes fixed while allowing manual rotation of the 3D graphical model 520 .
  • the operations allow the surgeon to spin the 3D graphical model 520 around while maintaining fixed viewing planes of the axial and sagittal 2D image slices.
  • FIG. 9 A illustrates a view displayed through the XR headset of a starting orientation of the axial 2D medical image slice 500 and the sagittal 2D image slice 510 and graphical objects 600 and 610 being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 that correspond to the anatomical structure of the viewed axial and sagittal 2D image slices, in accordance with some embodiments.
  • the graphical object 600 is illustrated as the virtual cross-sectional plane corresponding to the axial 2D medical image slice 500 .
  • the other graphical object 610 is illustrated as the virtual cross-sectional plane corresponding to the sagittal 2D image slice 510 .
  • FIG. 9 B illustrates a view displayed through the XR headset of a 90 degree rotated orientation of the axial 2D medical image slice 500 and the sagittal 2D image slice 510 , and graphical objects 910 and 900 being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 90 degree rotated orientation of the 3D graphical model 920 , in accordance with some embodiments.
  • the graphical object 910 is illustrated as the virtual cross-sectional plane corresponding to the axial medical image slice 500 .
  • the other graphical object 900 is illustrated as the virtual cross-sectional plane corresponding to the sagittal 2D image slice 510 .
  • a corresponding operation by the navigated surgery system can include, responding to a rotation command from a user by controlling the XR headset to display an angularly rotated view of the 3D graphical model of the anatomical structure while displaying the first graphical object oriented with the first pose.
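Rotating the model while holding the slice planes fixed can be implemented by applying the rotation only to the model's vertices, leaving the plane poses untouched. As a minimal sketch (the function name and the choice of the vertical axis are assumptions for illustration):

```python
import math


def rotate_model_point(point: tuple, angle_deg: float) -> tuple:
    """Rotate one model-space point about the vertical (y) axis in response
    to a user rotation command. The cross-sectional plane objects keep their
    original poses, so the cut location stays fixed on screen while the
    model spins beneath it."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))
```

Applying this to every vertex of the 3D graphical model (but not to the plane graphical objects) produces the angularly rotated view while the first graphical object remains oriented with the first pose.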
  • Some other embodiments are directed to displaying further information which enables visualization of the orientations of the 2D image slices.
  • Sagittal image slices may be flipped in order to match the orientation from which a surgeon is viewing the patient's spine or other anatomy, and the orientation of the axial 2D image slices can then be difficult to perceive because the axial 2D image slices have few asymmetries.
  • overlays can be shaded and/or colored to visually indicate the orientation of the 2D image slice.
  • the current orientation of the 2D image slice can be shown with an overlay, and if viewed from behind (or if the 2D image slice view is flipped) the overlay can be rendered as a hollow outline, or vice versa.
  • FIGS. 10 A, 10 B, 10 C, and 10 D illustrate four alternative views displayed through the XR headset of an axial unmirrored and sagittal unmirrored view, an axial unmirrored and sagittal mirrored view, an axial mirrored and sagittal unmirrored view, and an axial mirrored and sagittal mirrored view, respectively, and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model, in accordance with some embodiments.
  • FIG. 10 A illustrates the axial unmirrored and sagittal unmirrored view in which the cross-sectional objects 600 and 610 are both shaded and/or colored.
  • FIG. 10 B illustrates the axial unmirrored and sagittal mirrored view in which the cross-sectional object 1010 corresponding to the sagittal 2D image slice is not shaded and/or colored to indicate it is a mirrored view.
  • FIG. 10 C illustrates the axial mirrored and sagittal unmirrored view in which the cross-sectional object 1000 corresponding to the axial 2D image slice is not shaded and/or colored to indicate it is a mirrored view.
  • FIG. 10 D illustrates the axial mirrored and sagittal mirrored view in which the cross-sectional object 1000 corresponding to the axial 2D image slice is not shaded and/or colored to indicate it is a mirrored view and in which the cross-sectional object 1010 corresponding to the sagittal 2D image slice is not shaded and/or colored to indicate it is a mirrored view.
  • the surgeon can move around the patient to view an overlay from an opposite side which uses an opposite shading and/or color effect to visually illustrate the different viewing perspectives.
  • the operations for the shading or hollow representations are reversed, and/or a visual indication (cue) is added, e.g., as a star or other symbol in the upper left corner of the corresponding slices.
  • a gradient across the slice background is used to intuitively indicate the viewed directionality.
  • Corresponding operations that may be performed by the navigated surgery system are illustrated in the flowchart of FIG. 14 .
  • the operations obtain 1400 a tracked pose of the XR headset relative to the anatomical structure of the patient.
  • the operations determine 1402 based on the tracked pose whether to flip orientation of a rendering of the first 2D medical image slice to be provided to the XR headset for display.
  • the operations select 1404 between two graphically distinct objects to be rendered as the first graphical object for display responsive to whether the determination is to flip orientation of the rendering of the first 2D medical image slice to be provided to the XR headset for display.
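The flip determination of operation 1402 can be reduced to checking which side of the slice's cutting plane the headset views it from. As a hedged sketch (the function name, the sign convention, and the 3-tuple vector representation are assumptions, not from the specification):

```python
def should_flip_slice(headset_view_dir: tuple, slice_plane_normal: tuple) -> bool:
    """Return True when the tracked headset views the slice's cutting plane
    from behind (assumed convention: view direction and plane normal point
    the same way), in which case the rendered slice is flipped and the
    alternate graphical object (e.g., hollow outline) is selected."""
    dot = sum(v * n for v, n in zip(headset_view_dir, slice_plane_normal))
    return dot > 0.0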
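The flip determination of operation 1402 can be reduced to checking which side of the slice's cutting plane the headset views it from. As a hedged sketch (the function name, the sign convention, and the 3-tuple vector representation are assumptions, not from the specification):

```python
def should_flip_slice(headset_view_dir: tuple, slice_plane_normal: tuple) -> bool:
    """Return True when the tracked headset views the slice's cutting plane
    from behind (assumed convention: view direction and plane normal point
    the same way), in which case the rendered slice is flipped and the
    alternate graphical object (e.g., hollow outline) is selected."""
    dot = sum(v * n for v, n in zip(headset_view_dir, slice_plane_normal))
    return dot > 0.0
```

The boolean result then drives the selection between the two graphically distinct objects (e.g., shaded plane versus hollow outline) of operation 1404.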
  • the navigated surgery system may operate to identify the selected (active) level in the 2D image scan and display a highlighted zone (or other 3D graphical object) on the currently selected level.
  • Corresponding operations that may be performed by the navigated surgery system are illustrated in the flowchart of FIG. 15 .
  • the operations control 1500 the XR headset to display a 3D graphical object 1100 with a pose and extending to overlie a region of the 3D graphical model of the anatomical structure corresponding to where a surgical procedure is to be performed on the anatomical structure of the patient.
  • the 3D graphical object 1100 is rendered as a shaded 3D rectangular object that is posed to correspond to a level of the spine where a surgical procedure is to be performed.
  • the operations obtain 1502 a tracked pose of a tool or an implant being tracked by a camera tracking system relative to the anatomical structure of the patient, e.g., spine illustrated in the 3D graphical model 520 .
  • the operations control 1504 the XR headset to display a graphical representation of the tool or the implant overlaid with the tracked pose on the 3D graphical model 520 of the anatomical structure while continuing to display the 3D graphical object 1100 .
  • a wearer of the XR headset can visually determine in an intuitive manner whether the tool or implant is presently positioned and oriented (e.g., posed) at the correct level of the spine.
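The intuitive at-the-correct-level check described above can be approximated geometrically by testing whether the tracked tip falls within the highlighted zone's extent. As a simplified sketch (the axis-aligned-box simplification and all names are assumptions for illustration):

```python
def tip_in_highlighted_zone(tip: tuple, zone_min: tuple, zone_max: tuple) -> bool:
    """True when the tracked tool or implant tip lies inside the highlighted
    level's bounding box (a simplifying axis-aligned assumption), i.e., the
    tool is positioned at the selected level of the spine."""
    return all(lo <= t <= hi for t, lo, hi in zip(tip, zone_min, zone_max))
```

Such a check could, for example, drive a change in the 3D graphical object 1100's color when the tool enters the selected level, though the specification leaves the visual confirmation to the wearer.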
  • FIG. 16 illustrates a block diagram of a surgical system that includes an XR headset 150 , a computer platform 1600 , imaging devices, and a surgical robot 102 which are configured to operate in accordance with various embodiments.
  • the imaging devices may include the C-arm imaging device 1304 , the O-arm imaging device 1306 , and/or a patient image database 1620 .
  • the XR headset 150 provides an improved human interface for performing navigated surgical procedures.
  • the XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 1600 , that include, without limitation, any one or more of: identification of hand gesture based commands and display of XR graphical objects on the display device 1612 .
  • the display device 1612 may be a video projector, flat panel display, etc.
  • the user can view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen.
  • the XR headset 150 may additionally or alternatively be configured to display on the display device 1612 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 1622 , a microphone 1620 , a gesture sensor 1618 , a pose sensor (e.g., inertial measurement unit (IMU)) 1616 , the display device 1612 , and a wireless/wired communication interface 1624 .
  • the cameras 1622 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • the cameras 1622 may be configured to operate as the gesture sensor 1618 by tracking and identifying user hand gestures performed within the field of view of the camera(s) 1622 .
  • the gesture sensor 1618 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 1618 and/or senses physical contact, e.g., tapping on the sensor 1618 or its enclosure.
  • the pose sensor 1616 , e.g., an IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • a surgical system includes a camera tracking system 202 which may be part of a computer platform 1600 that can also provide functionality of a navigation controller 1604 and/or of an XR headset controller 1610 .
  • the surgical system may include the imaging devices and/or a surgical robot 102 .
  • the navigation controller 1604 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 202 .
  • the navigation controller 1604 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 102 , where the navigation information is used to display information through the XR headset 150 to indicate where the surgical tool and/or the end effector of the surgical robot 102 should be moved to perform the surgical plan.
  • the electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 1600 through a wired/wireless interface 1624 .
  • the electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 1600 or directly connected, to various imaging devices, e.g., the C-arm imaging device 1304 , the O-arm imaging device 1306 , the patient image database 1620 , and/or to other medical equipment through the wired/wireless interface 1624 .
  • the surgical system further includes at least one XR headset controller 1610 (also referred to as “XR headset controller” for brevity) that may reside in the XR headset 150 , the computer platform 1600 , and/or in another system component connected via wired cables and/or wireless communication links.
  • the XR headset controller 1610 is configured to receive information from the camera tracking system 202 and the navigation controller 1604 , and to generate an XR image based on the information for display on the display device 1612 .
  • the XR headset controller 1610 can be configured to operationally process signaling from the cameras 1622 , the microphone 1620 , and/or the pose sensor 1616 , and is connected to display XR images on the display device 1612 for user viewing.
  • the XR headset controller 1610 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user.
  • the XR headset controller 1610 may reside within the computer platform 1600 which, in turn, may reside within a housing of the surgical robot 102 , the camera tracking system 202 , etc.
  • the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
  • the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
  • the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A navigated surgery system includes at least one processor that is operative to obtain a 2D medical image slice of anatomical structure of a patient. The operations further obtain a 3D graphical model of anatomical structure. The operations determine a pose of a virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the 2D medical image slice. The operations control the XR headset to display the 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a graphical object oriented with the pose relative to the 3D graphical model of the anatomical structure.

Description

    FIELD
  • The present disclosure relates to computer systems for planning surgical operations and computer assisted navigation of equipment and operators during surgery.
  • BACKGROUND
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, surgical robot systems, medical imaging devices (e.g., computerized tomography (CT) scanners, magnetic resonance imaging scanners, fluoroscopy imaging, etc.), neuromonitoring equipment, patient monitors, microscopes, anesthesia equipment, etc.
  • A computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient's anatomy. Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track a tool reference array on a surgical tool which is being positioned by a surgeon during surgery relative to a patient reference array attached to a patient. The reference array, also referred to as a dynamic reference array (DRA) or dynamic reference base (DRB), allows the camera tracking system to determine a pose of the surgical tool relative to anatomical structure within a medical image and relative to the patient. The surgeon can thereby use real-time visual feedback of the determined pose(s) to navigate the surgical tool during a surgical procedure on the patient.
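  • The relative-pose computation described above can be sketched with homogeneous transforms. The function names and frame conventions below are illustrative assumptions, not the patent's implementation: the camera tracking system yields the pose of each reference array in the camera frame, and the tool's pose relative to the patient reference array is obtained by composing the two.

```python
import numpy as np

def make_pose(rotation, translation):
    # Build a 4x4 homogeneous transform from a 3x3 rotation matrix
    # and a 3-element translation vector.
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def tool_pose_in_patient_frame(T_cam_patient, T_cam_tool):
    # Both reference arrays are tracked in the camera frame; the pose
    # of the surgical tool relative to the patient reference array is
    # T_patient_tool = inv(T_cam_patient) @ T_cam_tool.
    return np.linalg.inv(T_cam_patient) @ T_cam_tool
```

  Because the result is expressed in the patient reference frame, it remains valid even as the camera or the patient reference array moves, which is why the patient array can serve as a common reference for navigation.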
  • Many surgical workflows using computer assisted surgical navigation systems require medical image scans, such as CT scans or magnetic resonance imaging scans, during operation and/or registration procedures. Perpendicular scan slices (axial, sagittal, and coronal) are used to enable operators to visualize the patient's anatomy alongside the relative poses of surgical instruments. Projections of the three-dimensional (3D) scan can also be shown. When showing 3D models of a patient's anatomy alongside two-dimensional (2D) slices, it can be challenging for an operator to know how the 3D model and the patient anatomy relates geometrically to the 2D slices.
  • SUMMARY
  • Some embodiments of the present disclosure are directed to providing a navigated surgery system that enables a user wearing an extended reality (XR) headset to visualize how a displayed 2D medical image slice of anatomical structure of a patient relates geometrically to a displayed 3D graphical model of anatomical structure.
  • In some embodiments, a navigated surgery system includes at least one processor that is operative to obtain a first 2D medical image slice of anatomical structure of a patient. The operations obtain a 3D graphical model of anatomical structure. The operations determine a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice. The operations control the XR headset to display the first 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the anatomical structure.
  • Other navigated surgery systems and corresponding methods and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional navigated surgery systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
  • DESCRIPTION OF THE DRAWINGS
  • Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
  • FIG. 1 is an overhead view of personnel wearing extended reality (XR) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and a surgical robot system for robotic assistance, in accordance with some embodiments;
  • FIG. 2 illustrates the navigated surgery camera tracking system and the surgical robot system positioned relative to a patient, according to some embodiments;
  • FIG. 3 illustrates a navigated surgery camera tracking system and a surgical robot system configured according to some embodiments;
  • FIGS. 4A-4B respectively illustrate a C-arm imaging device and an O-arm imaging device in accordance with some embodiments;
  • FIG. 5 illustrates an XR headset view of an axial 2D medical image slice of anatomical structure of a patient, a sagittal 2D medical image slice of the anatomical structure of the patient, and a 3D graphical model of anatomical structure, in accordance with some embodiments;
  • FIG. 6 illustrates an XR headset view of two graphical objects which are displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model of FIG. 5 that correspond to the anatomical structure of the axial and sagittal 2D medical image slices of FIG. 5 , in accordance with some embodiments;
  • FIG. 7 illustrates another XR headset view that adds to the display of FIG. 6 a coronal 2D medical image slice of the anatomical structure of the patient and further adds a corresponding graphical object which is displayed with a pose that is defined to visually illustrate to the user a virtual cross-sectional plane extending through the 3D graphical model of FIG. 5 that corresponds to the anatomical structure of the coronal 2D medical image slice, in accordance with some embodiments;
  • FIG. 8 illustrates another XR headset view of a 3D graphical model of anatomical structure with axial and sagittal 2D medical image slices of the anatomical structure of the patient being dynamically selected and posed responsive to a tracked pose of a tip of a tool, in accordance with some embodiments;
  • FIGS. 9A and 9B illustrate two alternative views displayed through the XR headset of a starting orientation and 90 degree rotated orientation, respectively, of the axial and sagittal 2D medical image slices of FIG. 5 and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model that correspond to the anatomical structure of the viewed axial and sagittal 2D medical image slices, in accordance with some embodiments;
  • FIGS. 10A, 10B, 10C, and 10D illustrate four alternative views displayed through the XR headset of an axial unmirrored and sagittal unmirrored view, an axial unmirrored and sagittal mirrored view, an axial mirrored and sagittal unmirrored view, and an axial mirrored and sagittal mirrored view, respectively, and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model, in accordance with some embodiments;
  • FIG. 11 illustrates the XR headset view of a 3D graphical object posed and extending to overlay a region of the 3D graphical model corresponding to where a surgical procedure is to be performed on the anatomical structure of the patient, in accordance with some embodiments;
  • FIGS. 12-15 illustrate flowcharts of operations by a navigated surgery system in accordance with some embodiments; and
  • FIG. 16 illustrates a block diagram of a navigated surgery system that includes an XR headset, a computer platform, and a camera tracking system component which are operative in accordance with some embodiments.
  • DETAILED DESCRIPTION
  • It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
  • The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.
  • Turning now to the drawing, FIG. 1 is an overhead view of personnel wearing extended reality (XR) headsets 150 a and 150 b during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery and a surgical robot system 100 for robotic assistance, in accordance with some embodiments. FIG. 2 illustrates the navigated surgery camera tracking system 202 and the surgical robot system 100 positioned relative to a patient, according to some embodiments. FIG. 3 illustrates the navigated surgery camera tracking system 202 and the surgical robot system 100 configured according to some embodiments.
  • An XR headset may be configured to augment a real-world scene with computer generated XR images. The XR headset may be configured to provide an augmented reality (AR) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user. Alternatively, the XR headset may be configured to provide a virtual reality (VR) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer generated XR images on a display screen. An XR headset can be configured to provide both AR and VR viewing environments. Thus, the term XR headset may refer to an AR headset or a VR headset.
  • Referring to FIGS. 1-3 , the surgical robot system 100 may include, for example, a surgical robot 102, one or more robot arms 104, a display 110, an end-effector 112, for example, including a guide tube 114, and an end effector reference array which can include one or more tracking markers. The surgical robot system 100 may include a patient reference array 116 with a plurality of tracking markers, which is adapted to be secured directly to the patient 210 (e.g., to a bone of the patient 210). Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc. The surgical robot system 100 may also utilize a tracking camera 200, for example, positioned on the camera tracking system 202. The camera tracking system 202 can have any suitable configuration to move, orient, and support the tracking camera 200 in a desired position, and may contain a computer operable to track pose of reference arrays.
  • The tracking camera 200 may include any suitable camera or cameras, such as one or more infrared cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active and passive tracking markers for various reference arrays attached to the patient 210 (patient reference array), end effector 112 (end effector reference array), extended reality (XR) headset(s) 150 a-150 b worn by a surgeon 120 and/or a surgical assistant 126, etc. in a given measurement volume viewable from the perspective of the tracking camera 200. The tracking camera 200 may track the reference array 170 attached to a surgical tool, implant, or instrument manipulated by a user. The tracking camera 200 may scan the given measurement volume and detect the light that is emitted or reflected from the reference arrays in order to identify and determine poses of the reference arrays in three dimensions. For example, active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking camera 200 or other suitable device.
  • The XR headsets 150 a and 150 b (also referred to as an XR headset 150) may each include tracking cameras that can track poses of reference arrays within their camera field-of-views (FOVs) 152 and 154, respectively. Accordingly, as illustrated in FIG. 1 , the poses of reference arrays attached to various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 a and 150 b and/or a FOV 600 of the tracking cameras 200.
  • FIGS. 1 and 2 illustrate a potential configuration for the placement of the camera tracking system 202 and the surgical robot system 100 in an operating room environment. Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 a and 150 b to display surgical procedure navigation information. The surgical robot system 100 is optional during computer-aided navigated surgery.
  • The camera tracking system 202 may use tracking information and other information from multiple XR headsets 150 a and 150 b such as inertial tracking information and optical tracking information as well as (optional) microphone information. The XR headsets 150 a and 150 b operate to display visual information and play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 102 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment. The XR headsets 150 a and 150 b track apparatus such as instruments, patient references and end effectors in 6 degrees-of-freedom (6DOF), and may track the hands of the wearer. The XR headsets 150 a and 150 b may also operate to track hand poses and gestures to enable gesture based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 a and 150 b and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 a and 150 b may have a 1-10× magnification digital color camera sensor called a digital loupe.
  • An “outside-in” machine vision navigation bar (tracking cameras 200) tracks instruments and may include a color camera. The machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 a and 150 b tend to move while positioned on wearers' heads. The patient reference array 116 is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112, instrument reference array 170, and reference arrays on the XR headsets 150 a and 150 b.
  • In some embodiments, one or more of the XR headsets 150 a and 150 b are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • When present, the surgical robot system (also “robot”) 102 may be positioned near or next to patient 210. Although depicted near the head of the patient 210, it will be appreciated that the robot 102 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing a surgical procedure. The tracking camera 200 may be separated from the robot system 100 and positioned at the foot of patient 210. This location allows the tracking camera 200 to have a direct visual line of sight to the surgical field 208. Again, it is contemplated that the tracking camera 200 may be located at any suitable position having line of sight to the surgical field 208. In the configuration shown, the surgeon 120 may be positioned across from the robot 102, but is still able to manipulate the end-effector 112 and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. The traditional areas for the anesthesiologist 122 and the nurse or scrub tech 124 remain unimpeded by the locations of the robot 102 and camera 200. The anesthesiologist 122 can operate anesthesia equipment which can include a display 34.
  • With respect to the other components of the robot 102, the display 110 can be attached to the surgical robot 102; in other example embodiments, the display 110 can be detached from the surgical robot 102, either within a surgical room with the surgical robot 102, or in a remote location. End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor. In example embodiments, end-effector 112 can comprise a guide tube 114, which is able to receive and orient a surgical instrument, tool, or implant 608 used to perform a surgical procedure on the patient 210.
  • As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” The term “instrument” is used in a non-limiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein. Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc. Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument 608 in a desired manner.
  • The surgical robot 102 is operable to control the translation and orientation of the end-effector 112. The robot 102 is operable to move end-effector 112 under computer control along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis (such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled). In some example embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a six degree of freedom robot arm comprising only rotational axes. For example, the surgical robot system 100 may be used to operate on patient 210, and robot arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.
  • In some example embodiments, the XR headsets 150 a and 150 b can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • As used herein, the term “pose” refers to the position and/or the rotational angle of one object (e.g., dynamic reference array, end-effector, surgical instrument, anatomical structure, etc.) relative to another object and/or to a defined coordinate system. A pose may therefore be defined based on only the multidimensional position of one object relative to another object and/or relative to a defined coordinate system, based on only the multidimensional rotational angles of the object relative to another object and/or to a defined coordinate system, or based on a combination of the multidimensional position and the multidimensional rotational angles. The term “pose” therefore is used to refer to position, rotational angle, or combination thereof.
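  • The pose definition above, combining a multidimensional position with multidimensional rotational angles such as the roll, pitch, and yaw Euler angles mentioned for the end-effector, can be expressed as a single 4x4 matrix. This is a hypothetical sketch (the angle convention and function names are assumptions):

```python
import numpy as np

def rotation_from_euler(roll, pitch, yaw):
    # Compose rotations about x (roll), y (pitch), and z (yaw)
    # into a single 3x3 rotation matrix, R = Rz @ Ry @ Rx.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def pose_matrix(position, roll, pitch, yaw):
    # A full 6-DOF pose: position plus rotational angles, as a
    # homogeneous transform. Position-only or rotation-only poses
    # are the special cases where the other part is left as identity.
    T = np.eye(4)
    T[:3, :3] = rotation_from_euler(roll, pitch, yaw)
    T[:3, 3] = position
    return T
```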
  • In some further embodiments, surgical robot 102 can be configured to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory. In some example embodiments, surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument. Thus, in use, in example embodiments, a surgeon or other user can operate the system 100, and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or the surgical instrument.
  • Reference arrays can be formed on or connected to robot arm 104, end-effector 112, patient 210, and/or the surgical instrument to track poses in 6 degrees-of-freedom (e.g., position along 3 orthogonal axes and rotation about the axes). In example embodiments, a reference array including a plurality of tracking markers can be provided thereon (e.g., formed-on or connected-to) to an outer surface of the robot 102, such as on robot 102, on robot arm 104, and/or on the end-effector 112. A patient reference array including one or more tracking markers can further be provided on the patient 210 (e.g., formed-on or connected-to). An instrument reference array including one or more tracking markers can be provided on surgical instruments (e.g., a screwdriver, dilator, implant inserter, or the like). The reference arrays enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments) to be tracked by the tracking camera 200, and the tracked poses can be used to provide navigation guidance to a surgical procedure and/or used to control movement of the surgical robot 102 for guiding the end-effector 112 and/or an instrument.
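  • One standard way the pose of a reference array can be determined from its detected markers, offered here as a sketch and not necessarily the method used by the tracking camera 200, is an SVD-based rigid (Kabsch) fit between the array's known marker geometry and the observed 3D marker positions:

```python
import numpy as np

def estimate_array_pose(model_pts, observed_pts):
    # Rigid transform (R, t) mapping the reference array's known marker
    # layout (model_pts, Nx3, in the array's own frame) onto the
    # camera-frame detections (observed_pts, Nx3): observed ~= R @ model + t.
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```

  With noiseless, non-degenerate marker positions the fit is exact; with real detections it minimizes the least-squares marker residual.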
  • Referring to FIG. 3, the surgical robot system 100 includes the surgical robot 102 including a display 110, upper arm 306, lower arm 308, end-effector 112, vertical column 312, casters 314, tablet drawer 318, and ring 324 which uses lights to indicate statuses and other information. Cabinet 106 may house certain components of surgical robot system 100 including but not limited to a battery, a power distribution module, a platform interface board module, and a computer. The tracking camera 200 is supported by the camera tracking system 202.
  • FIGS. 4A and 4B illustrate medical imaging systems 1304 that may be used in conjunction with the camera tracking system 202 for navigated surgery, to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210. Any necessary anatomical structure may be imaged for any appropriate procedure using the imaging system 1304. The imaging system 1304 may be any imaging device such as a C-arm computerized tomography (CT) scan device 1308, an O-arm CT scan device 1306, a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for frequent manual repositioning of patient 210 which may be required in an x-ray system. As illustrated in FIG. 4A, the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape. C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316. The space within C-arm 1308 may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318. As illustrated in FIG. 4B, the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328, such as a wheeled mobile cart 1330 with wheels 1332, which may enclose an image capturing portion, not illustrated. The image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion. The image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition.
The image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes. Although certain imaging systems 1304 are exemplified herein, it will be appreciated that any suitable imaging system may be selected by one of ordinary skill in the art.
  • XR Headset View of 2D Medical Image Slices of Patient Anatomical Structure and 3D Graphical Model of Anatomical Structure
  • As was explained above, in traditional computer-assisted navigated surgeries, perpendicular 2D scan slices, such as axial, sagittal, and/or coronal views, of patient anatomical structure are used to visualize the patient's anatomy alongside the relative poses of surgical instruments. In accordance with various embodiments of the present disclosure, an XR headset is controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy. The 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which is not necessarily formed from a scan of the patient.
  • When the 3D graphical model is displayed concurrently with the one or more 2D scan slices through the XR headset, it can be difficult for a user, without further computer-aided assistance, to understand how the 3D graphical model of anatomical structure geometrically relates to the anatomical structure captured in the one or more 2D scan slices. Various embodiments of the present disclosure are directed to providing a navigated surgery system that enables a user wearing the XR headset to visualize how the displayed 2D medical image slice of anatomical structure of a patient relates geometrically to a displayed 3D graphical model of anatomical structure.
  • As will be explained in further detail below, in some embodiments a navigated surgery system displays a graphical object through the XR headset that visually indicates a virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice. Additional information, such as the orientation of the 2D scan slice and, e.g., a current vertebral level, may be displayed relative to the 3D graphical model.
  • Various embodiments display graphical objects that enable a user to visualize the pose of the cross-sectional plane(s) where the one or more 2D scan slice(s) geometrically correspond to visual “slice(s)” through the 3D graphical model. Although various embodiments are described in the context of orthopedic surgery, they are not limited to any type of surgery.
  • A navigation “plan” for navigated implanting of screws and/or other devices may be viewed based on navigation guidance information that is provided to the XR headsets 150 a and 150 b and/or 2D monitor for display.
  • Basic Display of 2D Scan Slices and 3D Graphical Model without Visualization of Geometric Correspondence:
  • FIG. 5 illustrates an XR headset view of an axial 2D medical image slice 500 of anatomical structure of a patient, a sagittal 2D medical image slice 510 of the anatomical structure of the patient, and a 3D graphical model 520 of anatomical structure, in accordance with some embodiments. The illustration of FIG. 5 does not include a computer-generated graphical object which is configured to visually assist the user (wearer of the XR headset) with determining how the 2D medical image slices 500 and 510 geometrically relate to the 3D graphical model 520. The 3D graphical model 520 may be registered to be displayed at or above the patient anatomy, when the patient is viewed through the XR headset, i.e., patient stabilized display. The axial 2D medical image slice 500 and the sagittal 2D medical image slice 510 may be registered to the user's head, i.e., head stabilized, so that they remain visible as the user looks around the surgical room. The XR headset may be further controlled to display other navigated surgery information, such as graphical representations of planned screw and interbody placement poses relative to the patient viewed through the XR headset and/or CAD graphical models which are displayed with poses that are updated to dynamically track sensed instrument poses.
  • Geometric Correspondence Visualization Between 2D Scan Slices and 3D Graphical Model:
  • FIG. 6 illustrates an XR headset view of two graphical objects 600 and 610 which are displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 of FIG. 5 that correspond to the anatomical structure of the axial and sagittal 2D medical image slices 500 and 510, respectively, of FIG. 5 , in accordance with some embodiments.
  • Referring to FIG. 6 , the graphical object 600 is displayed as a cross-sectional plane that extends through the 3D graphical model 520 with a pose that corresponds to where the 2D axial medical image slice 500 slices through the anatomical structure of the 3D graphical model 520. Similarly, the other graphical object 610 is displayed as another cross-sectional plane that extends through the 3D graphical model 520 with a pose that corresponds to where the 2D sagittal medical image slice 510 slices through the anatomical structure of the 3D graphical model 520. In this manner, a user wearing the XR headset is able to intuitively visualize the geometric relationship between the axial and sagittal 2D medical image slices 500 and 510, respectively, and the anatomical structure of the 3D graphical model 520.
  • FIG. 7 illustrates another XR headset view that adds to the display of FIG. 6 . A coronal 2D medical image slice 700 of the anatomical structure of the patient is displayed through the XR headset along with the axial medical image slice 500 and the sagittal 2D medical image slice 510. The XR headset is also controlled to display a graphical object 710 with a pose that is defined to visually illustrate to the user a virtual cross-sectional plane extending through the 3D graphical model 520 that corresponds to the anatomical structure of the coronal 2D medical image slice, in accordance with some embodiments.
  • Although operations have been described in the context of the example FIGS. 5 through 7 , embodiments of the present disclosure are not limited thereto. More general corresponding operations are now explained with reference to FIG. 12 . FIG. 12 illustrates a flowchart of operations by a navigated surgery system in accordance with some embodiments. Embodiments are not limited to the order of operations shown in FIG. 12 or to including all illustrated operations. For example, at least operations 1208 through 1212 are optional.
  • Referring to FIG. 12 , the navigated surgery system operates to obtain 1200 a first 2D medical image slice of anatomical structure of a patient, such as one of the slices 500, 510, and 700, from a medical image scanner or image database. The system obtains 1202 a 3D graphical model of anatomical structure, such as the model 520, from a medical image scanner, image database, or model database or generator. The system determines 1204 a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice. The system controls 1206 the XR headset to display the first 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the anatomical structure.
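The pose determination of operations 1204/1210 can be sketched in simplified form. The snippet below is a hypothetical illustration only, assuming the image volume's axes are already registered to the 3D graphical model's coordinate frame and that slices are stacked at a uniform spacing; the function name, the orientation-to-normal mapping, and the (position, normal) plane representation are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical mapping from slice orientation to the stacking-axis normal,
# assuming the image volume's axes are registered to the 3D model's frame.
PLANE_NORMALS = {
    "axial":    (0.0, 0.0, 1.0),   # slices stacked along superior-inferior (z)
    "sagittal": (1.0, 0.0, 0.0),   # slices stacked along left-right (x)
    "coronal":  (0.0, 1.0, 0.0),   # slices stacked along anterior-posterior (y)
}

def slice_plane_pose(orientation, slice_index, spacing_mm, origin):
    """Return (position, normal) of the virtual cross-sectional plane, in the
    model frame, for the given 2D image slice (cf. operations 1204/1210)."""
    normal = PLANE_NORMALS[orientation]
    offset = slice_index * spacing_mm
    # Step from the first slice's origin along the stacking axis.
    position = tuple(o + c * offset for o, c in zip(origin, normal))
    return position, normal
```

A graphical object such as plane 600 or 610 could then be rendered at the returned position, oriented so its face is perpendicular to the returned normal.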
  • The navigated surgery system may control the XR headset to display a graphical representation of a plane overlaid with the first pose on the 3D graphical model of the anatomical structure, such as the plane 610 overlaid on the model 520. The graphical representation of the plane may be provided to the XR headset for display as a shaded and/or colored box overlaid with the first pose on the 3D graphical model of the anatomical structure, such as the shaded plane 610 overlaid on the model 520.
  • The navigated surgery system may display more than one 2D medical image slice such as illustrated in FIGS. 5 through 7 . Accordingly, the system may obtain 1208 a second 2D medical image slice of the anatomical structure of the patient, wherein the first 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the second 2D medical image slice, and determine 1210 a second pose of a second virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the second 2D medical image slice. The system can then control 1212 the XR headset to display the second 2D medical image slice of the anatomical structure of the patient and to display a second graphical object oriented with the second pose relative to the 3D graphical model of the anatomical structure.
  • The first 2D medical image slice may be an axial image slice of the anatomical structure of the patient and the second 2D medical image slice may be a sagittal image slice of the anatomical structure of the patient, such as those illustrated in FIGS. 5 through 7 .
  • The navigated surgery system may further operate to control the XR headset to display a graphical representation of a first plane, e.g., 600 in FIG. 7 , overlaid with the first pose on the 3D graphical model of the anatomical structure, and control the XR headset to display a second graphical representation of a second plane, e.g., 610 in FIG. 7 , overlaid with the second pose on the 3D graphical model of the anatomical structure.
  • The navigated surgery system may further operate to control the XR headset to use a first color and/or shading to render at least part of the first 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the first plane for display, and control the XR headset to use a second color and/or shading, which is different from the first color and/or shading, to render at least part of the second 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the second plane for display. In this manner, the user is able to intuitively understand how each of the two 2D medical image slices geometrically correspond to cross-sectional slices through the 3D graphical model of the anatomical structure.
  • The navigated surgery system may further operate to obtain a third 2D medical image slice of the anatomical structure of the patient, where the third 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the first and second 2D medical image slices. The third 2D medical image slice may be a coronal image slice of the anatomical structure of the patient. The system determines a third pose of a third virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the third 2D medical image slice. The system controls the XR headset to display the third 2D medical image slice, e.g., 700 in FIG. 7 , of the anatomical structure of the patient and to display a third graphical object, e.g., 710 in FIG. 7 , oriented with the third pose relative to the 3D graphical model of the anatomical structure.
  • Tool-Centric Visualization of 2D Scan Slice:
  • During a navigated surgical procedure, the standard axial 2D image slice and/or sagittal 2D image slice visualization may be swapped for a tool-centric visualization whereby perpendicular image slice(s) are selected among image slices forming an image volume based on the tip of a tracked surgical instrument (tool). The image slice(s) can be displayed as overlay(s) on the 3D graphical model of the anatomical structure and/or on the patient viewed through a see-through screen of the XR headset.
  • FIG. 8 illustrates an XR headset view of axial and sagittal 2D image slices 800 and 810, respectively, which are displayed with poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 of FIG. 5 , and where the 2D image slices are dynamically selected and posed responsive to tracking pose of a tip of a tool 820 (instrument, etc.), in accordance with some embodiments. FIG. 13 illustrates a flowchart of corresponding operations by a navigated surgery system in accordance with some embodiments.
  • Referring to FIGS. 8 and 13 , the navigated surgery system is operative to obtain 1300 a tracked pose of a tip of a tool 820 being tracked by a camera tracking system relative to the anatomical structure of the patient. The system operates to select 1302 a 2D medical image slice 810 from among a set of 2D medical image slices forming an imaged volume of the anatomical structure of the patient, based on the tracked pose of the tip of the tool 820 relative to the anatomical structure of the patient.
  • Thus, for example, as a surgeon moves the tool tip within the displayed spine of the 3D graphical model 520, the navigated surgery system responds to the updated pose locations of the tool tip by selecting, from among the set of 2D image slices, and displaying through the XR headset corresponding 2D image slices. In this manner, the surgeon can dynamically reposition the tool 820 to see corresponding 2D image slices of the patient's anatomy, i.e., spine illustrated in FIG. 8 .
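The slice selection of operation 1302 might be sketched as a nearest-slice lookup along the stacking axis of the slice set. This is a minimal, hypothetical sketch assuming the tip pose has already been transformed into the image volume's frame; the function name, parameters, and clamping behavior are illustrative assumptions rather than the patent's implementation.

```python
def select_slice_index(tip_position, axis, origin, spacing_mm, num_slices):
    """Pick the 2D image slice nearest the tracked tool tip (cf. operation
    1302). `axis` is 0, 1, or 2: the stacking axis of the slice set. The tip
    position is assumed already transformed into the image volume's frame."""
    # Signed distance of the tip from the first slice along the stacking axis.
    offset = tip_position[axis] - origin[axis]
    index = round(offset / spacing_mm)
    # Clamp so a tip just outside the imaged volume still maps to a valid slice.
    return max(0, min(num_slices - 1, index))
```

Re-running this lookup each time the camera tracking system reports an updated tool pose yields the dynamic slice reselection described above.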
  • Fixed-Plane 3D Graphical Model Rotation:
  • Another operational embodiment aids a surgeon with visualization by keeping the slice planes fixed while allowing manual rotation of the 3D graphical model 520. The operations allow the surgeon to spin the 3D graphical model 520 around while maintaining fixed viewing planes of the axial and sagittal 2D image slices.
  • FIG. 9A illustrates a view displayed through the XR headset of a starting orientation of the axial 2D medical image slice 500 and the sagittal 2D image slice 510, and graphical objects 600 and 610 being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model 520 that correspond to the anatomical structure of the viewed axial and sagittal 2D image slices, in accordance with some embodiments. The graphical object 600 is illustrated as the virtual cross-sectional plane corresponding to the axial 2D medical image slice 500. The other graphical object 610 is illustrated as the virtual cross-sectional plane corresponding to the sagittal 2D image slice 510.
  • In FIG. 9B the 3D graphical model 520 is rotated 90 degrees around the vertical axis to be illustrated as viewed model 920, which effectively switches the axial slice 500 view and the sagittal slice 510 view. While rotating between these points, the surgeon is able to view all intermediate slice orientations with a clear and intuitive visualization. FIG. 9B illustrates a view displayed through the XR headset of a 90 degree rotated orientation of the axial 2D medical image slice 500 and the sagittal 2D image slice 510, and graphical objects 910 and 900 being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 90 degree rotated orientation of the 3D graphical model 920, in accordance with some embodiments. The graphical object 910 is illustrated as the virtual cross-sectional plane corresponding to the axial 2D medical image slice 500. The other graphical object 900 is illustrated as the virtual cross-sectional plane corresponding to the sagittal 2D image slice 510.
  • A corresponding operation by the navigated surgery system can include responding to a rotation command from a user by controlling the XR headset to display an angularly rotated view of the 3D graphical model of the anatomical structure while displaying the first graphical object oriented with the first pose.
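Because the viewing planes keep their display-frame poses while the model rotates, the slice content to show through each fixed plane can be found by expressing the plane's normal in the model's own frame, i.e., by applying the inverse of the model rotation. The sketch below is a hypothetical illustration assuming rotation about a vertical y axis; the function names and frame conventions are assumptions, not the patent's implementation.

```python
import math

def rotate_about_vertical(point, angle_deg):
    """Rotate a 3D point about the vertical (y) axis of the display frame."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) + z * math.sin(a), y,
            -x * math.sin(a) + z * math.cos(a))

def plane_normal_in_model_frame(world_normal, model_rotation_deg):
    """The viewing plane stays fixed in the display frame while the model is
    rotated, so the plane normal expressed in the model's own frame is the
    world normal rotated by the inverse of the model rotation."""
    return rotate_about_vertical(world_normal, -model_rotation_deg)
```

Under these assumptions, a 90 degree model rotation carries a plane whose model-frame normal was along one horizontal axis onto the orthogonal horizontal axis, consistent with the axial/sagittal view swap described for FIG. 9B; intermediate angles yield the intermediate slice orientations the surgeon sees while spinning the model.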
  • 2D Image Slice Orientation Visualization:
  • Some other embodiments are directed to displaying further information which enables visualization of the orientations of the 2D image slices. Sagittal image slices may be flipped in order to match the orientation from which a surgeon is viewing the patient's spine or other anatomy, and the orientation of the axial 2D image slices can then be difficult to perceive because the axial 2D image slices have few asymmetries. To enable more intuitive and accurate visualization of the corresponding 2D image slice orientations, overlays can be shaded and/or colored to visually indicate the orientation of the 2D image slice. The current orientation of the 2D image slice can be shown with an overlay, and if viewed from behind (or if the 2D image slice view is flipped) the overlay can be rendered as a hollow outline, or vice versa.
  • FIGS. 10A, 10B, 10C, and 10D illustrate four alternative views displayed through the XR headset of an axial unmirrored and sagittal unmirrored view, an axial unmirrored and sagittal mirrored view, an axial mirrored and sagittal unmirrored view, and an axial mirrored and sagittal mirrored view, respectively, and graphical objects being displayed with respective poses defined to visually illustrate to the user virtual cross-sectional planes extending through the 3D graphical model, in accordance with some embodiments.
  • More particularly, FIG. 10A illustrates the axial unmirrored and sagittal unmirrored view in which the cross-sectional objects 600 and 610 are both shaded and/or colored. FIG. 10B illustrates the axial unmirrored and sagittal mirrored view in which the cross-sectional object 1010 corresponding to the sagittal 2D image slice is not shaded and/or colored to indicate it is a mirrored view. FIG. 10C illustrates the axial mirrored and sagittal unmirrored view in which the cross-sectional object 1000 corresponding to the axial 2D image slice is not shaded and/or colored to indicate it is a mirrored view. FIG. 10D illustrates the axial mirrored and sagittal mirrored view in which the cross-sectional object 1000 corresponding to the axial 2D image slice is not shaded and/or colored to indicate it is a mirrored view and in which the cross-sectional object 1010 corresponding to the sagittal 2D image slice is not shaded and/or colored to indicate it is a mirrored view.
  • Accordingly, while wearing the XR headset, the surgeon can move around the patient to view an overlay from an opposite side which uses an opposite shading and/or color effect to visually illustrate the different viewing perspectives. In some embodiments the operations for the shading or hollow representations are reversed, and/or a visual indication (cue) is added, e.g., as a star or other symbol in the upper left corner of the corresponding slices. In some embodiments a gradient across the slice background is used to intuitively indicate the viewed directionality.
  • Corresponding operations that may be performed by the navigated surgery system are illustrated in the flowchart of FIG. 14 . Referring to FIG. 14 , the operations obtain 1400 a tracked pose of the XR headset relative to the anatomical structure of the patient. The operations determine 1402 based on the tracked pose whether to flip orientation of a rendering of the first 2D medical image slice to be provided to the XR headset for display. The operations then select 1404 between two graphically distinct objects to be rendered as the first graphical object for display responsive to whether the determination is to flip orientation of the rendering of the first 2D medical image slice to be provided to the XR headset for display.
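Operations 1402 and 1404 might be sketched with a simple viewing-direction test: if the wearer views the cross-sectional plane from behind, the rendered 2D slice is mirrored and a graphically distinct plane object is selected. This is a hypothetical sketch; the dot-product criterion, function names, and the two object labels are illustrative assumptions rather than the patent's implementation.

```python
def should_flip_slice(headset_forward, plane_normal):
    """Cf. operation 1402: if the wearer is looking at the cross-sectional
    plane from behind (view direction and plane normal point the same way),
    mirror the rendered 2D slice."""
    dot = sum(f * n for f, n in zip(headset_forward, plane_normal))
    return dot > 0.0

def pick_plane_object(flip):
    """Cf. operation 1404: choose between two graphically distinct renderings,
    e.g., a shaded/filled plane for the unmirrored view and a hollow outline
    for the mirrored view (or vice versa, per some embodiments)."""
    return "hollow_outline" if flip else "shaded_plane"
```

Re-evaluating `should_flip_slice` as the tracked headset pose updates lets the overlay style follow the surgeon as they walk around the patient.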
  • Current Level Visualization:
  • During surgery, it is important for a surgeon or other user to know the active vertebral level at all times. Incorrectly identifying levels or associating a level on the 2D image scan with the wrong level on a patient can be catastrophic during surgery. 3D graphical model visualizations can help resolve this issue. The navigated surgery system may operate to identify the selected (active) level in the 2D image scan and display a highlighted zone (or other 3D graphical object) on the currently selected level. When surgical implants or tracked instruments appear on the wrong level, such an improper location (pose) will be immediately visually recognizable by the surgeon because the implants and instruments would appear outside of the highlighted zone. These operations enable such possible mistakes to be more intuitively and accurately detected and corrected.
  • Corresponding operations that may be performed by the navigated surgery system are illustrated in the flowchart of FIG. 15 . Referring to FIG. 15 , the operations control 1500 the XR headset to display a 3D graphical object 1100 with a pose and extending to overlie a region of the 3D graphical model of the anatomical structure corresponding to where a surgical procedure is to be performed on the anatomical structure of the patient. In the particular example of FIG. 15 , 3D graphical object 1100 is rendered as a shaded 3D rectangular object that is posed to correspond to a level of the spine where a surgical procedure is to be performed. The operations obtain 1502 a tracked pose of a tool or an implant being tracked by a camera tracking system relative to the anatomical structure of the patient, e.g., spine illustrated in the 3D graphical model 520. The operations control 1504 the XR headset to display a graphical representation of the tool or the implant overlaid with the tracked pose on the 3D graphical model 520 of the anatomical structure while continuing to display the 3D graphical object 1100. In this manner, a wearer of the XR headset can visually determine in an intuitive manner whether the tool or implant is presently positioned and oriented (e.g., posed) at the correct level of the spine.
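Where the highlighted zone is a shaded 3D rectangular object like object 1100, the wrong-level condition could be sketched as a simple containment test against the zone's bounds. This is a hypothetical sketch assuming an axis-aligned bounding box in the model frame; the function name and bounds representation are illustrative assumptions, not the patent's implementation.

```python
def is_within_active_level(tracked_position, level_bounds):
    """Test whether a tracked tool or implant tip lies inside the highlighted
    zone (e.g., 3D graphical object 1100) marking the active vertebral level.
    `level_bounds` is ((xmin, ymin, zmin), (xmax, ymax, zmax)) in the model
    frame; positions outside it would be flagged as a possible wrong level."""
    lo, hi = level_bounds
    return all(l <= p <= h for p, l, h in zip(tracked_position, lo, hi))
```

A system could evaluate this test on each tracking update and, when it fails, rely on the visual mismatch between the tool's rendered pose and the highlighted zone to alert the surgeon.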
  • Example Surgical System
  • FIG. 16 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 1600, imaging devices, and a surgical robot 102 which are configured to operate in accordance with various embodiments.
  • The imaging devices may include the C-arm imaging device 1304, the O-arm imaging device 1306, and/or a patient image database 1620. The XR headset 150 provides an improved human interface for performing navigated surgical procedures. The XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 1600, that include without limitation any one or more of: identification of hand gesture based commands, and display of XR graphical objects on a display device 1612. The display device 1612 may be a video projector, flat panel display, etc. The user can view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen. The XR headset 150 may additionally or alternatively be configured to display on the display device 1612 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 1622, a microphone 1620, a gesture sensor 1618, a pose sensor (e.g., inertial measurement unit (IMU)) 1616, the display device 1612, and a wireless/wired communication interface 1624. The cameras 1622 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • The cameras 1622 may be configured to operate as the gesture sensor 1618 by tracking user hand gestures performed within the field of view of the camera(s) 1622 for identification. Alternatively the gesture sensor 1618 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 1618 and/or senses physical contact, e.g. tapping on the sensor 1618 or its enclosure. The pose sensor 1616, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • As explained above, a surgical system includes a camera tracking system 202 which may be part of a computer platform 1600 that can also provide functionality of a navigation controller 1604 and/or of an XR headset controller 1610. The surgical system may include the imaging devices and/or a surgical robot 102. The navigation controller 1604 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 202. The navigation controller 1604 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 102, where the navigation information is used to display information through the XR headset 150 to indicate where the surgical tool and/or the end effector of the surgical robot 102 should be moved to perform the surgical plan.
  • The electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 1600 through a wired/wireless interface 1624. The electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 1600 or directly connected, to various imaging devices, e.g., the C-arm imaging device 1304, the O-arm imaging device 1306, the patient image database 1620, and/or to other medical equipment through the wired/wireless interface 1624.
  • The surgical system further includes at least one XR headset controller 1610 (also referred to as “controller 1610” for brevity) that may reside in the XR headset 150, the computer platform 1600, and/or in another system component connected via wired cables and/or wireless communication links. Various functionality is provided by software executed by the XR headset controller 1610. The XR headset controller 1610 is configured to receive information from the camera tracking system 202 and the navigation controller 1604, and to generate an XR image based on the information for display on the display device 1612.
  • The XR headset controller 1610 can be configured to operationally process signaling from the cameras 1622, the microphone 1620, and/or the pose sensor 1616, and is connected to display XR images on the display device 1612 for user viewing. Thus, the XR headset controller 1610 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user. For example, the XR headset controller 1610 may reside within the computer platform 1600 which, in turn, may reside within a housing of the surgical robot 102, the camera tracking system 202, etc.
  • Further Definitions and Embodiments
  • In the above-description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
  • It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
  • As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
  • It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
  • Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims (20)

What is claimed is:
1. A navigated surgery system comprising at least one processor operative to:
obtain a first two-dimensional (2D) medical image slice of anatomical structure of a patient;
obtain a three-dimensional (3D) graphical model of the anatomical structure;
determine a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the first 2D medical image slice; and
control an extended reality (XR) headset to display the first 2D medical image slice of the anatomical structure of the patient, display the 3D graphical model of the anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the anatomical structure.
2. The navigated surgery system of claim 1, wherein the at least one processor is further operative to control the XR headset to display a graphical representation of a plane overlaid with the first pose on the 3D graphical model of the anatomical structure.
3. The navigated surgery system of claim 2, wherein the graphical representation of the plane is provided to the XR headset for display as a shaded and/or colored box overlaid with the first pose on the 3D graphical model of the anatomical structure.
4. The navigated surgery system of claim 1, wherein the at least one processor is further operative to:
obtain a second 2D medical image slice of the anatomical structure of the patient, wherein the first 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the second 2D medical image slice;
determine a second pose of a second virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the second 2D medical image slice; and
control the XR headset to display the second 2D medical image slice of the anatomical structure of the patient and to display a second graphical object oriented with the second pose relative to the 3D graphical model of the anatomical structure.
5. The navigated surgery system of claim 4, wherein the first 2D medical image slice is an axial image slice of the anatomical structure of the patient and the second 2D medical image slice is a sagittal image slice of the anatomical structure of the patient.
6. The navigated surgery system of claim 4, wherein the at least one processor is further operative to:
control the XR headset to display a graphical representation of a first plane overlaid with the first pose on the 3D graphical model of the anatomical structure; and
control the XR headset to display a graphical representation of a second plane overlaid with the second pose on the 3D graphical model of the anatomical structure.
7. The navigated surgery system of claim 6, wherein the at least one processor is further operative to:
control the XR headset to use a first color to render at least part of the first 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the first plane for display; and
control the XR headset to use a second color, which is different from the first color, to render at least part of the second 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the second plane for display.
8. The navigated surgery system of claim 4, wherein the at least one processor is further operative to:
obtain a third 2D medical image slice of the anatomical structure of the patient, wherein the third 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the first and second 2D medical image slices;
determine a third pose of a third virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the third 2D medical image slice; and
control the XR headset to display the third 2D medical image slice of the anatomical structure of the patient and to display a third graphical object oriented with the third pose relative to the 3D graphical model of the anatomical structure.
9. The navigated surgery system of claim 8, wherein:
the first 2D medical image slice is an axial image slice of the anatomical structure of the patient;
the second 2D medical image slice is a sagittal image slice of the anatomical structure of the patient; and
the third 2D medical image slice is a coronal image slice of the anatomical structure of the patient.
10. The navigated surgery system of claim 1, wherein the at least one processor is further operative to:
obtain a tracked pose of a tip of a tool being tracked by a camera tracking system relative to the anatomical structure of the patient;
select the first 2D medical image slice from among a set of 2D medical image slices forming an imaged volume of the anatomical structure of the patient, based on the tracked pose of the tip of the tool relative to the anatomical structure of the patient.
11. The navigated surgery system of claim 1, wherein the at least one processor is further operative to:
responsive to a rotation command from a user, control the XR headset to display an angularly rotated view of the 3D graphical model of the anatomical structure while displaying the first graphical object oriented with the first pose.
12. The navigated surgery system of claim 1, wherein the at least one processor is further operative to:
obtain a tracked pose of the XR headset relative to the anatomical structure of the patient;
determine based on the tracked pose whether to flip orientation of a rendering of the first 2D medical image slice to be provided to the XR headset for display; and
select between two graphically distinct objects to be rendered as the first graphical object for display responsive to whether the determination is to flip orientation of the rendering of the first 2D medical image slice to be provided to the XR headset for display.
13. The navigated surgery system of claim 1, wherein the at least one processor is further operative to:
control the XR headset to display the first graphical object with the first pose and extending overlying a region of the 3D graphical model of the anatomical structure corresponding to where a surgical procedure is to be performed on the anatomical structure of the patient;
obtain a tracked pose of a tool or an implant being tracked by a camera tracking system relative to the anatomical structure of the patient;
control the XR headset to display a graphical representation of the tool or the implant overlaid with the tracked pose on the 3D graphical model of the anatomical structure while continuing to display the first graphical object.
14. A navigated surgery system comprising at least one processor operative to:
obtain a first two-dimensional (2D) medical image slice of a spinal anatomical structure of a patient;
obtain a three-dimensional (3D) graphical model of the spinal anatomical structure;
determine a first pose of a first virtual cross-sectional plane extending through the 3D graphical model of the spinal anatomical structure that corresponds to the spinal anatomical structure of the first 2D medical image slice; and
control an extended reality (XR) headset to display the first 2D medical image slice of the spinal anatomical structure of the patient, display the 3D graphical model of the spinal anatomical structure, and display a first graphical object oriented with the first pose relative to the 3D graphical model of the spinal anatomical structure;
wherein the at least one processor is further operative to:
obtain a second 2D medical image slice of the anatomical structure of the patient, wherein the first 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the second 2D medical image slice;
determine a second pose of a second virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the second 2D medical image slice; and
control the XR headset to display the second 2D medical image slice of the anatomical structure of the patient and to display a second graphical object oriented with the second pose relative to the 3D graphical model of the anatomical structure.
15. The navigated surgery system of claim 14, wherein the at least one processor is further operative to control the XR headset to display a graphical representation of a plane overlaid with the first pose on the 3D graphical model of the anatomical structure.
16. The navigated surgery system of claim 15, wherein the graphical representation of the plane is provided to the XR headset for display as a shaded and/or colored box overlaid with the first pose on the 3D graphical model of the anatomical structure.
17. The navigated surgery system of claim 14, wherein the first 2D medical image slice is an axial image slice of the anatomical structure of the patient and the second 2D medical image slice is a sagittal image slice of the anatomical structure of the patient.
18. The navigated surgery system of claim 14, wherein the at least one processor is further operative to:
control the XR headset to display a graphical representation of a first plane overlaid with the first pose on the 3D graphical model of the anatomical structure; and
control the XR headset to display a graphical representation of a second plane overlaid with the second pose on the 3D graphical model of the anatomical structure.
19. The navigated surgery system of claim 18, wherein the at least one processor is further operative to:
control the XR headset to use a first color to render at least part of the first 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the first plane for display; and
control the XR headset to use a second color, which is different from the first color, to render at least part of the second 2D medical image slice of the anatomical structure of the patient and to render at least part of the graphical representation of the second plane for display.
20. The navigated surgery system of claim 14, wherein the at least one processor is further operative to:
obtain a third 2D medical image slice of the anatomical structure of the patient, wherein the third 2D medical image slice is an angularly offset image slice of the anatomical structure of the patient relative to the first and second 2D medical image slices;
determine a third pose of a third virtual cross-sectional plane extending through the 3D graphical model of the anatomical structure that corresponds to the anatomical structure of the third 2D medical image slice; and
control the XR headset to display the third 2D medical image slice of the anatomical structure of the patient and to display a third graphical object oriented with the third pose relative to the 3D graphical model of the anatomical structure.
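The slice-selection and plane-pose operations recited in the claims above (e.g., claim 10's selection of a 2D slice based on a tracked tool tip, and claim 1's determination of a virtual cross-sectional plane pose) can be sketched as follows. This is an illustrative sketch only, not the claimed implementation: the function names are hypothetical, and it assumes an axial slice stack along the imaged volume's z axis with uniform slice spacing.

```python
def select_axial_slice(tip_position_mm, volume_origin_mm, slice_spacing_mm, num_slices):
    """Pick the index of the axial 2D slice nearest a tracked tool tip.

    Assumes the slices are stacked along the volume's z axis, with
    slice i centered at volume_origin_mm[2] + i * slice_spacing_mm.
    """
    z_offset = tip_position_mm[2] - volume_origin_mm[2]
    index = int(round(z_offset / slice_spacing_mm))
    return min(max(index, 0), num_slices - 1)  # clamp to the imaged volume


def slice_plane_pose(volume_origin_mm, slice_spacing_mm, slice_index):
    """4x4 homogeneous pose of the virtual cross-sectional plane for an
    axial slice, expressed in the volume frame: identity rotation (plane
    normal along +z), translated to the selected slice's z level."""
    z = volume_origin_mm[2] + slice_index * slice_spacing_mm
    return [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]
```

For a sagittal or coronal slice (as in claims 5 and 9), the same pattern applies with the stacking axis and the plane's rotation changed accordingly.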
US17/539,796 2021-12-01 2021-12-01 Extended reality systems with three-dimensional visualizations of medical image scan slices Pending US20230165639A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/539,796 US20230165639A1 (en) 2021-12-01 2021-12-01 Extended reality systems with three-dimensional visualizations of medical image scan slices
US17/540,319 US12232820B2 (en) 2021-12-01 2021-12-02 Extended reality systems with three-dimensional visualizations of medical image scan slices
US19/041,011 US20250228624A1 (en) 2021-12-01 2025-01-30 Extended reality systems with three-dimensional visualizations of medical image scan slices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/539,796 US20230165639A1 (en) 2021-12-01 2021-12-01 Extended reality systems with three-dimensional visualizations of medical image scan slices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/540,319 Continuation US12232820B2 (en) 2021-12-01 2021-12-02 Extended reality systems with three-dimensional visualizations of medical image scan slices

Publications (1)

Publication Number Publication Date
US20230165639A1 true US20230165639A1 (en) 2023-06-01

Family

ID=86501164

Family Applications (3)

Application Number Title Priority Date Filing Date
US17/539,796 Pending US20230165639A1 (en) 2021-12-01 2021-12-01 Extended reality systems with three-dimensional visualizations of medical image scan slices
US17/540,319 Active 2043-01-09 US12232820B2 (en) 2021-12-01 2021-12-02 Extended reality systems with three-dimensional visualizations of medical image scan slices
US19/041,011 Pending US20250228624A1 (en) 2021-12-01 2025-01-30 Extended reality systems with three-dimensional visualizations of medical image scan slices

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/540,319 Active 2043-01-09 US12232820B2 (en) 2021-12-01 2021-12-02 Extended reality systems with three-dimensional visualizations of medical image scan slices
US19/041,011 Pending US20250228624A1 (en) 2021-12-01 2025-01-30 Extended reality systems with three-dimensional visualizations of medical image scan slices

Country Status (1)

Country Link
US (3) US20230165639A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12020801B2 (en) 2018-06-19 2024-06-25 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12070272B2 (en) 2013-10-10 2024-08-27 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
WO2024253828A1 (en) * 2023-06-05 2024-12-12 Apple Inc. Techniques for three-dimensional environments
EP4620421A1 (en) * 2024-03-19 2025-09-24 Stryker European Operations Limited Automatic recentering of surgical navigation view on instrument tip
US12458446B2 (en) 2019-05-14 2025-11-04 Howmedica Osteonics Corp. Bone wall tracking and guidance for orthopedic implant placement
US12465374B2 (en) 2019-12-18 2025-11-11 Howmedica Osteonics Corp. Surgical guidance for surgical tools
US12472013B2 (en) 2019-11-26 2025-11-18 Howmedica Osteonics Corp. Virtual guidance for correcting surgical pin installation
US12496135B2 (en) 2021-02-02 2025-12-16 Howmedica Osteonics Corp. Mixed-reality humeral-head sizing and placement

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2536650A (en) 2015-03-24 2016-09-28 Augmedics Ltd Method and system for combining video-based and optic-based augmented reality in a near eye display
US12458411B2 (en) 2017-12-07 2025-11-04 Augmedics Ltd. Spinous process clamp
US11980507B2 (en) 2018-05-02 2024-05-14 Augmedics Ltd. Registration of a fiducial marker for an augmented reality system
US11766296B2 (en) 2018-11-26 2023-09-26 Augmedics Ltd. Tracking system for image-guided surgery
US12178666B2 (en) 2019-07-29 2024-12-31 Augmedics Ltd. Fiducial marker
US11382712B2 (en) 2019-12-22 2022-07-12 Augmedics Ltd. Mirroring in image guided surgery
US11389252B2 (en) 2020-06-15 2022-07-19 Augmedics Ltd. Rotating marker for image guided surgery
US12239385B2 (en) 2020-09-09 2025-03-04 Augmedics Ltd. Universal tool adapter
US12150821B2 (en) 2021-07-29 2024-11-26 Augmedics Ltd. Rotating marker and adapter for image-guided surgery
WO2023021450A1 (en) 2021-08-18 2023-02-23 Augmedics Ltd. Stereoscopic display and digital loupe for augmented-reality near-eye display
WO2023203521A1 (en) 2022-04-21 2023-10-26 Augmedics Ltd. Systems and methods for medical image visualization
EP4587881A1 (en) 2022-09-13 2025-07-23 Augmedics Ltd. Augmented reality eyewear for image-guided medical intervention

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070238981A1 (en) * 2006-03-13 2007-10-11 Bracco Imaging Spa Methods and apparatuses for recording and reviewing surgical navigation processes
US20090128618A1 (en) * 2007-11-16 2009-05-21 Samsung Electronics Co., Ltd. System and method for object selection in a handheld image capture device
US20190239926A1 (en) * 2007-12-18 2019-08-08 Howmedica Osteonics Corporation System and method for image segmentation, bone model generation and modification, and surgical planning
US20190247130A1 (en) * 2009-02-17 2019-08-15 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
EP3861956A1 (en) * 2020-02-04 2021-08-11 Globus Medical, Inc. Extended reality instrument interaction zone for navigated robotic surgery
WO2022072700A1 (en) * 2020-10-01 2022-04-07 Covidien Lp Systems and method of planning thoracic surgery

Family Cites Families (554)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2614083B2 (en) 1976-04-01 1979-02-08 Siemens Ag, 1000 Berlin Und 8000 Muenchen X-ray film device for the production of transverse slice images
US5354314A (en) 1988-12-23 1994-10-11 Medical Instrumentation And Diagnostics Corporation Three-dimensional beam localization apparatus and microscope for stereotactic diagnoses or surgery mounted on robotic type arm
US5246010A (en) 1990-12-11 1993-09-21 Biotrine Corporation Method and apparatus for exhalation analysis
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US6963792B1 (en) 1992-01-21 2005-11-08 Sri International Surgical method
US5631973A (en) 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
US5657429A (en) 1992-08-10 1997-08-12 Computer Motion, Inc. Automated endoscope system optimal positioning
US5397323A (en) 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
WO1994026167A1 (en) 1993-05-14 1994-11-24 Sri International Remote center positioner
JP3378401B2 (en) 1994-08-30 2003-02-17 株式会社日立メディコ X-ray equipment
US6646541B1 (en) 1996-06-24 2003-11-11 Computer Motion, Inc. General purpose distributed operating room control system
AU3950595A (en) 1994-10-07 1996-05-06 St. Louis University Surgical navigation systems including reference and localization frames
US6978166B2 (en) 1994-10-07 2005-12-20 Saint Louis University System for use in displaying images of a body part
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US5887121A (en) 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
US6122541A (en) 1995-05-04 2000-09-19 Radionics, Inc. Head band for frameless stereotactic registration
US5649956A (en) 1995-06-07 1997-07-22 Sri International System and method for releasably holding a surgical instrument
US5825982A (en) 1995-09-15 1998-10-20 Wright; James Head cursor control interface for an automated endoscope system for optimal positioning
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US5855583A (en) 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
SG64340A1 (en) 1996-02-27 1999-04-27 Inst Of Systems Science Nation Curved surgical instruments and methods of mapping a curved path for stereotactic surgery
US6167145A (en) 1996-03-29 2000-12-26 Surgical Navigation Technologies, Inc. Bone navigation system
US5792135A (en) 1996-05-20 1998-08-11 Intuitive Surgical, Inc. Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US6167296A (en) 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
US7302288B1 (en) 1996-11-25 2007-11-27 Z-Kat, Inc. Tool position indicator
US8529582B2 (en) 1996-12-12 2013-09-10 Intuitive Surgical Operations, Inc. Instrument interface of a robotic surgical system
US9050119B2 (en) 2005-12-20 2015-06-09 Intuitive Surgical Operations, Inc. Cable tensioning in a robotic surgical system
US7727244B2 (en) 1997-11-21 2010-06-01 Intuitive Surgical Operation, Inc. Sterile surgical drape
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
US6012216A (en) 1997-04-30 2000-01-11 Ethicon, Inc. Stand alone swage apparatus
US5820559A (en) 1997-03-20 1998-10-13 Ng; Wan Sing Computerized boundary estimation in medical images
US5911449A (en) 1997-04-30 1999-06-15 Ethicon, Inc. Semi-automated needle feed method and apparatus
US6231565B1 (en) 1997-06-18 2001-05-15 United States Surgical Corporation Robotic arm DLUs for performing surgical tasks
EP2362286B1 (en) 1997-09-19 2015-09-02 Massachusetts Institute Of Technology Robotic apparatus
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US5951475A (en) 1997-09-25 1999-09-14 International Business Machines Corporation Methods and apparatus for registering CT-scan data to multiple fluoroscopic images
US5987960A (en) 1997-09-26 1999-11-23 Picker International, Inc. Tool calibrator
US6212419B1 (en) 1997-11-12 2001-04-03 Walter M. Blume Method and apparatus using shaped field of repositionable magnet to guide implant
US6157853A (en) 1997-11-12 2000-12-05 Stereotaxis, Inc. Method and apparatus using shaped field of repositionable magnet to guide implant
US6031888A (en) 1997-11-26 2000-02-29 Picker International, Inc. Fluoro-assist feature for a diagnostic imaging device
US6165170A (en) 1998-01-29 2000-12-26 International Business Machines Corporation Laser dermablator and dermablation
US7169141B2 (en) 1998-02-24 2007-01-30 Hansen Medical, Inc. Surgical instrument
FR2779339B1 (en) 1998-06-09 2000-10-13 Integrated Surgical Systems Sa MATCHING METHOD AND APPARATUS FOR ROBOTIC SURGERY, AND MATCHING DEVICE COMPRISING APPLICATION
US6477400B1 (en) 1998-08-20 2002-11-05 Sofamor Danek Holdings, Inc. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
DE19839825C1 (en) 1998-09-01 1999-10-07 Siemens Ag Diagnostic X=ray device
US6033415A (en) 1998-09-14 2000-03-07 Integrated Surgical Systems System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system
DE19842798C1 (en) 1998-09-18 2000-05-04 Howmedica Leibinger Gmbh & Co Calibration device
WO2000021442A1 (en) 1998-10-09 2000-04-20 Surgical Navigation Technologies, Inc. Image guided vertebral distractor
US6659939B2 (en) 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US8527094B2 (en) 1998-11-20 2013-09-03 Intuitive Surgical Operations, Inc. Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures
US6325808B1 (en) 1998-12-08 2001-12-04 Advanced Realtime Control Systems, Inc. Robotic system, docking station, and surgical tool for collaborative control in minimally invasive surgery
US7125403B2 (en) 1998-12-08 2006-10-24 Intuitive Surgical In vivo accessories for minimally invasive robotic surgery
US6322567B1 (en) 1998-12-14 2001-11-27 Integrated Surgical Systems, Inc. Bone motion tracking system
US6451027B1 (en) 1998-12-16 2002-09-17 Intuitive Surgical, Inc. Devices and methods for moving an image capture device in telesurgical systems
US7016457B1 (en) 1998-12-31 2006-03-21 General Electric Company Multimode imaging system for generating high quality images
DE19905974A1 (en) 1999-02-12 2000-09-07 Siemens Ag Computer tomography scanning method using multi-line detector
US6560354B1 (en) 1999-02-16 2003-05-06 University Of Rochester Apparatus and method for registration of images to physical space using a weighted combination of points and surfaces
US6778850B1 (en) 1999-03-16 2004-08-17 Accuray, Inc. Frameless radiosurgery treatment system and method
US6501981B1 (en) 1999-03-16 2002-12-31 Accuray, Inc. Apparatus and method for compensating for respiratory and patient motions during treatment
US6144875A (en) 1999-03-16 2000-11-07 Accuray Incorporated Apparatus and method for compensating for respiratory and patient motion during treatment
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
JP2000271110A (en) 1999-03-26 2000-10-03 Hitachi Medical Corp Medical x-ray system
US6565554B1 (en) 1999-04-07 2003-05-20 Intuitive Surgical, Inc. Friction compensation in a minimally invasive surgical apparatus
US6424885B1 (en) 1999-04-07 2002-07-23 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US6594552B1 (en) 1999-04-07 2003-07-15 Intuitive Surgical, Inc. Grip strength with tactile feedback for robotic surgery
US6301495B1 (en) 1999-04-27 2001-10-09 International Business Machines Corporation System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan
DE19927953A1 (en) 1999-06-18 2001-01-11 Siemens Ag X=ray diagnostic apparatus
US6314311B1 (en) 1999-07-28 2001-11-06 Picker International, Inc. Movable mirror laser registration system
US6788018B1 (en) 1999-08-03 2004-09-07 Intuitive Surgical, Inc. Ceiling and floor mounted surgical robot set-up arms
US8271130B2 (en) 2009-03-09 2012-09-18 Intuitive Surgical Operations, Inc. Master controller having redundant degrees of freedom and added forces to create internal motion
US8004229B2 (en) 2005-05-19 2011-08-23 Intuitive Surgical Operations, Inc. Software center and highly configurable robotic systems for surgery and other uses
US7594912B2 (en) 2004-09-30 2009-09-29 Intuitive Surgical, Inc. Offset remote center manipulator for robotic surgery
US6312435B1 (en) 1999-10-08 2001-11-06 Intuitive Surgical, Inc. Surgical instrument with extended reach for use in minimally invasive surgery
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6499488B1 (en) 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
AU4311901A (en) 1999-12-10 2001-06-18 Michael I. Miller Method and apparatus for cross modality image registration
US7635390B1 (en) 2000-01-14 2009-12-22 Marctec, Llc Joint replacement component having a modular articulating surface
US6377011B1 (en) 2000-01-26 2002-04-23 Massachusetts Institute Of Technology Force feedback user interface for minimally invasive surgical simulator and teleoperator and other similar apparatus
WO2001056007A1 (en) 2000-01-28 2001-08-02 Intersense, Inc. Self-referenced tracking
WO2001064124A1 (en) 2000-03-01 2001-09-07 Surgical Navigation Technologies, Inc. Multiple cannula image guided tool for image guided procedures
WO2001067979A1 (en) 2000-03-15 2001-09-20 Orthosoft Inc. Automatic calibration system for computer-aided surgical instruments
US6535756B1 (en) 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
US6856827B2 (en) 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6856826B2 (en) 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6490475B1 (en) 2000-04-28 2002-12-03 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6614453B1 (en) 2000-05-05 2003-09-02 Koninklijke Philips Electronics, N.V. Method and apparatus for medical image display for surgical tool planning and navigation in clinical environments
US6645196B1 (en) 2000-06-16 2003-11-11 Intuitive Surgical, Inc. Guided tool change
US6782287B2 (en) 2000-06-27 2004-08-24 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for tracking a medical instrument based on image registration
US6837892B2 (en) 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
US6902560B1 (en) 2000-07-27 2005-06-07 Intuitive Surgical, Inc. Roll-pitch-roll surgical tool
DE10037491A1 (en) 2000-08-01 2002-02-14 Stryker Leibinger Gmbh & Co Kg Process for three-dimensional visualization of structures inside the body
US6823207B1 (en) 2000-08-26 2004-11-23 Ge Medical Systems Global Technology Company, Llc Integrated fluoroscopic surgical navigation and imaging workstation with command protocol
JP4022145B2 (en) 2000-09-25 2007-12-12 ゼット − キャット、インコーポレイテッド Fluoroscopic superposition structure with optical and / or magnetic markers
WO2002034152A1 (en) 2000-10-23 2002-05-02 Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts Method, device and navigation aid for navigation during medical interventions
US6718194B2 (en) 2000-11-17 2004-04-06 Ge Medical Systems Global Technology Company, Llc Computer assisted intramedullary rod surgery system with enhanced features
US6666579B2 (en) 2000-12-28 2003-12-23 Ge Medical Systems Global Technology Company, Llc Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system
US6840938B1 (en) 2000-12-29 2005-01-11 Intuitive Surgical, Inc. Bipolar cauterizing instrument
CN100491914C (en) 2001-01-30 2009-05-27 Z-凯特公司 Tool calibrator and tracker system
US7220262B1 (en) 2001-03-16 2007-05-22 Sdgi Holdings, Inc. Spinal fixation system and related methods
FR2822674B1 (en) 2001-04-03 2003-06-27 Scient X STABILIZED INTERSOMATIC MELTING SYSTEM FOR VERTEBERS
WO2002083003A1 (en) 2001-04-11 2002-10-24 Clarke Dana S Tissue structure identification in advance of instrument
US7824401B2 (en) 2004-10-08 2010-11-02 Intuitive Surgical Operations, Inc. Robotic tool with wristed monopolar electrosurgical end effectors
US6994708B2 (en) 2001-04-19 2006-02-07 Intuitive Surgical Robotic tool with monopolar electro-surgical scissors
US6783524B2 (en) 2001-04-19 2004-08-31 Intuitive Surgical, Inc. Robotic surgical tool with ultrasound cauterizing and cutting instrument
US8398634B2 (en) 2002-04-18 2013-03-19 Intuitive Surgical Operations, Inc. Wristed robotic surgical tool for pluggable end-effectors
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US7607440B2 (en) 2001-06-07 2009-10-27 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
EP1395194B1 (en) 2001-06-13 2007-08-29 Volume Interactions Pte. Ltd. A guide system
US6584339B2 (en) 2001-06-27 2003-06-24 Vanderbilt University Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery
CA2451824C (en) 2001-06-29 2015-02-24 Intuitive Surgical, Inc. Platform link wrist mechanism
US7063705B2 (en) 2001-06-29 2006-06-20 Sdgi Holdings, Inc. Fluoroscopic locator and registration device
US20040243147A1 (en) 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
ITMI20011759A1 (en) 2001-08-09 2003-02-09 Nuovo Pignone Spa Scraper device for the piston rod of reciprocating compressors
US7708741B1 (en) 2001-08-28 2010-05-04 Marctec, Llc Method of preparing bones for knee replacement surgery
US6728599B2 (en) 2001-09-07 2004-04-27 Computer Motion, Inc. Modularity system for computer assisted surgery
US6587750B2 (en) 2001-09-25 2003-07-01 Intuitive Surgical, Inc. Removable infinite roll master grip handle and touch sensor for robotic surgery
US6619840B2 (en) 2001-10-15 2003-09-16 Koninklijke Philips Electronics N.V. Interventional volume scanner
US6839612B2 (en) 2001-12-07 2005-01-04 Intuitive Surgical, Inc. Microwrist system for surgical procedures
US6947786B2 (en) 2002-02-28 2005-09-20 Surgical Navigation Technologies, Inc. Method and apparatus for perspective inversion
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
WO2003081220A2 (en) 2002-03-19 2003-10-02 Breakaway Imaging, Llc Computer tomograph with a detector following the movement of a pivotable x-ray source
WO2003086714A2 (en) 2002-04-05 2003-10-23 The Trustees Of Columbia University In The City Of New York Robotic scrub nurse
US7099428B2 (en) 2002-06-25 2006-08-29 The Regents Of The University Of Michigan High spatial resolution X-ray computed tomography (CT) system
US7248914B2 (en) 2002-06-28 2007-07-24 Stereotaxis, Inc. Method of navigating medical devices in the presence of radiopaque material
US7630752B2 (en) 2002-08-06 2009-12-08 Stereotaxis, Inc. Remote control of medical devices using a virtual device interface
US6922632B2 (en) 2002-08-09 2005-07-26 Intersense, Inc. Tracking, auto-calibration, and map-building system
US7231063B2 (en) 2002-08-09 2007-06-12 Intersense, Inc. Fiducial detection system
WO2004014244A2 (en) 2002-08-13 2004-02-19 Microbotics Corporation Microsurgical robot system
US6892090B2 (en) 2002-08-19 2005-05-10 Surgical Navigation Technologies, Inc. Method and apparatus for virtual endoscopy
US7331967B2 (en) 2002-09-09 2008-02-19 Hansen Medical, Inc. Surgical instrument coupling mechanism
ES2204322B1 (en) 2002-10-01 2005-07-16 Consejo Sup. De Invest. Cientificas Functional navigator
JP3821435B2 (en) 2002-10-18 2006-09-13 Matsushita Electric Industrial Co., Ltd. Ultrasonic probe
US7318827B2 (en) 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Osteotomy procedure
US7319897B2 (en) 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Localization device display method and apparatus
US8814793B2 (en) 2002-12-03 2014-08-26 Neorad As Respiration monitor
US7386365B2 (en) 2004-05-04 2008-06-10 Intuitive Surgical, Inc. Tool grip calibration for robotic surgery
US7945021B2 (en) 2002-12-18 2011-05-17 Varian Medical Systems, Inc. Multi-mode cone beam CT radiotherapy simulator and treatment machine with a flat panel imager
US7505809B2 (en) 2003-01-13 2009-03-17 Mediguide Ltd. Method and system for registering a first image with a second image relative to the body of a patient
US7660623B2 (en) 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US7542791B2 (en) 2003-01-30 2009-06-02 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
WO2004069040A2 (en) 2003-02-04 2004-08-19 Z-Kat, Inc. Method and apparatus for computer assistance with intramedullary nail procedure
US6988009B2 (en) 2003-02-04 2006-01-17 Zimmer Technology, Inc. Implant registration device for surgical navigation system
US7083615B2 (en) 2003-02-24 2006-08-01 Intuitive Surgical Inc Surgical tool having electrocautery energy supply conductor with inhibited current leakage
JP4163991B2 (en) 2003-04-30 2008-10-08 J. Morita Mfg. Corp. X-ray CT imaging apparatus and imaging method
US9060770B2 (en) 2003-05-20 2015-06-23 Ethicon Endo-Surgery, Inc. Robotically-driven surgical instrument with E-beam driver
US7194120B2 (en) 2003-05-29 2007-03-20 Board Of Regents, The University Of Texas System Methods and systems for image-guided placement of implants
US7171257B2 (en) 2003-06-11 2007-01-30 Accuray Incorporated Apparatus and method for radiosurgery
US9002518B2 (en) 2003-06-30 2015-04-07 Intuitive Surgical Operations, Inc. Maximum torque driving of robotic surgical tools in robotic surgical systems
US7960935B2 (en) 2003-07-08 2011-06-14 The Board Of Regents Of The University Of Nebraska Robotic devices with agent delivery components and related methods
US7042184B2 (en) 2003-07-08 2006-05-09 Board Of Regents Of The University Of Nebraska Microrobot for surgical applications
DE602004024682D1 (en) 2003-07-15 2010-01-28 Koninkl Philips Electronics Nv UNG
US7313430B2 (en) 2003-08-28 2007-12-25 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
US7835778B2 (en) 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US20050171558A1 (en) 2003-10-17 2005-08-04 Abovitz Rony A. Neurosurgery targeting and delivery system for brain structures
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US20050096502A1 (en) 2003-10-29 2005-05-05 Khalili Theodore M. Robotic surgical device
US9393039B2 (en) 2003-12-17 2016-07-19 Brainlab Ag Universal instrument or instrument set for computer guided surgery
US7466303B2 (en) 2004-02-10 2008-12-16 Sunnybrook Health Sciences Center Device and process for manipulating real and virtual objects in three-dimensional space
WO2005086062A2 (en) 2004-03-05 2005-09-15 Depuy International Limited Registration methods and apparatus
US20060100610A1 (en) 2004-03-05 2006-05-11 Wallace Daniel T Methods using a robotic catheter system
US20080269596A1 (en) 2004-03-10 2008-10-30 Ian Revie Orthopaedic Monitoring Systems, Methods, Implants and Instruments
US7657298B2 (en) 2004-03-11 2010-02-02 Stryker Leibinger Gmbh & Co. Kg System, device, and method for determining a position of an object
US8475495B2 (en) 2004-04-08 2013-07-02 Globus Medical Polyaxial screw
US8860753B2 (en) 2004-04-13 2014-10-14 University Of Georgia Research Foundation, Inc. Virtual surgical system and methods
KR100617974B1 (en) 2004-04-22 2006-08-31 한국과학기술원 Laparoscopic device capable of command following
US7567834B2 (en) 2004-05-03 2009-07-28 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7379790B2 (en) 2004-05-04 2008-05-27 Intuitive Surgical, Inc. Tool memory-based software upgrades for robotic surgery
US7974674B2 (en) 2004-05-28 2011-07-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for surface modeling
US8528565B2 (en) 2004-05-28 2013-09-10 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for automated therapy delivery
FR2871363B1 (en) 2004-06-15 2006-09-01 Medtech Sa Robotized guiding device for surgical tool
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
ITMI20041448A1 (en) 2004-07-20 2004-10-20 Milano Politecnico Apparatus for the fusion and navigation of ultrasound and volumetric images of a patient, using a combination of active and passive optical markers to localize ultrasound probes and surgical instruments relative to the patient
US7440793B2 (en) 2004-07-22 2008-10-21 Sunita Chauhan Apparatus and method for removing abnormal tissue
US7979157B2 (en) 2004-07-23 2011-07-12 Mcmaster University Multi-purpose robotic operating system and method
US9072535B2 (en) 2011-05-27 2015-07-07 Ethicon Endo-Surgery, Inc. Surgical stapling instruments with rotatable staple deployment arrangements
GB2422759B (en) 2004-08-05 2008-07-16 Elekta Ab Rotatable X-ray scan apparatus with cone beam offset
US7702379B2 (en) 2004-08-25 2010-04-20 General Electric Company System and method for hybrid tracking in surgical navigation
US7555331B2 (en) 2004-08-26 2009-06-30 Stereotaxis, Inc. Method for surgical navigation utilizing scale-invariant registration between a navigation system and a localization system
DE102004042489B4 (en) 2004-08-31 2012-03-29 Siemens Ag Medical examination or treatment facility with associated method
AU2004323338B2 (en) 2004-09-15 2011-01-20 Ao Technology Ag Calibrating device
WO2006038145A1 (en) 2004-10-06 2006-04-13 Philips Intellectual Property & Standards Gmbh Computed tomography method
US7831294B2 (en) 2004-10-07 2010-11-09 Stereotaxis, Inc. System and method of surgical imagining with anatomical overlay for navigation of surgical devices
US7983733B2 (en) 2004-10-26 2011-07-19 Stereotaxis, Inc. Surgical navigation using a three-dimensional user interface
US7062006B1 (en) 2005-01-19 2006-06-13 The Board Of Trustees Of The Leland Stanford Junior University Computed tomography with increased field of view
US7837674B2 (en) 2005-01-24 2010-11-23 Intuitive Surgical Operations, Inc. Compact counter balance for robotic surgical systems
US7763015B2 (en) 2005-01-24 2010-07-27 Intuitive Surgical Operations, Inc. Modular manipulator support for robotic surgery
US20060184396A1 (en) 2005-01-28 2006-08-17 Dennis Charles L System and method for surgical navigation
US7231014B2 (en) 2005-02-14 2007-06-12 Varian Medical Systems Technologies, Inc. Multiple mode flat panel X-ray imaging system
ES2784219T3 (en) 2005-03-07 2020-09-23 Hector O Pacheco Cannula for improved access to vertebral bodies for kyphoplasty, vertebroplasty, vertebral body biopsy or screw placement
WO2006102756A1 (en) 2005-03-30 2006-10-05 University of Western Ontario Anisotropic hydrogels
US8375808B2 (en) 2005-12-30 2013-02-19 Intuitive Surgical Operations, Inc. Force sensing for surgical instruments
US8496647B2 (en) 2007-12-18 2013-07-30 Intuitive Surgical Operations, Inc. Ribbed force sensor
US7720523B2 (en) 2005-04-20 2010-05-18 General Electric Company System and method for managing power deactivation within a medical imaging system
US8208988B2 (en) 2005-05-13 2012-06-26 General Electric Company System and method for controlling a medical imaging device
EP1887961B1 (en) 2005-06-06 2012-01-11 Intuitive Surgical Operations, Inc. Laparoscopic ultrasound robotic surgical system
US8398541B2 (en) 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
JP2007000406A (en) 2005-06-24 2007-01-11 Ge Medical Systems Global Technology Co Llc X-ray CT method and X-ray CT apparatus
US7840256B2 (en) 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
US8241271B2 (en) 2005-06-30 2012-08-14 Intuitive Surgical Operations, Inc. Robotic surgical instruments with a fluid flow control system for irrigation, aspiration, and blowing
US20070038059A1 (en) 2005-07-07 2007-02-15 Garrett Sheffer Implant and instrument morphing
WO2007022081A2 (en) 2005-08-11 2007-02-22 The Brigham And Women's Hospital, Inc. System and method for performing single photon emission computed tomography (spect) with a focal-length cone-beam collimation
US7787699B2 (en) 2005-08-17 2010-08-31 General Electric Company Real-time integration and recording of surgical image data
US8800838B2 (en) 2005-08-31 2014-08-12 Ethicon Endo-Surgery, Inc. Robotically-controlled cable-based surgical end effectors
US20070073133A1 (en) 2005-09-15 2007-03-29 Schoenefeld Ryan J Virtual mouse for use in surgical navigation
US7643862B2 (en) 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US8079950B2 (en) 2005-09-29 2011-12-20 Intuitive Surgical Operations, Inc. Autofocus and/or autoscaling in telesurgery
EP1946243A2 (en) 2005-10-04 2008-07-23 Intersense, Inc. Tracking objects with markers
WO2007061890A2 (en) 2005-11-17 2007-05-31 Calypso Medical Technologies, Inc. Apparatus and methods for using an electromagnetic transponder in orthopedic procedures
US7711406B2 (en) 2005-11-23 2010-05-04 General Electric Company System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
EP1795142B1 (en) 2005-11-24 2008-06-11 BrainLAB AG Medical tracking system using a gamma camera
US7689320B2 (en) 2005-12-20 2010-03-30 Intuitive Surgical Operations, Inc. Robotic surgical system with joint motion controller adapted to reduce instrument tip vibrations
US8672922B2 (en) 2005-12-20 2014-03-18 Intuitive Surgical Operations, Inc. Wireless communication in a robotic surgical system
US8182470B2 (en) 2005-12-20 2012-05-22 Intuitive Surgical Operations, Inc. Telescoping insertion axis of a robotic surgical system
US7762825B2 (en) 2005-12-20 2010-07-27 Intuitive Surgical Operations, Inc. Electro-mechanical interfaces to mount robotic surgical arms
US7819859B2 (en) 2005-12-20 2010-10-26 Intuitive Surgical Operations, Inc. Control system for reducing internally generated frictional and inertial resistance to manual positioning of a surgical manipulator
US8054752B2 (en) 2005-12-22 2011-11-08 Intuitive Surgical Operations, Inc. Synchronous data communication
ES2292327B1 (en) 2005-12-26 2009-04-01 Consejo Superior Investigaciones Cientificas Autonomous mini gamma camera with localization system, for intrasurgical use
US7907166B2 (en) 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
JP5152993B2 (en) 2005-12-30 2013-02-27 Intuitive Surgical Inc. Modular force sensor
US7930065B2 (en) 2005-12-30 2011-04-19 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US7533892B2 (en) 2006-01-05 2009-05-19 Intuitive Surgical, Inc. Steering system for heavy mobile medical equipment
KR100731052B1 (en) 2006-01-23 2007-06-22 한양대학교 산학협력단 Computer Integrated Surgery Support System for Microinvasive Surgery
US8142420B2 (en) 2006-01-25 2012-03-27 Intuitive Surgical Operations Inc. Robotic arm with five-bar spherical linkage
US8162926B2 (en) 2006-01-25 2012-04-24 Intuitive Surgical Operations Inc. Robotic arm with five-bar spherical linkage
US7845537B2 (en) 2006-01-31 2010-12-07 Ethicon Endo-Surgery, Inc. Surgical instrument having recording capabilities
US20110290856A1 (en) 2006-01-31 2011-12-01 Ethicon Endo-Surgery, Inc. Robotically-controlled surgical instrument with force-feedback capabilities
EP1815950A1 (en) 2006-02-03 2007-08-08 The European Atomic Energy Community (EURATOM), represented by the European Commission Robotic surgical system for performing minimally invasive medical procedures
US8219177B2 (en) 2006-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US8219178B2 (en) 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US8526688B2 (en) 2006-03-09 2013-09-03 General Electric Company Methods and systems for registration of surgical navigation data and image data
US8208708B2 (en) 2006-03-30 2012-06-26 Koninklijke Philips Electronics N.V. Targeting method, targeting device, computer readable medium and program element
US20070233238A1 (en) 2006-03-31 2007-10-04 Medtronic Vascular, Inc. Devices for Imaging and Navigation During Minimally Invasive Non-Bypass Cardiac Procedures
US7760849B2 (en) 2006-04-14 2010-07-20 William Beaumont Hospital Tetrahedron beam computed tomography
US8021310B2 (en) 2006-04-21 2011-09-20 Nellcor Puritan Bennett Llc Work of breathing display for a ventilation system
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
US7940999B2 (en) 2006-04-24 2011-05-10 Siemens Medical Solutions Usa, Inc. System and method for learning-based 2D/3D rigid registration for image-guided surgery using Jensen-Shannon divergence
WO2007131561A2 (en) 2006-05-16 2007-11-22 Surgiceye Gmbh Method and device for 3d acquisition, 3d visualization and computer guided surgery using nuclear probes
US20080004523A1 (en) 2006-06-29 2008-01-03 General Electric Company Surgical tool guide
DE102006032127B4 (en) 2006-07-05 2008-04-30 Aesculap Ag & Co. Kg Calibration method and calibration device for a surgical referencing unit
US20080013809A1 (en) 2006-07-14 2008-01-17 Bracco Imaging, Spa Methods and apparatuses for registration in image guided surgery
EP1886640B1 (en) 2006-08-08 2009-11-18 BrainLAB AG Planning method and system for adjusting a free-shaped bone implant
EP2053972B1 (en) 2006-08-17 2013-09-11 Koninklijke Philips Electronics N.V. Computed tomography image acquisition
DE102006041033B4 (en) 2006-09-01 2017-01-19 Siemens Healthcare Gmbh Method for reconstructing a three-dimensional image volume
US8231610B2 (en) 2006-09-06 2012-07-31 National Cancer Center Robotic surgical system for laparoscopic surgery
US8150497B2 (en) 2006-09-08 2012-04-03 Medtronic, Inc. System for navigating a planned procedure within a body
US20080082109A1 (en) 2006-09-08 2008-04-03 Hansen Medical, Inc. Robotic surgical system with forward-oriented field of view guide instrument navigation
US8150498B2 (en) 2006-09-08 2012-04-03 Medtronic, Inc. System for identification of anatomical landmarks
US8532741B2 (en) 2006-09-08 2013-09-10 Medtronic, Inc. Method and apparatus to optimize electrode placement for neurological stimulation
US8248413B2 (en) 2006-09-18 2012-08-21 Stryker Corporation Visual navigation system for endoscopic surgery
EP2074383B1 (en) 2006-09-25 2016-05-11 Mazor Robotics Ltd. C-arm computerized tomography
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US8052688B2 (en) 2006-10-06 2011-11-08 Wolf Ii Erich Electromagnetic apparatus and method for nerve localization during spinal surgery
US20080144906A1 (en) 2006-10-09 2008-06-19 General Electric Company System and method for video capture for fluoroscopy and navigation
US20080109012A1 (en) 2006-11-03 2008-05-08 General Electric Company System, method and apparatus for tableside remote connections of medical instruments and systems using wireless communications
US8551114B2 (en) 2006-11-06 2013-10-08 Human Robotics S.A. De C.V. Robotic surgical device
US20080108912A1 (en) 2006-11-07 2008-05-08 General Electric Company System and method for measurement of clinical parameters of the knee for use during knee replacement surgery
US20080108991A1 (en) 2006-11-08 2008-05-08 General Electric Company Method and apparatus for performing pedicle screw fusion surgery
US8682413B2 (en) 2006-11-15 2014-03-25 General Electric Company Systems and methods for automated tracker-driven image selection
US7935130B2 (en) 2006-11-16 2011-05-03 Intuitive Surgical Operations, Inc. Two-piece end-effectors for robotic surgical tools
WO2008063494A2 (en) 2006-11-16 2008-05-29 Vanderbilt University Apparatus and methods of compensating for organ deformation, registration of internal structures to images, and applications of same
US8727618B2 (en) 2006-11-22 2014-05-20 Siemens Aktiengesellschaft Robotic device and method for trauma patient diagnosis and therapy
US7835557B2 (en) 2006-11-28 2010-11-16 Medtronic Navigation, Inc. System and method for detecting status of imaging device
US8320991B2 (en) 2006-12-01 2012-11-27 Medtronic Navigation Inc. Portable electromagnetic navigation system
US7683331B2 (en) 2006-12-08 2010-03-23 Rush University Medical Center Single photon emission computed tomography (SPECT) system for cardiac imaging
US7683332B2 (en) 2006-12-08 2010-03-23 Rush University Medical Center Integrated single photon emission computed tomography (SPECT)/transmission computed tomography (TCT) system for cardiac imaging
US8556807B2 (en) 2006-12-21 2013-10-15 Intuitive Surgical Operations, Inc. Hermetically sealed distal sensor endoscope
US20080177203A1 (en) 2006-12-22 2008-07-24 General Electric Company Surgical navigation planning system and method for placement of percutaneous instrumentation and implants
DE102006061178A1 (en) 2006-12-22 2008-06-26 Siemens Ag Medical system for carrying out and monitoring a minimal invasive intrusion, especially for treating electro-physiological diseases, has X-ray equipment and a control/evaluation unit
US20080161680A1 (en) 2006-12-29 2008-07-03 General Electric Company System and method for surgical navigation of motion preservation prosthesis
US9220573B2 (en) 2007-01-02 2015-12-29 Medtronic Navigation, Inc. System and method for tracking positions of uniform marker geometries
US8684253B2 (en) 2007-01-10 2014-04-01 Ethicon Endo-Surgery, Inc. Surgical instrument with wireless communication between a control unit of a robotic system and remote sensor
US8374673B2 (en) 2007-01-25 2013-02-12 Warsaw Orthopedic, Inc. Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control
EP2124799B1 (en) 2007-02-01 2012-10-31 Interactive Neuroscience Center, Llc Surgical navigation
US20080195081A1 (en) 2007-02-02 2008-08-14 Hansen Medical, Inc. Spinal surgery methods using a robotic instrument system
US8233963B2 (en) 2007-02-19 2012-07-31 Medtronic Navigation, Inc. Automatic identification of tracked surgical devices using an electromagnetic localization system
US8600478B2 (en) 2007-02-19 2013-12-03 Medtronic Navigation, Inc. Automatic identification of instruments used with a surgical navigation system
DE102007009017B3 (en) 2007-02-23 2008-09-25 Siemens Ag Arrangement for supporting a percutaneous procedure
US10039613B2 (en) 2007-03-01 2018-08-07 Surgical Navigation Technologies, Inc. Method for localizing an imaging device with a surgical navigation system
US8098914B2 (en) 2007-03-05 2012-01-17 Siemens Aktiengesellschaft Registration of CT volumes with fluoroscopic images
US20080228068A1 (en) 2007-03-13 2008-09-18 Viswanathan Raju R Automated Surgical Navigation with Electro-Anatomical and Pre-Operative Image Data
US8821511B2 (en) 2007-03-15 2014-09-02 General Electric Company Instrument guide for use with a surgical navigation system
US20080235052A1 (en) 2007-03-19 2008-09-25 General Electric Company System and method for sharing medical information between image-guided surgery systems
US8150494B2 (en) 2007-03-29 2012-04-03 Medtronic Navigation, Inc. Apparatus for registering a physical space to image space
US7879045B2 (en) 2007-04-10 2011-02-01 Medtronic, Inc. System for guiding instruments having different sizes
US8560118B2 (en) 2007-04-16 2013-10-15 Neuroarm Surgical Ltd. Methods, devices, and systems for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis
CA2684472C (en) 2007-04-16 2015-11-24 Neuroarm Surgical Ltd. Methods, devices, and systems for automated movements involving medical robots
US8311611B2 (en) 2007-04-24 2012-11-13 Medtronic, Inc. Method for performing multiple registrations in a navigated procedure
US8301226B2 (en) 2007-04-24 2012-10-30 Medtronic, Inc. Method and apparatus for performing a navigated procedure
US20090012509A1 (en) 2007-04-24 2009-01-08 Medtronic, Inc. Navigated Soft Tissue Penetrating Laser System
US8108025B2 (en) 2007-04-24 2012-01-31 Medtronic, Inc. Flexible array for use in navigated surgery
US8010177B2 (en) 2007-04-24 2011-08-30 Medtronic, Inc. Intraoperative image registration
US8062364B1 (en) 2007-04-27 2011-11-22 Knee Creations, Llc Osteoarthritis treatment and device
DE102007022122B4 (en) 2007-05-11 2019-07-11 Deutsches Zentrum für Luft- und Raumfahrt e.V. Gripping device for a surgery robot arrangement
US8057397B2 (en) 2007-05-16 2011-11-15 General Electric Company Navigation and imaging system synchronized with respiratory and/or cardiac activity
US20080287771A1 (en) 2007-05-17 2008-11-20 General Electric Company Surgical navigation system with electrostatic shield
US8934961B2 (en) 2007-05-18 2015-01-13 Biomet Manufacturing, Llc Trackable diagnostic scope apparatus and methods of use
US20080300477A1 (en) 2007-05-30 2008-12-04 General Electric Company System and method for correction of automated image registration
US20080300478A1 (en) 2007-05-30 2008-12-04 General Electric Company System and method for displaying real-time state of imaged anatomy during a surgical procedure
US9468412B2 (en) 2007-06-22 2016-10-18 General Electric Company System and method for accuracy verification for image based surgical navigation
EP2170564A4 (en) 2007-07-12 2015-10-07 Univ Nebraska Methods and systems for actuation in robotic devices
US7834484B2 (en) 2007-07-16 2010-11-16 Tyco Healthcare Group Lp Connection cable and method for activating a voltage-controlled generator
JP2009045428A (en) 2007-07-25 2009-03-05 Terumo Corp Operating mechanism, medical manipulator and surgical robot system
US8100950B2 (en) 2007-07-27 2012-01-24 The Cleveland Clinic Foundation Oblique lumbar interbody fusion
US8035685B2 (en) 2007-07-30 2011-10-11 General Electric Company Systems and methods for communicating video data between a mobile imaging system and a fixed monitor system
US8328818B1 (en) 2007-08-31 2012-12-11 Globus Medical, Inc. Devices and methods for treating bone
CA2737938C (en) 2007-09-19 2016-09-13 Walter A. Roberts Direct visualization robotic intra-operative radiation therapy applicator device
US20090080737A1 (en) 2007-09-25 2009-03-26 General Electric Company System and Method for Use of Fluoroscope and Computed Tomography Registration for Sinuplasty Navigation
US9050120B2 (en) 2007-09-30 2015-06-09 Intuitive Surgical Operations, Inc. Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US9522046B2 (en) 2010-08-23 2016-12-20 Gip Robotic surgery system
CN101848679B (en) 2007-11-06 2014-08-06 皇家飞利浦电子股份有限公司 Nuclear medicine SPECT-CT machine with integrated asymmetric flat panel cone-beam CT and SPECT system
DE102007055203A1 (en) 2007-11-19 2009-05-20 Kuka Roboter Gmbh A robotic device, medical workstation and method for registering an object
US8561473B2 (en) 2007-12-18 2013-10-22 Intuitive Surgical Operations, Inc. Force sensor temperature compensation
US8400094B2 (en) 2007-12-21 2013-03-19 Intuitive Surgical Operations, Inc. Robotic surgical system with patient support
CN101902968A (en) 2007-12-21 2010-12-01 皇家飞利浦电子股份有限公司 Synchronous interventional scanner
US8864798B2 (en) 2008-01-18 2014-10-21 Globus Medical, Inc. Transverse connector
EP2244784A2 (en) 2008-01-30 2010-11-03 The Trustees of Columbia University in the City of New York Systems, devices, and methods for robot-assisted micro-surgical stenting
US20090198121A1 (en) 2008-02-01 2009-08-06 Martin Hoheisel Method and apparatus for coordinating contrast agent injection and image acquisition in c-arm computed tomography
US8573465B2 (en) 2008-02-14 2013-11-05 Ethicon Endo-Surgery, Inc. Robotically-controlled surgical end effector system with rotary actuated closure systems
US8696458B2 (en) 2008-02-15 2014-04-15 Thales Visionix, Inc. Motion tracking system and method using camera and non-camera sensors
US7925653B2 (en) 2008-02-27 2011-04-12 General Electric Company Method and system for accessing a group of objects in an electronic document
US20090228019A1 (en) 2008-03-10 2009-09-10 Yosef Gross Robotic surgical system
US8282653B2 (en) 2008-03-24 2012-10-09 Board Of Regents Of The University Of Nebraska System and methods for controlling surgical tool elements
BRPI0822423B1 (en) 2008-03-28 2020-09-24 Telefonaktiebolaget Lm Ericsson (Publ) Methods for enabling detection and for detecting a base station, base station of a communication network, and core network node
US8808164B2 (en) 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
US7843158B2 (en) 2008-03-31 2010-11-30 Intuitive Surgical Operations, Inc. Medical robotic system adapted to inhibit motions resulting in excessive end effector forces
US8333755B2 (en) 2008-03-31 2012-12-18 Intuitive Surgical Operations, Inc. Coupler to transfer controller motion from a robotic manipulator to an attached instrument
US7886743B2 (en) 2008-03-31 2011-02-15 Intuitive Surgical Operations, Inc. Sterile drape interface for robotic surgical instrument
US9002076B2 (en) 2008-04-15 2015-04-07 Medtronic, Inc. Method and apparatus for optimal trajectory planning
US9345875B2 (en) 2008-04-17 2016-05-24 Medtronic, Inc. Method and apparatus for cannula fixation for an array insertion tube set
US8810631B2 (en) 2008-04-26 2014-08-19 Intuitive Surgical Operations, Inc. Augmented stereoscopic visualization for a surgical robot using a captured visible image combined with a fluorescence image and a captured visible image
ES2764964T3 (en) 2008-04-30 2020-06-05 Nanosys Inc Dirt-resistant surfaces for reflective spheres
US9579161B2 (en) 2008-05-06 2017-02-28 Medtronic Navigation, Inc. Method and apparatus for tracking a patient
CN102014760B (en) 2008-06-09 2013-11-06 韩商未来股份有限公司 Active Interface and Actuation Methods for Surgical Robots
TW201004607A (en) 2008-07-25 2010-02-01 Been-Der Yang Image guided navigation system and method thereof
US8054184B2 (en) 2008-07-31 2011-11-08 Intuitive Surgical Operations, Inc. Identification of surgical instrument attached to surgical robot
US8771170B2 (en) 2008-08-01 2014-07-08 Microaccess, Inc. Methods and apparatus for transesophageal microaccess surgery
JP2010035984A (en) 2008-08-08 2010-02-18 Canon Inc X-ray imaging apparatus
US9248000B2 (en) 2008-08-15 2016-02-02 Stryker European Holdings I, Llc System for and method of visualizing an interior of body
WO2010022088A1 (en) 2008-08-18 2010-02-25 Encision, Inc. Enhanced control systems including flexible shielding and support systems for electrosurgical applications
DE102008041813B4 (en) 2008-09-04 2013-06-20 Carl Zeiss Microscopy Gmbh Method for the depth analysis of an organic sample
US7900524B2 (en) 2008-09-09 2011-03-08 Intersense, Inc. Monitoring tools
US8165658B2 (en) 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
US8073335B2 (en) 2008-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Operator input device for a robotic surgical system
WO2010041193A2 (en) 2008-10-10 2010-04-15 Koninklijke Philips Electronics N.V. Method and apparatus to improve ct image acquisition using a displaced geometry
KR100944412B1 (en) 2008-10-13 2010-02-25 (주)미래컴퍼니 Surgical slave robot
US8781630B2 (en) 2008-10-14 2014-07-15 University Of Florida Research Foundation, Inc. Imaging platform to provide integrated navigation capabilities for surgical guidance
CN102238916B (en) 2008-10-20 2013-12-04 约翰霍普金斯大学 Environment property estimation and graphical display
EP2179703B1 (en) 2008-10-21 2012-03-28 BrainLAB AG Integration of surgical instrument and display device for supporting image-based surgery
KR101075363B1 (en) 2008-10-31 2011-10-19 정창욱 Surgical Robot System Having Tool for Minimally Invasive Surgery
US8798933B2 (en) 2008-10-31 2014-08-05 The Invention Science Fund I, Llc Frozen compositions and methods for piercing a substrate
US9033958B2 (en) 2008-11-11 2015-05-19 Perception Raisonnement Action En Medecine Surgical robotic system
TWI435705B (en) 2008-11-20 2014-05-01 Been Der Yang Surgical position device and image guided navigation system using the same
WO2010061810A1 (en) 2008-11-27 2010-06-03 Hitachi Medical Corporation Radiation image pickup device
US8483800B2 (en) 2008-11-29 2013-07-09 General Electric Company Surgical navigation enabled imaging table environment
CN102300512B (en) 2008-12-01 2016-01-20 马佐尔机器人有限公司 Robot-guided oblique spine stabilization
ES2341079B1 (en) 2008-12-11 2011-07-13 Fundacio Clinic Per A La Recerca Biomedica Equipment for improved infrared vision of vascular structures, applicable to assist fetoscopic, laparoscopic and endoscopic interventions, and signal processing to improve such vision
US8021393B2 (en) 2008-12-12 2011-09-20 Globus Medical, Inc. Lateral spinous process spacer with deployable wings
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US8374723B2 (en) 2008-12-31 2013-02-12 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
US8594841B2 (en) 2008-12-31 2013-11-26 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US8184880B2 (en) 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
CN103349556B (en) 2009-01-21 2015-09-23 皇家飞利浦电子股份有限公司 For Large visual angle imaging and the detection of motion artifacts and the method and apparatus of compensation
WO2010086374A1 (en) 2009-01-29 2010-08-05 Imactis Method and device for navigation of a surgical tool
KR101038417B1 (en) 2009-02-11 2011-06-01 주식회사 이턴 Surgical Robot System and Its Control Method
US8120301B2 (en) 2009-03-09 2012-02-21 Intuitive Surgical Operations, Inc. Ergonomic surgeon control console in robotic surgical systems
US9737235B2 (en) 2009-03-09 2017-08-22 Medtronic Navigation, Inc. System and method for image-guided navigation
US8918207B2 (en) 2009-03-09 2014-12-23 Intuitive Surgical Operations, Inc. Operator input device for a robotic surgical system
US8418073B2 (en) 2009-03-09 2013-04-09 Intuitive Surgical Operations, Inc. User interfaces for electrosurgical tools in robotic surgical systems
CA2755036A1 (en) 2009-03-10 2010-09-16 Mcmaster University Mobile robotic surgical system
US8335552B2 (en) 2009-03-20 2012-12-18 Medtronic, Inc. Method and apparatus for instrument placement
CN105342705A (en) 2009-03-24 2016-02-24 伊顿株式会社 Surgical robot system using augmented reality, and method for controlling same
US20100249571A1 (en) 2009-03-31 2010-09-30 General Electric Company Surgical navigation system with wireless magnetoresistance tracking sensors
US8882803B2 (en) 2009-04-01 2014-11-11 Globus Medical, Inc. Orthopedic clamp and extension rod
EP2429438A1 (en) 2009-04-24 2012-03-21 Medtronic, Inc. Electromagnetic navigation of medical instruments for cardiothoracic surgery
EP2432372B1 (en) 2009-05-18 2018-12-26 Teleflex Medical Incorporated Devices for performing minimally invasive surgery
ES2388029B1 (en) 2009-05-22 2013-08-13 Universitat Politècnica De Catalunya Robotic system for laparoscopic surgery
CN101897593B (en) 2009-05-26 2014-08-13 清华大学 A computer tomography device and method
US8121249B2 (en) 2009-06-04 2012-02-21 Virginia Tech Intellectual Properties, Inc. Multi-parameter X-ray computed tomography
WO2011013164A1 (en) 2009-07-27 2011-02-03 株式会社島津製作所 Radiographic apparatus
WO2011015957A1 (en) 2009-08-06 2011-02-10 Koninklijke Philips Electronics N.V. Method and apparatus for generating computed tomography images with offset detector geometries
EP2467798B1 (en) 2009-08-17 2020-04-15 Mazor Robotics Ltd. Device for improving the accuracy of manual operations
US9844414B2 (en) 2009-08-31 2017-12-19 Gregory S. Fischer System and method for robotic surgical intervention in a magnetic resonance imager
EP2298223A1 (en) 2009-09-21 2011-03-23 Stryker Leibinger GmbH & Co. KG Technique for registering image data of an object
US8465476B2 (en) 2009-09-23 2013-06-18 Intuitive Surgical Operations, Inc. Cannula mounting fixture
WO2011038759A1 (en) 2009-09-30 2011-04-07 Brainlab Ag Two-part medical tracking marker
NL1037348C2 (en) 2009-10-02 2011-04-05 Univ Eindhoven Tech Surgical robot, instrument manipulator, combination of an operating table and a surgical robot, and master-slave operating system.
US8685098B2 (en) 2010-06-25 2014-04-01 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US8556979B2 (en) 2009-10-15 2013-10-15 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US8679183B2 (en) 2010-06-25 2014-03-25 Globus Medical Expandable fusion device and method of installation thereof
US8062375B2 (en) 2009-10-15 2011-11-22 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US20110098553A1 (en) 2009-10-28 2011-04-28 Steven Robbins Automatic registration of images for image guided surgery
USD631966S1 (en) 2009-11-10 2011-02-01 Globus Medical, Inc. Basilar invagination implant
US8521331B2 (en) 2009-11-13 2013-08-27 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US20110137152A1 (en) 2009-12-03 2011-06-09 General Electric Company System and method for cooling components of a surgical navigation system
US8277509B2 (en) 2009-12-07 2012-10-02 Globus Medical, Inc. Transforaminal prosthetic spinal disc apparatus
WO2011070519A1 (en) 2009-12-10 2011-06-16 Koninklijke Philips Electronics N.V. Scanning system for differential phase contrast imaging
US8694075B2 (en) 2009-12-21 2014-04-08 General Electric Company Intra-operative registration for navigated surgical procedures
US8353963B2 (en) 2010-01-12 2013-01-15 Globus Medical Expandable spacer and method for use thereof
US9381045B2 (en) 2010-01-13 2016-07-05 Jcbd, Llc Sacroiliac joint implant and sacroiliac joint instrument for fusing a sacroiliac joint
JP5795599B2 (en) 2010-01-13 2015-10-14 コーニンクレッカ フィリップス エヌ ヴェ Image integration based registration and navigation for endoscopic surgery
US9030444B2 (en) 2010-01-14 2015-05-12 Brainlab Ag Controlling and/or operating a medical device by means of a light pointer
US9039769B2 (en) 2010-03-17 2015-05-26 Globus Medical, Inc. Intervertebral nucleus and annulus implants and method of use thereof
US20110238080A1 (en) 2010-03-25 2011-09-29 Date Ranjit Robotic Surgical Instrument System
US20140330288A1 (en) 2010-03-25 2014-11-06 Precision Automation And Robotics India Ltd. Articulating Arm for a Robotic Surgical Instrument System
IT1401669B1 (en) 2010-04-07 2013-08-02 Sofar Spa Robotic surgery system with improved control
US8870880B2 (en) 2010-04-12 2014-10-28 Globus Medical, Inc. Angling inserter tool for expandable vertebral implant
US8717430B2 (en) 2010-04-26 2014-05-06 Medtronic Navigation, Inc. System and method for radio-frequency imaging, registration, and localization
IT1399603B1 (en) 2010-04-26 2013-04-26 Scuola Superiore Di Studi Universitari E Di Perfez Robotic system for minimally invasive surgical interventions
CA2797302C (en) 2010-04-28 2019-01-15 Ryerson University System and methods for intraoperative guidance feedback
JP2013530028A (en) 2010-05-04 2013-07-25 パスファインダー セラピューティクス,インコーポレイテッド System and method for abdominal surface matching using pseudo features
US8738115B2 (en) 2010-05-11 2014-05-27 Siemens Aktiengesellschaft Method and apparatus for selective internal radiation therapy planning and implementation
DE102010020284A1 (en) 2010-05-12 2011-11-17 Siemens Aktiengesellschaft Determination of 3D positions and orientations of surgical objects from 2D X-ray images
US8603077B2 (en) 2010-05-14 2013-12-10 Intuitive Surgical Operations, Inc. Force transmission for robotic surgical instrument
US8883210B1 (en) 2010-05-14 2014-11-11 Musculoskeletal Transplant Foundation Tissue-derived tissuegenic implants, and methods of fabricating and using same
US8746252B2 (en) 2010-05-14 2014-06-10 Intuitive Surgical Operations, Inc. Surgical system sterile drape
KR101181569B1 (en) 2010-05-25 2012-09-10 정창욱 Surgical robot system capable of implementing both of single port surgery mode and multi-port surgery mode and method for controlling same
US20110295370A1 (en) 2010-06-01 2011-12-01 Sean Suh Spinal Implants and Methods of Use Thereof
DE102010026674B4 (en) 2010-07-09 2012-09-27 Siemens Aktiengesellschaft Imaging device and radiotherapy device
US8675939B2 (en) 2010-07-13 2014-03-18 Stryker Leibinger Gmbh & Co. Kg Registration of anatomical data sets
WO2012007036A1 (en) 2010-07-14 2012-01-19 Brainlab Ag Method and system for determining an imaging direction and calibration of an imaging apparatus
US20120035507A1 (en) 2010-07-22 2012-02-09 Ivan George Device and method for measuring anatomic geometries
US8740882B2 (en) 2010-07-30 2014-06-03 Lg Electronics Inc. Medical robotic system and method of controlling the same
WO2012024686A2 (en) 2010-08-20 2012-02-23 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation
JP2012045278A (en) 2010-08-30 2012-03-08 Fujifilm Corp X-ray imaging apparatus and x-ray imaging method
US8764448B2 (en) 2010-09-01 2014-07-01 Agency For Science, Technology And Research Robotic device for use in image-guided robot assisted surgical training
KR20120030174A (en) 2010-09-17 2012-03-28 삼성전자주식회사 Surgery robot system and surgery apparatus and method for providing tactile feedback
EP2431003B1 (en) 2010-09-21 2018-03-21 Medizinische Universität Innsbruck Registration device, system, kit and method for a patient registration
US8679125B2 (en) 2010-09-22 2014-03-25 Biomet Manufacturing, Llc Robotic guided femoral head reshaping
US8657809B2 (en) 2010-09-29 2014-02-25 Stryker Leibinger Gmbh & Co., Kg Surgical navigation system
US8526700B2 (en) 2010-10-06 2013-09-03 Robert E. Isaacs Imaging system and method for surgical and interventional medical procedures
US8718346B2 (en) 2011-10-05 2014-05-06 Saferay Spine Llc Imaging system and method for use in surgical and interventional medical procedures
US9913693B2 (en) 2010-10-29 2018-03-13 Medtronic, Inc. Error correction techniques in surgical navigation
CA2821110A1 (en) 2010-12-13 2012-06-21 Ortho Kinematics, Inc. Methods, systems and devices for clinical data reporting and surgical navigation
US8876866B2 (en) 2010-12-13 2014-11-04 Globus Medical, Inc. Spinous process fusion devices and methods thereof
AU2011348240B2 (en) 2010-12-22 2015-03-26 Viewray Technologies, Inc. System and method for image guidance during medical procedures
WO2012095755A1 (en) 2011-01-13 2012-07-19 Koninklijke Philips Electronics N.V. Intraoperative camera calibration for endoscopic surgery
KR101181613B1 (en) 2011-02-21 2012-09-10 윤상진 Surgical robot system for performing surgery based on displacement information determined by user designation and control method therefor
US20120226145A1 (en) 2011-03-03 2012-09-06 National University Of Singapore Transcutaneous robot-assisted ablation-device insertion navigation system
US9026247B2 (en) 2011-03-30 2015-05-05 University of Washington through its Center for Commercialization Motion and video capture for tracking and evaluating robotic surgery and associated systems and methods
US9308050B2 (en) 2011-04-01 2016-04-12 Ecole Polytechnique Federale De Lausanne (Epfl) Robotic system and method for spinal and other surgeries
US20150213633A1 (en) 2011-04-06 2015-07-30 The Trustees Of Columbia University In The City Of New York System, method and computer-accessible medium for providing a panoramic cone beam computed tomography (cbct)
US20120256092A1 (en) 2011-04-06 2012-10-11 General Electric Company Ct system for use in multi-modality imaging system
WO2012149548A2 (en) 2011-04-29 2012-11-01 The Johns Hopkins University System and method for tracking and navigation
WO2012169642A1 (en) 2011-06-06 2012-12-13 株式会社大野興業 Method for manufacturing registration template
US8498744B2 (en) 2011-06-30 2013-07-30 Mako Surgical Corporation Surgical robotic systems with manual and haptic and/or active control modes
US9089353B2 (en) 2011-07-11 2015-07-28 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
US8818105B2 (en) 2011-07-14 2014-08-26 Accuray Incorporated Image registration for image-guided surgery
KR20130015146A (en) 2011-08-02 2013-02-13 삼성전자주식회사 Method and apparatus for processing medical image, robotic surgery system using image guidance
US10866783B2 (en) 2011-08-21 2020-12-15 Transenterix Europe S.A.R.L. Vocally activated surgical control system
US9427330B2 (en) 2011-09-06 2016-08-30 Globus Medical, Inc. Spinal plate
US8864833B2 (en) 2011-09-30 2014-10-21 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US9060794B2 (en) 2011-10-18 2015-06-23 Mako Surgical Corp. System and method for robotic surgery
US8894688B2 (en) 2011-10-27 2014-11-25 Globus Medical Inc. Adjustable rod devices and methods of using the same
DE102011054910B4 (en) 2011-10-28 2013-10-10 Ovesco Endoscopy Ag Magnetic end effector and means for guiding and positioning same
CA2854829C (en) 2011-11-15 2019-07-02 Manickam UMASUTHAN Method of real-time tracking of moving/flexible surfaces
FR2983059B1 (en) 2011-11-30 2014-11-28 Medtech Robotic-assisted method of positioning a surgical instrument relative to the body of a patient, and device for carrying out said method
WO2013084221A1 (en) 2011-12-05 2013-06-13 Mazor Robotics Ltd. Active bed mount for surgical robot
KR101901580B1 (en) 2011-12-23 2018-09-28 삼성전자주식회사 Surgical robot and control method thereof
FR2985167A1 (en) 2011-12-30 2013-07-05 Medtech Robotized medical method for monitoring patient breathing and correcting the robotic trajectory
WO2013101917A1 (en) 2011-12-30 2013-07-04 Mako Surgical Corp. System for image-based robotic surgery
US9265583B2 (en) 2011-12-30 2016-02-23 Mako Surgical Corp. Method for image-based robotic surgery
KR20130080909A (en) 2012-01-06 2013-07-16 삼성전자주식회사 Surgical robot and method for controlling the same
US9138297B2 (en) 2012-02-02 2015-09-22 Intuitive Surgical Operations, Inc. Systems and methods for controlling a robotic surgical system
EP2816966B1 (en) 2012-02-22 2023-10-25 Veran Medical Technologies, Inc. Steerable surgical catheter comprising a biopsy device at the distal end portion thereof
US11207132B2 (en) 2012-03-12 2021-12-28 Nuvasive, Inc. Systems and methods for performing spinal surgery
US8855822B2 (en) 2012-03-23 2014-10-07 Innovative Surgical Solutions, Llc Robotic surgical system with mechanomyography feedback
KR101946000B1 (en) 2012-03-28 2019-02-08 삼성전자주식회사 Robot system and Control Method thereof for surgery
US8888821B2 (en) 2012-04-05 2014-11-18 Warsaw Orthopedic, Inc. Spinal implant measuring system and method
EP2838432A4 (en) 2012-04-16 2015-12-30 Neurologica Corp Wireless imaging system
WO2013158655A1 (en) 2012-04-16 2013-10-24 Neurologica Corp. Imaging system with rigidly mounted fiducial markers
US20140142591A1 (en) 2012-04-24 2014-05-22 Auris Surgical Robotics, Inc. Method, apparatus and a system for robotic assisted surgery
US10383765B2 (en) 2012-04-24 2019-08-20 Auris Health, Inc. Apparatus and method for a global coordinate system for use in robotic surgery
WO2013166098A1 (en) 2012-05-01 2013-11-07 The Johns Hopkins University Improved method and apparatus for robotically assisted cochlear implant surgery
US20140234804A1 (en) 2012-05-02 2014-08-21 Eped Inc. Assisted Guidance and Navigation Method in Intraoral Surgery
US9125556B2 (en) 2012-05-14 2015-09-08 Mazor Robotics Ltd. Robotic guided endoscope
JP2015516278A (en) 2012-05-18 2015-06-11 ケアストリーム ヘルス インク Volumetric imaging system for cone-beam computed tomography
KR20130132109A (en) 2012-05-25 2013-12-04 삼성전자주식회사 Supporting device and surgical robot system adopting the same
EP2854688B1 (en) 2012-06-01 2022-08-17 Intuitive Surgical Operations, Inc. Manipulator arm-to-patient collision avoidance using a null-space
KR102849844B1 (en) 2012-06-01 2025-08-26 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Multi-port surgical robotic system architecture
US20130345757A1 (en) 2012-06-22 2013-12-26 Shawn D. Stad Image Guided Intra-Operative Contouring Aid
EP4234185A3 (en) 2012-06-22 2023-09-20 Board of Regents of the University of Nebraska Local control robotic surgical devices
US20140001234A1 (en) 2012-06-28 2014-01-02 Ethicon Endo-Surgery, Inc. Coupling arrangements for attaching surgical end effectors to drive systems therefor
US8880223B2 (en) 2012-07-16 2014-11-04 Florida Institute for Human & Maching Cognition Anthro-centric multisensory interface for sensory augmentation of telesurgery
US20140031664A1 (en) 2012-07-30 2014-01-30 Mako Surgical Corp. Radiographic imaging device
KR101997566B1 (en) 2012-08-07 2019-07-08 삼성전자주식회사 Surgical robot system and control method thereof
US9770305B2 (en) 2012-08-08 2017-09-26 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
CA2880622C (en) 2012-08-08 2021-01-12 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems and related methods
US10110785B2 (en) 2012-08-10 2018-10-23 Karl Storz Imaging, Inc. Deployable imaging system equipped with solid state imager
WO2014028703A1 (en) 2012-08-15 2014-02-20 Intuitive Surgical Operations, Inc. Systems and methods for cancellation of joint motion using the null-space
MX2015002400A (en) 2012-08-24 2015-11-09 Univ Houston Robotic device and systems for image-guided and robot-assisted surgery.
US20140080086A1 (en) 2012-09-20 2014-03-20 Roger Chen Image Navigation Integrated Dental Implant System
US8892259B2 (en) 2012-09-26 2014-11-18 Innovative Surgical Solutions, LLC. Robotic surgical system with mechanomyography feedback
US9757160B2 (en) 2012-09-28 2017-09-12 Globus Medical, Inc. Device and method for treatment of spinal deformity
KR102038632B1 (en) 2012-11-06 2019-10-30 삼성전자주식회사 Surgical instrument, supporting device, and surgical robot system adopting the same
CN104780862A (en) 2012-11-14 2015-07-15 直观外科手术操作公司 Smart hangers for collision avoidance
KR102079945B1 (en) 2012-11-22 2020-02-21 삼성전자주식회사 Surgical robot and method for controlling the surgical robot
US9008752B2 (en) 2012-12-14 2015-04-14 Medtronic, Inc. Method to determine distribution of a material by an infused magnetic resonance image contrast agent
US9393361B2 (en) 2012-12-14 2016-07-19 Medtronic, Inc. Method to determine a material distribution
US9001962B2 (en) 2012-12-20 2015-04-07 Triple Ring Technologies, Inc. Method and apparatus for multiple X-ray imaging applications
DE102013004459A1 (en) 2012-12-20 2014-06-26 avateramedical GmBH Holding and positioning device of a surgical instrument and / or an endoscope for minimally invasive surgery and a robotic surgical system
DE102012025101A1 (en) 2012-12-20 2014-06-26 avateramedical GmBH Active positioning device of a surgical instrument and a surgical robotic system comprising it
US9002437B2 (en) 2012-12-27 2015-04-07 General Electric Company Method and system for position orientation correction in navigation
WO2014106262A1 (en) 2012-12-31 2014-07-03 Mako Surgical Corp. System for image-based robotic surgery
KR20140090374A (en) 2013-01-08 2014-07-17 삼성전자주식회사 Single port surgical robot and control method thereof
CN103969269B (en) 2013-01-31 2018-09-18 Ge医疗系统环球技术有限公司 Method and apparatus for geometric calibration CT scanner
US20140221819A1 (en) 2013-02-01 2014-08-07 David SARMENT Apparatus, system and method for surgical navigation
CN105101903B (en) 2013-02-04 2018-08-24 儿童国家医疗中心 Hybrid Control Surgical Robotic System
KR20140102465A (en) 2013-02-14 2014-08-22 삼성전자주식회사 Surgical robot and method for controlling the same
KR102117270B1 (en) 2013-03-06 2020-06-01 삼성전자주식회사 Surgical robot system and method for controlling the same
KR20140110620A (en) 2013-03-08 2014-09-17 삼성전자주식회사 surgical robot system and operating method thereof
KR20140110685A (en) 2013-03-08 2014-09-17 삼성전자주식회사 Method for controlling of single port surgical robot
KR20140112207A (en) 2013-03-13 2014-09-23 삼성전자주식회사 Augmented reality imaging display system and surgical robot system comprising the same
US9314308B2 (en) 2013-03-13 2016-04-19 Ethicon Endo-Surgery, Llc Robotic ultrasonic surgical device with articulating end effector
KR102119534B1 (en) 2013-03-13 2020-06-05 삼성전자주식회사 Surgical robot and method for controlling the same
CA2905948C (en) 2013-03-14 2022-01-11 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers
US9629595B2 (en) 2013-03-15 2017-04-25 Hansen Medical, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
EP4628042A2 (en) 2013-03-15 2025-10-08 Virtual Incision Corporation Robotic surgical devices and systems
KR102117273B1 (en) 2013-03-21 2020-06-01 삼성전자주식회사 Surgical robot system and method for controlling the same
KR20140121581A (en) 2013-04-08 2014-10-16 삼성전자주식회사 Surgical robot system
KR20140123122A (en) 2013-04-10 2014-10-22 삼성전자주식회사 Surgical Robot and controlling method of thereof
US9414859B2 (en) 2013-04-19 2016-08-16 Warsaw Orthopedic, Inc. Surgical rod measuring system and method
US8964934B2 (en) 2013-04-25 2015-02-24 Moshe Ein-Gal Cone beam CT scanning
KR20140129702A (en) 2013-04-30 2014-11-07 삼성전자주식회사 Surgical robot system and method for controlling the same
US20140364720A1 (en) 2013-06-10 2014-12-11 General Electric Company Systems and methods for interactive magnetic resonance imaging
DE102013012397B4 (en) 2013-07-26 2018-05-24 Rg Mechatronics Gmbh Surgical robot system
US10786283B2 (en) 2013-08-01 2020-09-29 Musc Foundation For Research Development Skeletal bone fixation mechanism
US20150085970A1 (en) 2013-09-23 2015-03-26 General Electric Company Systems and methods for hybrid scanning
CN105813585B (en) 2013-10-07 2020-01-10 泰克尼恩研究和发展基金有限公司 Needle steering by lever manipulation
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
EP3973899B1 (en) 2013-10-09 2024-10-30 Nuvasive, Inc. Surgical spinal correction
ITBO20130599A1 (en) 2013-10-31 2015-05-01 Cefla Coop Method and apparatus to increase the field of view in a computerized tomographic acquisition with cone-beam technique
US20150146847A1 (en) 2013-11-26 2015-05-28 General Electric Company Systems and methods for providing an x-ray imaging system with nearly continuous zooming capability
US10034717B2 (en) 2014-03-17 2018-07-31 Intuitive Surgical Operations, Inc. System and method for breakaway clutching in an articulated arm
CN110367988A (en) 2014-06-17 2019-10-25 纽文思公司 Plan and assess the device of deformity of spinal column correction during vertebra program of performing the operation in operation
EP3193768A4 (en) 2014-09-17 2018-05-09 Intuitive Surgical Operations, Inc. Systems and methods for utilizing augmented jacobian to control manipulator joint movement
EP3226790B1 (en) 2014-12-04 2023-09-13 Mazor Robotics Ltd. Shaper for vertebral fixation rods
US20160166329A1 (en) 2014-12-15 2016-06-16 General Electric Company Tomographic imaging for interventional tool guidance
CN107645924B (en) 2015-04-15 2021-04-20 莫比乌斯成像公司 Integrated medical imaging and surgical robotic system
US10180404B2 (en) 2015-04-30 2019-01-15 Shimadzu Corporation X-ray analysis device
US20170143284A1 (en) 2015-11-25 2017-05-25 Carestream Health, Inc. Method to detect a retained surgical object
US10070939B2 (en) 2015-12-04 2018-09-11 Zaki G. Ibrahim Methods for performing minimally invasive transforaminal lumbar interbody fusion using guidance
WO2017127838A1 (en) 2016-01-22 2017-07-27 Nuvasive, Inc. Systems and methods for facilitating spine surgery
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US9962133B2 (en) 2016-03-09 2018-05-08 Medtronic Navigation, Inc. Transformable imaging system
US9931025B1 (en) 2016-09-30 2018-04-03 Auris Surgical Robotics, Inc. Automated calibration of endoscopes with pull wires
CN114126527B (en) * 2019-05-31 2025-01-21 直观外科手术操作公司 Composite medical imaging system and method
TWI772917B (en) * 2020-10-08 2022-08-01 國立中央大學 Computer-implemented method, computer-assisted processing device and computer program product for computer-assisted planning of surgical path

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070238981A1 (en) * 2006-03-13 2007-10-11 Bracco Imaging Spa Methods and apparatuses for recording and reviewing surgical navigation processes
US20090128618A1 (en) * 2007-11-16 2009-05-21 Samsung Electronics Co., Ltd. System and method for object selection in a handheld image capture device
US20190239926A1 (en) * 2007-12-18 2019-08-08 Howmedica Osteonics Corporation System and method for image segmentation, bone model generation and modification, and surgical planning
US20190247130A1 (en) * 2009-02-17 2019-08-15 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
EP3861956A1 (en) * 2020-02-04 2021-08-11 Globus Medical, Inc. Extended reality instrument interaction zone for navigated robotic surgery
WO2022072700A1 (en) * 2020-10-01 2022-04-07 Covidien Lp Systems and method of planning thoracic surgery

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12070272B2 (en) 2013-10-10 2024-08-27 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
US12137982B2 (en) 2013-10-10 2024-11-12 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
US12133691B2 (en) 2013-10-10 2024-11-05 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
US12362057B2 (en) 2018-06-19 2025-07-15 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12237066B2 (en) 2018-06-19 2025-02-25 Howmedica Osteonics Corp. Multi-user collaboration and workflow techniques for orthopedic surgical procedures using mixed reality
US12125577B2 (en) 2018-06-19 2024-10-22 Howmedica Osteonics Corp. Mixed reality-aided education using virtual models or virtual representations for orthopedic surgical procedures
US12050999B2 (en) 2018-06-19 2024-07-30 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12046349B2 (en) 2018-06-19 2024-07-23 Howmedica Osteonics Corp. Visualization of intraoperatively modified surgical plans
US12380986B2 (en) 2018-06-19 2025-08-05 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12170139B2 (en) 2018-06-19 2024-12-17 Howmedica Osteonics Corp. Virtual checklists for orthopedic surgery
US12112269B2 (en) 2018-06-19 2024-10-08 Howmedica Osteonics Corp. Mixed reality-aided surgical assistance in orthopedic surgical procedures
US12266440B2 (en) 2018-06-19 2025-04-01 Howmedica Osteonics Corp. Automated instrument or component assistance using mixed reality in orthopedic surgical procedures
US12020801B2 (en) 2018-06-19 2024-06-25 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12458446B2 (en) 2019-05-14 2025-11-04 Howmedica Osteonics Corp. Bone wall tracking and guidance for orthopedic implant placement
US12472013B2 (en) 2019-11-26 2025-11-18 Howmedica Osteonics Corp. Virtual guidance for correcting surgical pin installation
US12465374B2 (en) 2019-12-18 2025-11-11 Howmedica Osteonics Corp. Surgical guidance for surgical tools
US12496135B2 (en) 2021-02-02 2025-12-16 Howmedica Osteonics Corp. Mixed-reality humeral-head sizing and placement
WO2024253828A1 (en) * 2023-06-05 2024-12-12 Apple Inc. Techniques for three-dimensional environments
EP4620421A1 (en) * 2024-03-19 2025-09-24 Stryker European Operations Limited Automatic recentering of surgical navigation view on instrument tip

Also Published As

Publication number Publication date
US12232820B2 (en) 2025-02-25
US20230165640A1 (en) 2023-06-01
US20250228624A1 (en) 2025-07-17

Similar Documents

Publication Publication Date Title
US12232820B2 (en) Extended reality systems with three-dimensional visualizations of medical image scan slices
CN113259584B (en) Camera tracking system
CN113243990B (en) Surgical system
CN113558762B (en) Registering a surgical tool with a reference array tracked by a camera of an augmented reality headset for assisted navigation during surgery
US12115028B2 (en) Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
CN113274128B (en) Surgical systems
US12220176B2 (en) Extended reality instrument interaction zone for navigated robotic surgery
US11737831B2 (en) Surgical object tracking template generation for computer assisted navigation during surgical procedure
CN113768620B (en) Camera tracking pole for a camera tracking system for computer-assisted navigation during surgery
JP7282816B2 (en) Extended Reality Instrument Interaction Zones for Navigated Robotic Surgery
US12201375B2 (en) Extended reality systems for visualizing and controlling operating room equipment
JP2022553385A (en) ENT treatment visualization system and method
US12178523B2 (en) Computer assisted surgical navigation system for spine procedures
JP2021194538A (en) Subject tracking and synthetic imaging in visible light surgery via reference seed
US12394086B2 (en) Accuracy check and automatic calibration of tracked instruments
US20240335240A1 (en) Camera tracking system identifying phantom markers during computer assisted surgery navigation
US20240164844A1 (en) Bone landmarks extraction by bone surface palpation using ball tip stylus for computer assisted surgery navigation
HK40049294A (en) Extended reality instrument interaction zone for navigated robotic surgery
HK40053176A (en) Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: GLOBUS MEDICAL, INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DULIN, ISAAC;CALLOWAY, TOM;SIGNING DATES FROM 20211213 TO 20211217;REEL/FRAME:058428/0761

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED