
US20230368418A1 - Accuracy check and automatic calibration of tracked instruments - Google Patents

Accuracy check and automatic calibration of tracked instruments

Info

Publication number
US20230368418A1
Authority
US
United States
Prior art keywords
tracked instrument
determining
virtual position
virtual
instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/663,024
Inventor
Sanjay M. Joshi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Globus Medical Inc
Original Assignee
Globus Medical Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Globus Medical Inc filed Critical Globus Medical Inc
Priority to US17/663,024 priority Critical patent/US20230368418A1/en
Assigned to GLOBUS MEDICAL, INC. reassignment GLOBUS MEDICAL, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JOSHI, SANJAY M.
Publication of US20230368418A1 publication Critical patent/US20230368418A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00725Calibration or performance testing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2061Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B2034/207Divots for calibration
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2072Reference field transducer attached to an instrument or patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/374NMR or MRI
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • A61B2090/3764Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT] with a rotating C-arm having a cone beam emitting source
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983Reference marker arrangements for use with image guided surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30021Catheter; Guide wire
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Definitions

  • the present disclosure relates to medical devices and systems, and more particularly, to checking accuracy and performing automatic calibration of tracked instruments in camera tracking systems used for computer assisted navigation during surgery.
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, medical imaging devices (e.g., computerized tomography (“CT”) scanners, fluoroscopy imaging, etc.), and surgical robots.
  • a computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient’s anatomy.
  • Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track pose of a reference array on a surgical tool, which is being positioned by a surgeon during surgery, relative to a patient reference array (also “dynamic reference base” (“DRB”)) attached to a patient.
  • the reference arrays allow the camera tracking system to determine a pose of the surgical tool relative to anatomical structure imaged by a medical image of the patient and relative to the patient. The surgeon can thereby use real-time visual feedback of the pose to navigate the surgical tool during a surgical procedure on the patient.
  • FIG. 10 illustrates an example of a trackable instrument 1010 .
  • the CAD model of an instrument 1010 is associated with a reference element 1020 , so that the CAD model can be overlaid on registered images of patient’s anatomy.
  • accuracy of the instrument 1010 needs to be verified prior to use.
  • the accuracy check is typically done via bringing the tip 1040 of the tracked instrument into a divot 1050 associated with another reference element.
  • the divot 1050 is typically a cone-shaped depression ending in an apex.
  • the theoretical position of the tip 1040 is then compared with the theoretical position of the divot 1050 . Assuming the user has properly positioned the instrument 1010 in the divot 1050 , the distance between the two positions determines the accuracy of the tracked instrument 1010 . If the accuracy check does not pass, that instrument 1010 may not be used.
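  • As a rough illustration of this comparison (not taken from the patent; the helper name and the 2 mm tolerance are assumptions), the check reduces to measuring the distance between the tracked tip position and the tracked divot position:

```python
import numpy as np

def divot_accuracy_check(tracked_tip_mm, tracked_divot_mm, tolerance_mm=2.0):
    """Compare the theoretical (tracked) tip position with the theoretical divot
    position, both expressed in the same tracking coordinate system."""
    error_mm = float(np.linalg.norm(
        np.asarray(tracked_tip_mm) - np.asarray(tracked_divot_mm)))
    return error_mm, error_mm <= tolerance_mm

# Example: a tip reported about 0.9 mm from the divot apex passes a 2 mm check.
error, ok = divot_accuracy_check([10.0, 20.0, 30.0], [10.5, 20.3, 30.7])
print(f"tip-to-divot error = {error:.2f} mm, pass = {ok}")
```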
  • a source of inaccuracy during the accuracy check arises due to it being challenging for a user to place an instrument accurately in the divot.
  • the ideal position for a sharp instrument is along the normal from the apex to the base of the cone of the divot. Any deviation from this angle introduces small errors.
  • a bad-acting user may move the position of the instrument to produce a false accuracy number (that appears more accurate).
  • a source of inaccuracy during the accuracy check arises due to inaccuracy in tracking of the two reference elements (one associated with the tracked instrument and one associated with the divot).
  • the reference element arrays are typically small in size (e.g., only a few centimeters wide) to minimize obstruction of the surgical area.
  • the number of markers is also usually limited to optimize costs and workflow. A larger array with more markers can improve the accuracy of the divot position.
  • a source of inaccuracy during the accuracy check arises due to a shape of the instrument tip.
  • Blunt tip instruments may not fit well inside the divot and instruments with angled tips or a hook shape can make it even more difficult to properly place the instrument tip in the divot.
  • a source of inaccuracy during the accuracy check is a deformed instrument.
  • the source of inaccuracies includes a deformed reference element. Note that a slight angular shift in the reference element can result in a very small error for tracking of the reference element, but may result in a much larger error at the instrument tip.
  • the sources of inaccuracies include inaccuracies in optical markers due to manufacturing defects, smudges, or inaccurate mounting of optical markers on mounting posts. All of these are solvable problems, though. If an instrument can be calibrated at the time of use, the fidelity of tracking can be improved so that the physical tip matches the estimated tip.
  • Some embodiments of the present disclosure are directed to performing an accuracy check and calibrating tracked instruments used in surgical procedures.
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a virtual position within a virtual space of a display device.
  • the operations further include determining a virtual position within the virtual space of the tracked instrument.
  • the operations further include determining a point of contact on the display device between the tracked instrument and the display device.
  • the operations further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument.
  • the operations further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a first virtual position within a virtual space of an emitter of an imaging device.
  • the operations further include determining a first virtual position within the virtual space of a detector of the imaging device.
  • the operations further include determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector.
  • the operations further include determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument.
  • the operations further include obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector.
  • the operations further include determining a second virtual position within the virtual space of the emitter of the imaging device.
  • the operations further include determining a second virtual position within the virtual space of the detector of the imaging device.
  • the operations further include determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector.
  • the operations further include determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument.
  • the operations further include obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image.
  • the operations further include determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
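  • A rough sketch of how such expected images can be formed and compared (an illustration only, not the patent's method: a simple point-source projection model is assumed, and the tolerance, coordinates, and helper names are made up). The tracked tip is projected from the tracked emitter position onto the tracked detector plane for each shot, and the two expected projections are compared with the tip locations actually seen in the two images:

```python
import numpy as np

def project_onto_detector(emitter_xyz, detector_center_xyz, detector_normal, point_xyz):
    """Project a 3D point onto the detector plane along the ray from the x-ray
    emitter through the point (simple point-source projection model)."""
    e, c, n, p = (np.asarray(v, dtype=float)
                  for v in (emitter_xyz, detector_center_xyz, detector_normal, point_xyz))
    ray = p - e
    t = np.dot(c - e, n) / np.dot(ray, n)   # ray/plane intersection parameter
    return e + t * ray

def shots_agree(expected_points, observed_points, tol_mm=1.0):
    """Accurate if both expected projections match the tip seen in the two images."""
    return all(np.linalg.norm(np.asarray(a) - np.asarray(b)) <= tol_mm
               for a, b in zip(expected_points, observed_points))

tip = [5.0, -3.0, 500.0]                                                            # tracked tip position
expected_1 = project_onto_detector([0, 0, 0], [0, 0, 1000], [0, 0, 1], tip)        # first shot
expected_2 = project_onto_detector([0, 800, 500], [0, -200, 500], [0, 1, 0], tip)  # second shot
print(shots_agree([expected_1, expected_2], [expected_1, expected_2]))  # -> True
```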
  • a system configured to perform an accuracy check of a tracked instrument.
  • the system includes processing circuitry and memory coupled to the processing circuitry.
  • the memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations.
  • the operations include determining a virtual position within a virtual space of the tracked instrument relative to a display device.
  • the operations further include displaying an indication of the virtual position of the tracked instrument on the display device.
  • the operations further include receiving an indication of an actual position of the tracked instrument relative to the display device.
  • the operations further include determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and which may further include a surgical robot for robotic assistance according to some embodiments;
  • FIG. 2 illustrates the camera tracking system and the surgical robot positioned relative to a patient according to some embodiments
  • FIG. 3 further illustrates the camera tracking system and the surgical robot configured according to some embodiments
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset, a computer platform, imaging devices, and a surgical robot which are configured to operate according to some embodiments;
  • FIG. 5 illustrates a patient reference array (“DRB”) and a surveillance marker
  • FIGS. 6 A-C respectively illustrate a surgical robot with an end-effector, an expanded view of the end-effector, and a surgical tool in accordance with some embodiments;
  • FIGS. 7 A-B are schematic diagrams illustrating examples of imaging devices according to some embodiments.
  • FIG. 8 is a block diagram illustrating an example of an imaging system according to some embodiments.
  • FIG. 9 is a block diagram illustrating an example of an accuracy and calibration module according to some embodiments.
  • FIG. 10 is a schematic diagram illustrating an example of a tracked instrument according to some embodiments.
  • FIG. 11 is a schematic diagram illustrating an example of a set of display devices configured to interact with a tracked instrument according to some embodiments
  • FIG. 12 is a schematic diagram illustrating an example of the set of display devices of FIG. 11 being contacted by a tracked instrument according to some embodiments;
  • FIG. 13 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on contact with a display device according to some embodiments
  • FIG. 14 is a schematic diagram illustrating an example of a C-arm imaging device according to some embodiments.
  • FIGS. 15 A-B are schematic diagrams illustrating images taken of a tracked instrument using the C-arm imaging device at two different positions according to some embodiments
  • FIG. 16 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on images taken of the tracked instrument according to some embodiments;
  • FIG. 17 is a schematic diagram of a display device configured to show an expected position of a tracked instrument according to some embodiments.
  • FIGS. 18 - 20 are flowcharts of operations performed by a system to perform an accuracy check of tracked instruments according to some embodiments.
  • Various embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of candidate markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras.
  • Various components that may be used for performing embodiments in a navigated surgery system are described with reference to FIGS. 1 - 9 .
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets 150 during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery during a surgical procedure and which may further include a surgical robot 100 for robotic assistance, according to some embodiments.
  • FIG. 2 illustrates the camera tracking system 200 and the surgical robot 100 positioned relative to a patient, according to some embodiments.
  • FIG. 3 further illustrates the camera tracking system 200 and the surgical robot 100 configured according to some embodiments.
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150 , a computer platform 400 , imaging devices 420 , and the surgical robot 100 which are configured to operate according to some embodiments.
  • FIG. 5 illustrates a patient reference array 116 (also “dynamic reference base” (DRB)) and a surveillance marker 500 .
  • the XR headset 150 may be configured to augment a real-world scene with computer generated XR images.
  • the XR headset 150 may be configured to provide an augmented reality (“AR”) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user.
  • the XR headset 150 may be configured to provide a virtual reality (“VR”) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer-generated AR images on a display screen.
  • the XR headset 150 can be configured to provide both AR and VR viewing environments.
  • the term XR headset can be referred to as an AR headset or a VR headset.
  • the surgical robot 100 may include, for example, one or more robot arms 104 , a display 110 , an end-effector 112 , for example, including a guide tube 114 , and an end effector reference array which can include one or more tracking markers.
  • a patient reference array 116 (“DRB”) has a plurality of tracking markers 117 and is secured directly to the patient 210 (e.g., to a bone of the patient 210 ).
  • a spaced apart surveillance marker 500 ( FIG. 5 ) has a single marker 502 connected to a shaft that is secured directly to the patient 210 at a spaced apart location from the patient reference array 116 .
  • Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.
  • the camera tracking system 200 includes tracking cameras 204 which may be spaced apart stereo cameras configured with partially overlapping field-of-views.
  • the camera tracking system 200 can have any suitable configuration of arm(s) 202 to move, orient, and support the tracking cameras 204 in a desired location, and may contain at least one processor operable to track location of an individual marker and pose of an array of markers.
  • the term “pose” refers to the location (e.g., along 3 orthogonal axes) and/or the rotation angle (e.g., about the 3 orthogonal axes) of markers (e.g., DRB) relative to another marker (e.g., surveillance marker) and/or to a defined coordinate system (e.g., camera coordinate system).
  • a pose may therefore be defined based on only the multidimensional location of the markers relative to another marker and/or relative to the defined coordinate system, based on only the multidimensional rotational angles of the markers relative to the other marker and/or to the defined coordinate system, or based on a combination of the multidimensional location and the multidimensional rotational angles.
  • the term “pose” therefore is used to refer to location, rotational angle, or combination thereof.
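  • As an illustration of this definition (a sketch only; the patent does not prescribe any particular representation, and the numeric values are hypothetical), a pose combining location and rotation angles can be handled as a 4x4 homogeneous transform, which also makes it easy to express one tracked array relative to another:

```python
import numpy as np

def make_pose(rotation_3x3, translation_xyz):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = translation_xyz
    return T

# Hypothetical poses reported in camera coordinates (millimetres).
cam_T_drb = make_pose(np.eye(3), [100.0, 50.0, 1200.0])    # patient reference array (DRB)
cam_T_tool = make_pose(np.eye(3), [130.0, 40.0, 1185.0])   # instrument reference array

# Pose of the instrument array expressed relative to the DRB (common patient reference).
drb_T_tool = np.linalg.inv(cam_T_drb) @ cam_T_tool
print(drb_T_tool[:3, 3])   # -> [ 30. -10. -15.]
```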
  • the tracking cameras 204 may include, e.g., infrared cameras (e.g., bifocal or stereophotogrammetric cameras), operable to identify, for example, active and passive tracking markers for single markers (e.g., surveillance marker 500 ) and reference arrays which can be formed on or attached to the patient 210 (e.g., patient reference array, DRB), end effector 112 (e.g., end effector reference array), XR headset(s) 150 worn by a surgeon 120 and/or a surgical assistant 126 , etc. in a given measurement volume of a camera coordinate system while viewable from the perspective of the tracking cameras 204 .
  • the tracking cameras 204 may scan the given measurement volume and detect light that is emitted or reflected from the markers in order to identify and determine locations of individual markers and poses of the reference arrays in three-dimensions.
  • active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (“LEDs”)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking cameras 204 or other suitable device.
  • the XR headsets 150 may each include tracking cameras (e.g., spaced apart stereo cameras) that can track location of a surveillance marker and poses of reference arrays within the XR camera headset field-of-views (“FOVs”) 152 and 154 , respectively. Accordingly, as illustrated in FIG. 1 , the location of the surveillance marker and the poses of reference arrays on various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 and/or a FOV 600 of the tracking cameras 204 .
  • FIGS. 1 - 2 illustrate a potential configuration for the placement of the camera tracking system 200 and the surgical robot 100 in an operating room environment.
  • Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 and/or other displays 34 , 36 , and 110 to display surgical procedure navigation information.
  • the surgical robot 100 is optional during computer-aided navigated surgery.
  • the camera tracking system 200 may operate using tracking information and other information provided by multiple XR headsets 150 such as inertial tracking information and optical tracking information (frames of tracking data).
  • the XR headsets 150 operate to display visual information and may play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 100 and/or other medical equipment), remote sources (e.g., patient medical image server), and/or other electronic equipment.
  • the camera tracking system 200 may track markers in 6 degrees-of-freedom (“6DOF”) relative to three axes of a 3D coordinate system and rotational angles about each axis.
  • the XR headsets 150 may also operate to track hand poses and gestures to enable gesture-based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 may have a 1-10x magnification digital color camera sensor called a digital loupe. In some embodiments, one or more of the XR headsets 150 are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • An “outside-in” machine vision navigation bar supports the tracking cameras 204 and may include a color camera.
  • the machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 while positioned on wearers’ heads.
  • the patient reference array 116 (DRB) is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112 , instrument reference array 170 , and reference arrays on the XR headsets 150 .
  • the surveillance marker 500 is affixed to the patient to provide information on whether the patient reference array 116 has shifted. For example, during a spinal fusion procedure with planned placement of pedicle screw fixation, two small incisions are made over the posterior superior iliac spine bilaterally. The DRB and the surveillance marker are then affixed to the posterior superior iliac spine bilaterally. If the surveillance marker’s 500 location changes relative to the patient reference array 116 , the camera tracking system 200 may display a meter indicating the amount of movement and/or may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation which is off target.
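  • A minimal sketch of such a surveillance check (the 1.5 mm warning threshold and the function name are assumptions, not values from the patent): if the surveillance marker, expressed in the DRB frame, drifts from where it was at registration, the user is warned.

```python
import numpy as np

def surveillance_marker_check(marker_in_drb_at_registration, marker_in_drb_now, warn_mm=1.5):
    """Both arguments are the surveillance marker position expressed in the DRB frame.
    A large change suggests the patient reference array (DRB) was bumped."""
    moved_mm = float(np.linalg.norm(
        np.asarray(marker_in_drb_now) - np.asarray(marker_in_drb_at_registration)))
    if moved_mm > warn_mm:
        print(f"WARNING: surveillance marker moved {moved_mm:.2f} mm relative to the DRB; "
              "patient registration may no longer be valid.")
    return moved_mm

print(surveillance_marker_check([0.0, 0.0, 0.0], [0.3, -0.2, 0.1]))  # ~0.37 mm, no warning
```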
  • the surgical robot (also “robot”) may be positioned near or next to patient 210 .
  • the robot 100 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the surgical procedure.
  • the camera tracking system 200 may be separated from the robot system 100 and positioned at the foot of patient 210 . This location allows the camera tracking system 200 to have a direct visual line of sight to the surgical area 208 .
  • the surgeon 120 may be positioned across from the robot 100 , but is still able to manipulate the end-effector 112 and the display 110 .
  • a surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110 . If desired, the locations of the surgeon 120 and the assistant 126 may be reversed.
  • An anesthesiologist 122 , nurse or scrub tech can operate equipment which may be connected to display information from the camera tracking system 200 on a display 34 .
  • the display 110 can be attached to the surgical robot 100 or in a remote location.
  • End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor.
  • end-effector 112 can comprise a guide tube 114 , which is configured to receive and orient a surgical instrument, tool, or implant used to perform a surgical procedure on the patient 210 .
  • end-effector is used interchangeably with the terms “end-effectuator” and “effectuator element.”
  • instrument is used in a nonlimiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein.
  • Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as a screws, spacers, interbody fusion devices, plates, rods, etc.
  • end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery.
  • end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument in a desired manner.
  • the surgical robot 100 is operable to control the translation and orientation of the end-effector 112 .
  • the robot 100 may move the end-effector 112 under computer control along x-, y-, and z-axes, for example.
  • the end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis, such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled.
  • selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a 6DOF robot arm comprising only rotational axes.
  • the surgical robot 100 may be used to operate on patient 210 , and robot arm 104 can be positioned above the body of patient 210 , with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210 .
  • the XR headsets 150 can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • surgical robot 100 can be operable to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory.
  • the surgical robot 100 can be operable to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument.
  • a surgeon or other user can use the surgical robot 100 as part of computer assisted navigated surgery, and has the option to stop, modify, or manually control the autonomous or semi-autonomous movement of the end-effector 112 and/or the surgical instrument.
  • Reference arrays of markers can be formed on or connected to robot arms 102 and/or 104 , the end-effector 112 (e.g., end-effector array 114 in FIG. 2 ), and/or a surgical instrument (e.g., instrument array 170 ) to track poses in 6DOF along 3 orthogonal axes and rotation about the axes.
  • the reference arrays enable each of the marked objects (e.g., the end-effector 112 , the patient 210 , and the surgical instruments) to be tracked by the tracking camera 200 , and the tracked poses can be used to provide navigated guidance during a surgical procedure and/or used to control movement of the surgical robot 100 for guiding the end-effector 112 and/or an instrument manipulated by the end-effector 112 .
  • the surgical robot 100 may include a display 110 , upper arm 102 , lower arm 104 , end-effector 112 , vertical column 312 , casters 314 , a table 318 , and ring 324 which uses lights to indicate statuses and other information.
  • Cabinet 106 may house electrical components of surgical robot 100 including, but not limited to, a battery, a power distribution module, a platform interface board module, and a computer.
  • the camera tracking system 200 may include a display 36 , tracking cameras 204 , arm(s) 202 , a computer housed in cabinet 330 , and other components.
  • perpendicular 2D scan slices such as axial, sagittal, and/or coronal views, of patient anatomical structure are displayed to enable user visualization of the patient’s anatomy alongside the relative poses of surgical instruments.
  • An XR headset or other display can be controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy.
  • the 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which isn’t necessarily formed from a scan of the patient.
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150 , a computer platform 400 , imaging devices 420 , and a surgical robot 100 which are configured to operate according to some embodiments.
  • the imaging devices 420 may include a C-arm imaging device, an O-arm imaging device, and/or a patient image database.
  • the XR headset 150 provides an improved human interface for performing navigated surgical procedures.
  • the XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 400 , that include without limitation any one or more of: identification of hand gesture based commands, and display of XR graphical objects on a display device 438 of the XR headset 150 and/or another display device.
  • the display device 438 may include a video projector, flat panel display, etc.
  • the user may view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen.
  • the XR headset 150 may additionally or alternatively be configured to display on the display device 438 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 430 , a microphone 432 , a gesture sensor 434 , a pose sensor (e.g., inertial measurement unit (“IMU”)) 436 , the display device 438 , and a wireless/wired communication interface 440 .
  • the cameras 430 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • the cameras 430 may be configured to operate as the gesture sensor 434 by tracking for identification of user hand gestures performed within the field of view of the camera(s) 430 .
  • the gesture sensor 434 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 434 and/or senses physical contact, e.g., tapping on the sensor 434 or its enclosure.
  • the pose sensor 436 e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • a surgical system includes the camera tracking system 200 which may be connected to a computer platform 400 for operational processing and which may provide other operational functionality including a navigation controller 404 and/or of an XR headset controller 410 .
  • the surgical system may include the surgical robot 100 .
  • the navigation controller 404 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 200 .
  • the navigation controller 404 may be further configured to generate navigation information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 100 , where the steering information is displayed through the display device 438 of the XR headset 150 and/or another display device to indicate where the surgical tool and/or the end effector of the surgical robot 100 should be moved to perform the surgical plan.
  • the electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 400 through the wired/wireless interface 440 .
  • the electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 400 or directly connected, to various imaging devices 420 , e.g., the C-arm imaging device, the O-arm imaging device, the patient image database, and/or to other medical equipment through the wired/wireless interface 440 .
  • the surgical system may include a XR headset controller 410 that may at least partially reside in the XR headset 150 , the computer platform 400 , and/or in another system component connected via wired cables and/or wireless communication links.
  • Various functionality is provided by software executed by the XR headset controller 410 .
  • the XR headset controller 410 is configured to receive information from the camera tracking system 200 and the navigation controller 404 , and to generate an XR image based on the information for display on the display device 438 .
  • the XR headset controller 410 can be configured to operationally process frames of tracking data from the cameras 430 (tracking cameras), signals from the microphone 432 , and/or information from the pose sensor 436 and the gesture sensor 434 , to generate information for display as XR images on the display device 438 and/or for display on other display devices for user viewing.
  • the XR headset controller 410 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user.
  • the XR headset controller 410 may reside within the computer platform 400 which, in turn, may reside within the cabinet 330 of the camera tracking system 200 , the cabinet 106 of the surgical robot 100 , etc.
  • the surgical robot system 100 relies on accurate positioning of the end-effector 112 , surgical instruments 608 , and/or the patient 210 (e.g., patient reference array 116 ) relative to the desired surgical area.
  • the reference arrays include tracking markers 118 , 804 which are rigidly attached to a portion of the instrument 608 and/or end-effector 112 .
  • FIG. 6 A depicts part of the surgical robot system 100 with the robot 102 including base 106 , robot arm 104 , and end-effector 112 .
  • the other elements, not illustrated, such as the display, marker tracking cameras, etc. may also be present as described herein.
  • FIG. 6 B depicts a close-up view of the end-effector 112 with guide tube 114 and a reference array that includes a plurality of tracking markers 118 rigidly affixed to the end-effector 112 .
  • the plurality of tracking markers 118 are attached to the end-effector 112 configured as a guide tube.
  • FIG. 6 C depicts an instrument 608 (in this case, a probe) with a plurality of tracking markers 804 rigidly affixed to the instrument 608 .
  • the instrument 608 could include any suitable surgical instrument, such as, but not limited to, guide wire, cannula, a retractor, a drill, a reamer, a screwdriver, an insertion instrument, a removal instrument, or the like.
  • the reference array 612 functions as the handle 620 of the instrument 608 .
  • Four markers 804 are attached to the handle 620 in a manner that is out of the way of the shaft 622 and tip 624 .
  • Stereophotogrammetric tracking by the tracking camera 200 of these four markers 804 allows the instrument 608 to be tracked as a rigid body and for the system 100 to precisely determine the location of the tip 624 and the orientation of the shaft 622 while the instrument 608 is moved within view of tracking camera 200 .
  • the markers 118 , 804 on each instrument 608 , end-effector 112 , or the like may be arranged asymmetrically with a known inter-marker spacing.
  • the reason for asymmetric alignment is so that it is unambiguous which marker 118 , 804 corresponds to a particular pose on the rigid body and whether markers 118 , 804 are being viewed from the front or back, i.e., mirrored.
  • each array 612 and thus each instrument 608 , end-effector 112 , or other object to be tracked should have a unique marker pattern to allow it to be distinguished from other instruments 608 or other objects being tracked.
  • Asymmetry and unique marker patterns allow the tracking camera 200 and system 100 to detect individual markers 118 , 804 then to check the marker spacing against a stored template to determine which instrument 608 , end-effector 112 , or another object they represent. Detected markers 118 , 804 can then be sorted automatically and assigned to each tracked object in the correct order. Without this information, rigid body calculations could not then be performed to extract key geometric information, for example, such as instrument tip 624 and alignment of the shaft 622 , unless the user manually specified which detected marker 118 , 804 corresponded to which position on each rigid body.
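  • One simple way to realize this template matching (an illustrative sketch, not the patent's algorithm; the marker coordinates and the 0.5 mm tolerance are hypothetical) is to compare the sorted inter-marker distances of the detected markers against each stored template:

```python
import numpy as np
from itertools import combinations

def distance_fingerprint(points):
    """Sorted inter-marker distances; an asymmetric array gives a unique fingerprint."""
    pts = np.asarray(points, dtype=float)
    return sorted(np.linalg.norm(pts[i] - pts[j])
                  for i, j in combinations(range(len(pts)), 2))

def match_template(detected_markers, templates, tol_mm=0.5):
    """Return the name of the stored template whose fingerprint matches the detection."""
    detected = distance_fingerprint(detected_markers)
    for name, template_markers in templates.items():
        expected = distance_fingerprint(template_markers)
        if len(detected) == len(expected) and all(
                abs(d - e) <= tol_mm for d, e in zip(detected, expected)):
            return name
    return None

templates = {"probe_608": [(0, 0, 0), (40, 0, 0), (0, 55, 0), (25, 90, 0)]}
seen = [(100, 200, 300), (140, 200, 300), (100, 255, 300), (125, 290, 300)]
print(match_template(seen, templates))  # -> "probe_608"
```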
  • FIGS. 7 A-B illustrate medical imaging systems 1304 that may be used in conjunction with robot system 100 and/or navigation systems to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210 .
  • Any appropriate subject matter may be imaged for any appropriate procedure using the imaging system 1304 .
  • the imaging system 1304 may be any imaging device such as a C-arm 1308 device, an O-arm 1306 device, a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for frequent manual repositioning of patient 210 which may be required in an x-ray system.
  • As illustrated in FIG. 7 A , the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape.
  • C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316 .
  • the space within C-arm 1308 of the arm may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318 .
  • the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328 , such as a wheeled mobile cart 1330 with wheels 1332 , which may enclose an image capturing portion, not illustrated.
  • the image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion.
  • the image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition.
  • the image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes.
  • FIG. 8 illustrates a block diagram of components of a medical imaging system configured in accordance with some embodiments of the present disclosure.
  • the medical imaging system includes a controller 3200 , an imaging arm 3240 (e.g., a C-arm or an O-arm), and a linear actuator and/or rotary actuator 3250 connected to an X-ray beam emitter or collector 3260 .
  • the controller 3200 includes an image processor 3210 , a general processor 3220 , and an I/O interface 3230 .
  • the image processor 3210 performs image processing to combine sets of images to generate a three-dimensional image of the scanned volume.
  • the general processor 3220 is used to perform various embodiments of the present disclosure.
  • the I/O interface 3230 communicatively couples the controller 3200 to other components of the medical imaging system.
  • the imaging arm 3240 includes motors 3245 used to move the collector and emitter along an arc, e.g., three hundred and sixty degrees, during image acquisition. The motors 3245 are controlled by the controller 3200 .
  • the controller 3200 can also control movement of the linear actuator and/or rotary actuator 3250 .
  • FIG. 9 illustrates an example of an accuracy and calibration module 3300 .
  • the accuracy and calibration module 3300 can include an interface 3310 , processing circuitry 3320 , and a memory 3330 .
  • the accuracy and calibration module is part of a system (e.g., an imaging system or a camera tracking system).
  • the memory 3330 can include instructions stored therein that are executable by the processing circuitry to perform operations according to some embodiments herein.
  • Embodiments that include performing an accuracy check and/or calibrating of a tracked instrument based on contact with a touch sensor are described below.
  • multiple points of contact can be detected by one or more touchpads that are themselves tracked by a navigation camera.
  • the instruments and the pressure touchpads can each have associated reference elements that are tracked by the navigation camera.
  • the touchpads are sensitive to pressure, capacitance, or resistance.
  • FIG. 11 illustrates an example of a set of touchpads 1110 coupled together to create an opening for accepting a tip of the tracked instrument.
  • the associated reference element 1120 is coupled to the touchpads.
  • the touchpads and reference arrays are securely housed in a supporting structure 1130 to reduce movement.
  • the touchpads 1110 can capture the location of pressure points. Resistive touchpads are especially useful, since they do not rely on the capacitance of the object.
  • When an instrument is brought into the wedge, it touches at least two points on the touchpads 1110 .
  • the touchpads 1110 then send the location of sensed points to the system.
  • the system also receives the position and pose of the touchpads and instruments via their associated reference elements 1120 . Thus, the system can calculate the theoretical position of the tip of the instrument under test. It can then compare the tip location to the locations reported by the three touchpads 1110 .
  • the bottom touchpad would report the position of a sharp or semi-sharp instrument tip.
  • For a broader instrument (such as an Osteotome), the approximate position of the CAD model with respect to the touchpads is already known to the system based on the tracking information reported by the camera. Thus, the accuracy of the physical model can be calculated.
  • FIG. 12 illustrates an example of a tip of a tracked instrument 1240 contacting the touchpads 1110 .
  • the wedge shape of the opening between the touchpads 1110 allows an accuracy check of instruments with tips that are too big to fit in a typical divot used in navigation arrays.
  • FIG. 13 illustrates an example of operations to perform an accuracy check and calibrate a tracked instrument based on contact between the tracked instrument and the display devices.
  • the reported touchpad points are compared against the theoretical model.
  • the user touches the instrument tip on all three touchpads in a way that the reference elements of both the instrument and the touchpad structure are visible to the tracking camera.
  • the theoretical position of the instrument tip with respect to touchpads is then calculated. This serves as the initial position estimate of the instrument tip. Since the relative position of three touchpads is known, the theoretical touchpoints of the CAD model for each touchpad are then calculated.
  • the optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the theoretical touchpoints and the actual ones as shown in the algorithm below.
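  • The referenced algorithm is not reproduced in this text; the following is only a minimal sketch of such an optimization (assuming a least-squares pose correction via SciPy; the small-angle parameterization, sample points, and helper names are illustrative, not the patent's):

```python
import numpy as np
from scipy.optimize import least_squares

def small_rotation(rx, ry, rz):
    """First-order (small-angle) rotation matrix used for the pose correction."""
    return np.array([[1.0, -rz,  ry],
                     [ rz, 1.0, -rx],
                     [-ry,  rx, 1.0]])

def refine_instrument_pose(theoretical_touchpoints, measured_touchpoints):
    """Tweak the position and pose of the instrument's CAD model so that its
    theoretical touchpoints line up with the points reported by the touchpads."""
    theo = np.asarray(theoretical_touchpoints, dtype=float)
    meas = np.asarray(measured_touchpoints, dtype=float)

    def residuals(params):
        rx, ry, rz, tx, ty, tz = params
        corrected = theo @ small_rotation(rx, ry, rz).T + np.array([tx, ty, tz])
        return (corrected - meas).ravel()

    result = least_squares(residuals, x0=np.zeros(6))
    return result.x   # small rotation (rad) and translation (mm) correction

theo = [[0.0, 0.0, 0.0], [2.0, 0.0, 1.0], [0.0, 2.0, 1.0]]   # from CAD model + tracking
meas = [[0.4, 0.1, 0.0], [2.4, 0.1, 1.0], [0.4, 2.1, 1.0]]   # reported by the touchpads
print(refine_instrument_pose(theo, meas))   # translation ~ (0.4, 0.1, 0.0)
```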
  • these operations improve accuracy checks for instruments without a sharp tip or instruments that are too wide to fit in a traditional divot. In additional or alternative embodiments, these operations allow re-calibration or correction of the theoretical instrument tip location based on actual measurements.
  • FIG. 18 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a point of contact between the tracked instrument and a touch sensor.
  • Although the operations are described below as being performed by the accuracy and calibration module 3300 , any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a virtual position of the touch sensor.
  • the term virtual position is used herein to describe a virtual location and a virtual pose of an object.
  • the system includes a camera. Determining the virtual position of the touch sensor includes: determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor; capturing, via the camera, an image of the reference element coupled to the touch sensor; determining a virtual position of the reference element coupled to the touch sensor relative to a dynamic reference base (“DRB”) based on the image of the reference element coupled to the touch sensor; and determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor.
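  • As a rough illustration of that chain of transforms (the values and frame names below are hypothetical), the touch sensor's virtual position relative to the DRB can be obtained by composing the camera-to-reference-element pose with the known sensor-to-reference-element shape information; the virtual position of the tracked instrument can be formed the same way:

```python
import numpy as np

def make_pose(R, t):
    """4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Poses reported by the tracking camera (hypothetical values, millimetres).
cam_T_drb        = make_pose(np.eye(3), [0.0, 0.0, 1500.0])      # patient reference array (DRB)
cam_T_sensor_ref = make_pose(np.eye(3), [120.0, -40.0, 1450.0])  # reference element on the sensor
# Known shape information: sensor surface relative to its own reference element.
sensorref_T_sensor = make_pose(np.eye(3), [0.0, 25.0, 10.0])

# Virtual position of the touch sensor expressed in the DRB (patient) frame.
drb_T_sensor = np.linalg.inv(cam_T_drb) @ cam_T_sensor_ref @ sensorref_T_sensor
print(drb_T_sensor[:3, 3])   # -> [120. -15. -40.]
```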
  • processing circuitry 3320 determines a virtual position of the tracked instrument.
  • the virtual position of the touch sensor and the virtual position of the tracked instrument are within the same virtual space (e.g., relative to a common reference point).
  • the system includes a camera.
  • determining the virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument; and determining the virtual position of the tracked instrument based on the information about the shape of the tracked instrument and the virtual position of the reference element coupled to the tracked instrument.
  • processing circuitry 3320 determines a point of contact on a touch sensor between the tracked instrument and the touch sensor.
  • the system includes the touch sensor and the touch sensor includes a touchscreen (e.g., a pressure sensitive, resistance sensitive, or capacitance sensitive touchscreen).
  • the touch sensor is part of a display device. Determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
  • the touch sensor includes a plurality of touch sensors coupled together to form an opening. Determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
  • processing circuitry 3320 determines an expected point of contact on the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • information about the shape of the tracked instrument is determined, and the information indicates an intended position of a tip of the tracked instrument relative to a reference element coupled to the tracked instrument.
  • Determining the point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor.
  • Determining the expected point of contact on the touch sensor can include determining an expected point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • processing circuitry 3320 displays an indication of the expected point of contact.
  • the system includes a display device that includes the touch sensor. Determining the point of contact on the touch sensor between the tracked instrument and the touch sensor includes receiving an indication of the point of contact on the touch sensor from a user in response to displaying the indication of the expected point of contact.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
  • Various operations of FIG. 18 may be optional. For example, blocks 1850 and 1870 may be optional in some embodiments. A simplified sketch of the contact comparison underlying these operations follows.
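  • The sketch below is one concrete illustration of the comparison described above for FIG. 18: the tracked tip is projected onto the touch-sensor plane to obtain the expected point of contact, which is then compared with the reported touch point. The coordinate conventions, the 2 mm threshold, and the function names are assumptions made only for illustration.

```python
# Sketch: compute the expected point of contact by projecting the tracked tip
# onto the touch-sensor plane, then compare it with the reported touch point.
# screen_x and screen_y are assumed to be orthonormal unit vectors spanning
# the sensor plane; all positions are in mm in a common tracking frame.
import numpy as np


def expected_contact(tip_position, screen_origin, screen_x, screen_y):
    normal = np.cross(screen_x, screen_y)
    normal = normal / np.linalg.norm(normal)
    offset = np.asarray(tip_position, float) - np.asarray(screen_origin, float)
    height = float(offset @ normal)          # out-of-plane distance of the tip
    in_plane = offset - height * normal
    return np.array([in_plane @ screen_x, in_plane @ screen_y]), height


def contact_accuracy(reported_xy, expected_xy, threshold_mm=2.0):
    """Return the contact error and whether it is within the assumed threshold."""
    error = float(np.linalg.norm(np.asarray(reported_xy) - np.asarray(expected_xy)))
    return error, error <= threshold_mm
```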
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on an image taken by a tracked imaging device are described below.
  • multiple x-ray views of one or more tracked instruments are taken with a fluoroscope that is tracked by a navigation camera using an attached registration fixture.
  • registration fixtures are commonly used for surgical navigation using fluoroscopy.
  • FIG. 14 illustrates an example of an imaging device 1410 including an x-ray emitter 1420 and an x-ray detector 1430.
  • the registration fixture 1440 is coupled to a predetermined portion of the imaging device 1410.
  • the registration fixture 1440 typically includes fiducials in two planes at known positions. These fiducials are detected in the captured x-ray images. Using the known positions, the relative position of the emitter 1420 is then computed. The position of the detector 1430 is tracked via a navigation camera using the reference element attached to the registration fixture 1440. When an instrument tracked with a reference element is brought between the emitter and detector, its relative position with respect to the registration fixture 1440 is calculated.
  • the CAD model of the associated instrument tip can then be projected on the fluoroscopy image to achieve navigation. Since the registration fixture can move after the x-ray image is captured, often a different reference element, called a DRB, is solidly attached to the patient, so that all tracked positions are relative to the fixed DRB.
  • FIGS. 15 A-B illustrate an example in which a wedge-shaped tracked instrument is placed between the emitter 1420 and detector 1430 , such that its views are captured by the fluoroscope in two positions.
  • the corresponding images 1570a-b below the fluoroscope show the instrument profile at different angles. Note that most instruments are solid and made of metal, which absorbs most x-rays and shows up dark on an x-ray image.
  • the actual projection can then be compared to the theoretical projection by detecting the dark instrument shape in the bright image. Thus, the accuracy can be calculated without needing a divot.
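  • A minimal sketch of the projection geometry underlying this comparison is shown below, treating the emitter as a point source and the detector as a plane. The function names and the assumption that all positions are expressed in one tracking frame are illustrative, not taken from the disclosure.

```python
# Sketch: project a point on the instrument onto the detector plane by casting
# a ray from the x-ray emitter (treated as a point source) through the point.
# Positions are assumed to be expressed in one common tracking frame, in mm.
import numpy as np


def project_to_detector(point, emitter, detector_origin, detector_normal):
    """Return the 3D location where the emitter-to-point ray meets the detector."""
    point, emitter = np.asarray(point, float), np.asarray(emitter, float)
    direction = point - emitter
    # Ray/plane intersection: emitter + t * direction lies on the detector plane.
    t = ((np.asarray(detector_origin, float) - emitter) @ detector_normal) / (
        direction @ detector_normal)
    return emitter + t * direction
```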
  • FIG. 16 illustrates an example of operations for performing an accuracy check and/or calibrating a tracked instrument using images of the tracked instrument.
  • the x-ray views of an instrument are obtained as described above.
  • the theoretical position of the instrument tip projected in the views is then calculated. This serves as the initial position estimate of the instrument tip.
  • the theoretical view of the CAD model in each x-ray is then calculated.
  • the optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the CAD view and actual image as shown in the algorithm below.
  • this is the same problem as matching a CT scan to multiple fluoroscopy images in CT-fluoro registration, except in this case a CAD model is used instead of a CT scan to compute a dynamically rendered radiograph (“DRR”).
  • these operations do not rely on a sharp-tipped instrument fitting snugly in a divot, and can be used for accuracy checks of all types of instrument tips.
  • these operations improve accuracy checks for instruments without a sharp or straight tip.
  • these operations allow re-calibration or correction of theoretical instrument tip location based on actual measurements.
  • these operations enable accuracy checks and re-calibration of multiple instruments simultaneously.
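  • The optimization referenced above is not reproduced in this text. The sketch below shows one way such a 2D/3D refinement could be structured, assuming a helper render_silhouette(pose, view_index) that produces the expected instrument silhouette for a candidate pose; that helper, the cost function, and the optimizer choice are hypothetical stand-ins, not the disclosed algorithm.

```python
# Sketch: adjust the CAD-model pose so its rendered silhouette matches the dark
# instrument region detected in each x-ray view (a 2D/3D registration loop).
# render_silhouette is a hypothetical DRR-style renderer supplied by the caller.
import numpy as np
from scipy.optimize import minimize


def refine_pose(initial_pose, xray_masks, render_silhouette):
    """initial_pose: 6-vector (rotation vector + translation) from tracking.
    xray_masks: list of binary images where the instrument appears dark."""

    def cost(pose):
        mismatch = 0.0
        for view_index, mask in enumerate(xray_masks):
            rendered = render_silhouette(pose, view_index)
            mismatch += np.mean(rendered != mask)  # fraction of disagreeing pixels
        return mismatch

    result = minimize(cost, x0=np.asarray(initial_pose, float), method="Nelder-Mead")
    return result.x
```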
  • FIG. 19 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a pair of images taken by an imaging device.
  • while the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a first virtual position of an emitter.
  • the system includes a tracking camera and an imaging device including the emitter and a detector. Determining the first virtual position of the emitter includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a dynamic reference base (“DRB”)) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • the virtual position of the emitter is determined based on predetermined information indicating a position of the emitter relative to the detector and a virtual position of the detector.
  • processing circuitry 3320 determines a first virtual position of a detector.
  • the system includes a tracking camera and an imaging device including the emitter and the detector. Determining the first virtual position of the detector includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a DRB) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • processing circuitry 3320 determines a first virtual position of a tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector.
  • the system includes a tracking camera. Determining the first virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument (e.g., relative to the DRB) based on the image of the reference element coupled to the tracked instrument; and determining the first virtual position of the tracked instrument based on the information about the shape of the tracked instrument and the virtual position of the reference element coupled to the tracked instrument.
  • determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument.
  • processing circuitry 3320 determines a first expected image of the tracked instrument.
  • the first expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the first virtual position of the emitter, the first virtual position of the detector, the first virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • processing circuitry 3320 obtains a first image of the tracked instrument.
  • obtaining the first image of the tracked instrument includes receiving the first image from the imaging device.
  • processing circuitry 3320 rotates the imaging device (including the emitter and the detector).
  • the imaging device includes a C-arm or an O-arm imaging device.
  • processing circuitry 3320 determines a second virtual position of the emitter. In some embodiments, determining the second virtual position of the emitter includes receiving the second virtual position from a tracking system.
  • processing circuitry 3320 determines a second virtual position of the detector. In some embodiments, determining the second virtual position of the detector includes receiving the second virtual position from a tracking system.
  • processing circuitry 3320 determines a second virtual position of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. In some embodiments, determining the second virtual position of the tracked instrument includes receiving the second virtual position from a tracking system.
  • the first virtual position of the tracked instrument is the same as the second virtual position of the tracked instrument.
  • the imaging device can include at least one of a C-arm and an O-arm, and responsive to obtaining the first image, the imaging device can be rotated (block 1935) such that the second virtual position of the emitter is different than the first virtual position of the emitter and the second virtual position of the detector is different than the first virtual position of the detector.
  • the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument.
  • the first virtual position of the emitter is the same as the second virtual position of the emitter.
  • the first virtual position of the detector is the same as the second virtual position of the detector. For example, without rotating the imaging device, an image of the tracked instrument can be taken from a different perspective by moving the tracked instrument.
  • processing circuitry 3320 determines a second expected image of the tracked instrument.
  • the second expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the second virtual position of the emitter, the second virtual position of the detector, the second virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • processing circuitry 3320 obtains a second image of the tracked instrument.
  • obtaining the second image of the tracked instrument includes receiving the second image from the imaging device.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
  • the first expected image, the second expected image, the first image, and the second image each include an image of the tip of the tracked instrument.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold.
  • performing the action includes outputting an indication that the tracked instrument is not suitable for use.
  • performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
  • Various operations of FIG. 19 may be optional. For example, blocks 1935, 1940, 1945, and 1970 may be optional in some embodiments. A simplified sketch of the image comparison underlying these operations follows.
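  • The image comparison described above for FIG. 19 could, for example, score each acquired view against its expected image and flag the instrument when the worst view falls below a similarity threshold. The Dice overlap measure and the 0.9 threshold used below are assumptions chosen only to illustrate the idea.

```python
# Sketch: compare expected and acquired instrument silhouettes per view and
# decide whether the tracked instrument passes the accuracy check.
import numpy as np


def dice_overlap(expected_mask, actual_mask):
    """Dice coefficient between two binary instrument silhouettes."""
    intersection = np.logical_and(expected_mask, actual_mask).sum()
    total = expected_mask.sum() + actual_mask.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total


def instrument_is_accurate(expected_images, actual_images, min_overlap=0.9):
    scores = [dice_overlap(e, a) for e, a in zip(expected_images, actual_images)]
    return min(scores) >= min_overlap, scores
```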
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on comparison of an actual position with an expected position on a display device are described below.
  • a display screen is available to show tracked instruments.
  • the display screen is near the surgical area and is already covered with a sterile drape.
  • the screen may be large (e.g., 22 inches or larger).
  • a reference element can be coupled to the display screen to allow it to be tracked by a navigation camera.
  • a large reference element array can yield improved accuracy of tracking and, in some examples, due to the large physical size, more than four optical markers can be used to improve the fidelity of tracking.
  • when a user brings a navigated instrument near the display screen, its position with respect to the reference element on the display screen is calculated.
  • the theoretical position of the tracked tip of the instrument CAD model is then shown on the display screen.
  • the user can visually compare the physical position of the instrument tip with the position displayed on the screen. With the aid of a virtual measurement tool, the user can then assess the accuracy.
  • FIG. 17 illustrates an example of a display device 1710 displaying a theoretical position (front view 1730 and side view 1740 ) of the tip of a tracked instrument 1750 .
  • the display device 1710 has reference elements 1720 and the tracked instrument 1750 has reference elements 1760 for being tracked by a navigation camera.
  • the front view 1730 of the theoretical position of the tip of the tracked instrument 1750 is shown as a hollow triangle on the right half of the screen.
  • the left half of the screen shows a side view 1740 of the theoretical position of the tip of the tracked instrument 1750 , allowing assessment of theoretical height above the screen of the tracked instrument 1750 .
  • the display device can be used for performing an accuracy check of any shape of tracked instrument tip. Even unconventional tips, such as a hook, can be easily visualized on the screen.
  • the same display screen can be used for an accuracy check of multiple instruments.
  • the screen array is unlikely to be damaged during surgery due to splatter of blood or other smudges, since it is typically much farther from the surgical field compared to tracked instruments.
  • if the surface of the display screen can sense the touch of the instrument tip, the accuracy can be calculated as well, instead of relying on visual assessment.
  • using the display device to perform an accuracy check of a tracked instrument can improve the fidelity of the reference element array used for the accuracy check and the consistency of accuracy checks.
  • using the display device to perform an accuracy check of a tracked instrument can improve the accuracy check workflow for instruments without a sharp, straight tip.
  • using the display device to perform an accuracy check of a tracked instrument can allow the user to visually inspect and assess accuracy. A sketch of mapping the tracked tip into on-screen coordinates for such a display follows.
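  • The sketch below illustrates, under assumed conventions, how a tracked tip position expressed in the display device's coordinate frame could be mapped to front-view pixel coordinates and a side-view height above the glass. The frame layout and the pixel pitch (96 DPI) are assumptions, not values from the disclosure.

```python
# Sketch: map the tracked tip, expressed in the display's coordinate frame
# (x/y in the screen plane, z as height above the glass, all in mm), to
# front-view pixel coordinates plus a side-view height. Pixel pitch assumed.

def tip_to_screen(tip_in_display_frame, pixels_per_mm=96 / 25.4):
    x_mm, y_mm, z_mm = tip_in_display_frame
    front_view_px = (int(round(x_mm * pixels_per_mm)),
                     int(round(y_mm * pixels_per_mm)))
    return front_view_px, z_mm  # pixels for the front view, mm for the side view
```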
  • FIG. 20 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on displaying a virtual position of the tracked instrument on a display device.
  • while the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • processing circuitry 3320 determines a virtual position of a tracked instrument relative to a display device.
  • processing circuitry 3320 displays an indication of the virtual position of the tracked instrument on the display device.
  • the processing circuitry determines an intended shape of the tracked instrument, for example, an accurate and/or undamaged shape of the tracked instrument.
  • Displaying the indication of the virtual position of the tracked instrument includes: displaying on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and displaying on a second part of the display device, a second portion of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
  • processing circuitry 3320 receives an indication of an actual position of the tracked instrument relative to the display device.
  • receiving the actual position of the tracked instrument includes receiving an indication from a user.
  • processing circuitry 3320 determines whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • performing the action includes, responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use.
  • performing the action includes, responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
  • Various operations of FIG. 20 may be optional. For example, block 2050 may be optional in some embodiments. A simplified sketch of the comparison underlying these operations follows.
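  • One way to realize the comparison described above for FIG. 20 is sketched below: the position the user indicates for the physical tip is compared with the displayed virtual position, and the residual offset could be fed back to re-calibrate the tip model. The 2 mm threshold and the function name are illustrative assumptions.

```python
# Sketch: compare the user-indicated actual tip position with the displayed
# virtual tip position (both in mm on the display) and report a correction.
import numpy as np


def assess_and_correct(virtual_tip_xy, indicated_tip_xy, threshold_mm=2.0):
    offset = np.asarray(indicated_tip_xy, float) - np.asarray(virtual_tip_xy, float)
    accurate = float(np.linalg.norm(offset)) <= threshold_mm
    # If not accurate, `offset` could be used to re-calibrate the tracked tip.
    return accurate, offset
```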
  • the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components, or functions, but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions, or groups thereof.
  • the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
  • the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • Embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, microcode, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module,” or variants thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

A system configured to perform an accuracy check of a tracked instrument can include a processing circuitry and memory coupled to the processing circuitry. The memory can include instructions to cause the system to perform operations. The operations can include determining a virtual position of a display device. The operations can further include determining a virtual position of the tracked instrument. The operations can further include determining a point of contact on the display device between the tracked instrument and the display device. The operations can further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument. The operations can further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. Patent Application No. 17/662,666, filed May 10, 2022, which is incorporated herein by reference.
  • FIELD
  • The present disclosure relates to medical devices and systems, and more particularly, to checking accuracy and performing automatic calibration of tracked instruments in camera tracking systems used for computer assisted navigation during surgery.
  • BACKGROUND
  • Surgical operating rooms can contain a diverse range of medical equipment, which can include computer assisted surgical navigation systems, medical imaging devices (e.g., computerized tomography (“CT”) scanners, fluoroscopy imaging, etc.), and surgical robots.
  • A computer assisted surgical navigation system can provide a surgeon with computerized visualization of the present pose of a surgical tool relative to medical images of a patient’s anatomy. Camera tracking systems for computer assisted surgical navigation typically use a set of cameras to track pose of a reference array on a surgical tool, which is being positioned by a surgeon during surgery, relative to a patient reference array (also “dynamic reference base” (“DRB”)) attached to a patient. The reference arrays allow the camera tracking system to determine a pose of the surgical tool relative to anatomical structure imaged by a medical image of the patient and relative to the patient. The surgeon can thereby use real-time visual feedback of the pose to navigate the surgical tool during a surgical procedure on the patient.
  • Surgical navigation of instruments using reference elements has become a well-established technique in the operating room. FIG. 10 illustrates an example of a trackable instrument 1010. The CAD model of an instrument 1010 is associated with a reference element 1020, so that the CAD model can be overlaid on registered images of the patient’s anatomy. To ensure fidelity of the overlay, accuracy of the instrument 1010 needs to be verified prior to use. The accuracy check is typically done by bringing the tip 1040 of the tracked instrument into a divot 1050 associated with another reference element. The divot 1050 is typically a cone-shaped depression ending in an apex.
  • The theoretical position of the tip 1040 is then compared with the theoretical position of the divot 1050. Assuming the user has properly positioned the instrument 1010 in the divot 1050, the distance between the two positions determines the accuracy of the tracked instrument 1010. If the accuracy check does not pass, that instrument 1010 may not be used.
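  • In code, this traditional divot check reduces to a single distance comparison, as in the sketch below; the 2 mm pass threshold is an assumption used only for illustration.

```python
# Sketch: the traditional divot check is the distance between the theoretical
# tip position and the divot apex, both expressed in the camera frame (mm).
import numpy as np


def divot_check(theoretical_tip, divot_apex, threshold_mm=2.0):
    error = float(np.linalg.norm(np.asarray(theoretical_tip, float)
                                 - np.asarray(divot_apex, float)))
    return error, error <= threshold_mm
```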
  • In some examples, a source of inaccuracy during the accuracy check arises because it is challenging for a user to place an instrument accurately in the divot. The ideal position for a sharp instrument is along the normal from the apex to the base of the cone of the divot. Any deviation from this angle introduces small errors. Furthermore, a bad-acting user may move the position of the instrument to produce a false accuracy number (one that appears more accurate).
  • In additional or alternative examples, a source of inaccuracy during the accuracy check arises due to inaccuracy in tracking of the two reference elements (one associated with the tracked instrument and one associated with the divot). The reference element arrays are typically small in size (e.g., only a few centimeters wide) to minimize obstruction of the surgical area. The number of markers is also usually limited to optimize costs and workflow. A larger array with more markers can improve the accuracy of the divot position.
  • In additional or alternative examples, a source of inaccuracy during the accuracy check arises due to a shape of the instrument tip. Blunt tip instruments may not fit well inside the divot and instruments with angled tips or a hook shape can make it even more difficult to properly place the instrument tip in the divot.
  • In additional or alternative examples, a source of inaccuracy during the accuracy check is a deformed instrument. In additional or alternative examples, the source of inaccuracy is a deformed reference element. Note that a slight angular shift in the reference element can result in a very small error for tracking of the reference element, but may result in a much larger error at the instrument tip. In additional or alternative examples, sources of inaccuracy include inaccuracies in optical markers due to manufacturing defects, smudges, or inaccurate mounting of optical markers on mounting posts. All of these are solvable problems, though. If an instrument can be calibrated at the time of use, the fidelity of tracking can be improved so that the physical tip matches the estimated tip.
  • SUMMARY
  • Some embodiments of the present disclosure are directed to performing an accuracy check and calibrating tracked instruments used in surgical procedures.
  • In some embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a virtual position within a virtual space of a display device. The operations further include determining a virtual position within the virtual space of the tracked instrument. The operations further include determining a point of contact on the display device between the tracked instrument and the display device. The operations further include determining an expected point of contact on the display device between the tracked instrument and the display device based on the virtual position of the display device and the virtual position of the tracked instrument. The operations further include determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • In other embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a first virtual position within a virtual space of an emitter of an imaging device. The operations further include determining a first virtual position within the virtual space of a detector of the imaging device. The operations further include determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector. The operations further include determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument. The operations further include obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector. The operations further include determining a second virtual position within the virtual space of the emitter of the imaging device. The operations further include determining a second virtual position within the virtual space of the detector of the imaging device. The operations further include determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. The operations further include determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument. The operations further include obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image. The operations further include determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
  • In other embodiments, a system configured to perform an accuracy check of a tracked instrument is provided. The system includes processing circuitry and memory coupled to the processing circuitry. The memory has instructions stored therein that are executable by the processing circuitry to cause the system to perform operations. The operations include determining a virtual position within a virtual space of the tracked instrument relative to a display device. The operations further include displaying an indication of the virtual position of the tracked instrument on the display device. The operations further include receiving an indication of an actual position of the tracked instrument relative to the display device. The operations further include determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • Other systems and corresponding methods and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional camera tracking systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
  • DESCRIPTION OF THE DRAWINGS
  • Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets during a surgical procedure in a surgical room that includes a camera tracking system for navigated surgery and which may further include a surgical robot for robotic assistance according to some embodiments;
  • FIG. 2 illustrates the camera tracking system and the surgical robot positioned relative to a patient according to some embodiments;
  • FIG. 3 further illustrates the camera tracking system and the surgical robot configured according to some embodiments;
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset, a computer platform, imaging devices, and a surgical robot which are configured to operate according to some embodiments;
  • FIG. 5 illustrates a patient reference array (“DRB”) and a surveillance marker;
  • FIGS. 6A-C respectively illustrate a surgical robot with an end-effector, an expanded view of the end-effector, and a surgical tool in accordance with some embodiments;
  • FIGS. 7A-B are schematic diagrams illustrating examples of imaging devices according to some embodiments;
  • FIG. 8 is a block diagram illustrating an example of an imaging system according to some embodiments;
  • FIG. 9 is a block diagram illustrating an example of an accuracy and calibration module according to some embodiments;
  • FIG. 10 is a schematic diagram illustrating an example of a tracked instrument according to some embodiments;
  • FIG. 11 is a schematic diagram illustrating an example of a set of display devices configured to interact with a tracked instrument according to some embodiments;
  • FIG. 12 is a schematic diagram illustrating an example of the set of display devices of FIG. 11 being contacted by a tracked instrument according to some embodiments;
  • FIG. 13 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on contact with a display device according to some embodiments;
  • FIG. 14 is a schematic diagram illustrating an example of a C-arm imaging device according to some embodiments;
  • FIGS. 15A-B are schematic diagrams illustrating images taken of a tracked instrument using the C-arm imaging device at two different positions according to some embodiments;
  • FIG. 16 is a flow chart illustrating an example of operations for performing an accuracy check on a tracked instrument based on images taken of the tracked instrument according to some embodiments;
  • FIG. 17 is a schematic diagram of a display device configured to show an expected position of a tracked instrument according to some embodiments; and
  • FIGS. 18-20 are flowcharts of operations performed by a system to perform an accuracy check of tracked instruments according to some embodiments.
  • DETAILED DESCRIPTION
  • It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
  • The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.
  • Various embodiments of the present disclosure are directed to providing operations by the camera tracking system to improve registration of candidate markers, such as a surveillance marker, when phantom markers appear in frames of tracking data from tracking cameras. Before describing these embodiments in detail, various components that may be used for performing embodiments in a navigated surgery system are described with reference to FIGS. 1-9.
  • FIG. 1 is an overhead view of personnel wearing extended reality (“XR”) headsets 150 during a surgical procedure in a surgical room that includes a camera tracking system 200 for navigated surgery during a surgical procedure and which may further include a surgical robot 100 for robotic assistance, according to some embodiments. FIG. 2 illustrates the camera tracking system 200 and the surgical robot 100 positioned relative to a patient, according to some embodiments. FIG. 3 further illustrates the camera tracking system 200 and the surgical robot 100 configured according to some embodiments. FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and the surgical robot 100 which are configured to operate according to some embodiments. FIG. 5 illustrates a patient reference array 116 (also “dynamic reference base” (DRB)) and a surveillance marker 500.
  • The XR headset 150 may be configured to augment a real-world scene with computer generated XR images. The XR headset 150 may be configured to provide an augmented reality (“AR”) viewing environment by displaying the computer generated XR images on a see-through display screen that allows light from the real-world scene to pass therethrough for combined viewing by the user. Alternatively, the XR headset 150 may be configured to provide a virtual reality (“VR”) viewing environment by preventing or substantially preventing light from the real-world scene from being directly viewed by the user while the user is viewing the computer-generated XR images on a display screen. The XR headset 150 can be configured to provide both AR and VR viewing environments. Thus, the term XR headset can refer to an AR headset or a VR headset.
  • Referring to FIGS. 1-5 , the surgical robot 100 may include, for example, one or more robot arms 104, a display 110, an end-effector 112, for example, including a guide tube 114, and an end effector reference array which can include one or more tracking markers. A patient reference array 116 (“DRB”) has a plurality of tracking markers 117 and is secured directly to the patient 210 (e.g., to a bone of the patient 210). A spaced apart surveillance marker 500 (FIG. 5 ) has a single marker 502 connected to a shaft that is secured directly to the patient 210 at a spaced apart location from the patient reference array 116. Another reference array 170 is attached or formed on an instrument, surgical tool, surgical implant device, etc.
  • The camera tracking system 200 includes tracking cameras 204 which may be spaced apart stereo cameras configured with partially overlapping field-of-views. The camera tracking system 200 can have any suitable configuration of arm(s) 202 to move, orient, and support the tracking cameras 204 in a desired location, and may contain at least one processor operable to track location of an individual marker and pose of an array of markers. As used herein, the term “pose” refers to the location (e.g., along 3 orthogonal axes) and/or the rotation angle (e.g., about the 3 orthogonal axes) of markers (e.g., DRB) relative to another marker (e.g., surveillance marker) and/or to a defined coordinate system (e.g., camera coordinate system). A pose may therefore be defined based on only the multidimensional location of the markers relative to another marker and/or relative to the defined coordinate system, based on only the multidimensional rotational angles of the markers relative to the other marker and/or to the defined coordinate system, or based on a combination of the multidimensional location and the multidimensional rotational angles. The term “pose” therefore is used to refer to location, rotational angle, or combination thereof.
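  • As an illustration of this definition, a pose combining a location and a rotation is commonly represented as a 4x4 homogeneous transform, so poses expressed relative to different references compose by matrix multiplication. The sketch below is generic and not specific to the disclosed system; the frame names and numeric values are invented for the example.

```python
# Sketch: represent a pose (rotation plus translation) as a 4x4 homogeneous
# transform; composing transforms converts a pose between reference frames.
import numpy as np
from scipy.spatial.transform import Rotation


def make_pose(euler_deg_xyz, translation_mm):
    pose = np.eye(4)
    pose[:3, :3] = Rotation.from_euler("xyz", euler_deg_xyz, degrees=True).as_matrix()
    pose[:3, 3] = translation_mm
    return pose


# Instrument pose in the camera frame, composed with the camera-to-DRB
# transform, yields the instrument pose relative to the DRB.
camera_T_instrument = make_pose([0, 0, 10], [100.0, 50.0, 400.0])
drb_T_camera = np.linalg.inv(make_pose([0, 5, 0], [200.0, 0.0, 350.0]))
drb_T_instrument = drb_T_camera @ camera_T_instrument
```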
  • The tracking cameras 204 may include, e.g., infrared cameras (e.g., bifocal or stereophotogrammetric cameras), operable to identify, for example, active and passive tracking markers for single markers (e.g., surveillance marker 500) and reference arrays which can be formed on or attached to the patient 210 (e.g., patient reference array, DRB), end effector 112 (e.g., end effector reference array), XR headset(s) 150 worn by a surgeon 120 and/or a surgical assistant 126, etc. in a given measurement volume of a camera coordinate system while viewable from the perspective of the tracking cameras 204. The tracking cameras 204 may scan the given measurement volume and detect light that is emitted or reflected from the markers in order to identify and determine locations of individual markers and poses of the reference arrays in three-dimensions. For example, active reference arrays may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (“LEDs”)), and passive reference arrays may include retro-reflective markers that reflect infrared light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the tracking cameras 204 or other suitable device.
  • The XR headsets 150 may each include tracking cameras (e.g., spaced apart stereo cameras) that can track location of a surveillance marker and poses of reference arrays within the XR camera headset field-of-views (“FOVs”) 152 and 154, respectively. Accordingly, as illustrated in FIG. 1 , the location of the surveillance marker and the poses of reference arrays on various objects can be tracked while in the FOVs 152 and 154 of the XR headsets 150 and/or a FOV 600 of the tracking cameras 204.
  • FIGS. 1-2 illustrate a potential configuration for the placement of the camera tracking system 200 and the surgical robot 100 in an operating room environment. Computer-aided navigated surgery can be provided by the camera tracking system controlling the XR headsets 150 and/or other displays 34, 36, and 110 to display surgical procedure navigation information. The surgical robot 100 is optional during computer-aided navigated surgery.
  • The camera tracking system 200 may operate using tracking information and other information provided by multiple XR headsets 150 such as inertial tracking information and optical tracking information (frames of tracking data). The XR headsets 150 operate to display visual information and may play-out audio information to the wearer. This information can be from local sources (e.g., the surgical robot 100 and/or other medical equipment), remote sources (e.g., a patient medical image server), and/or other electronic equipment. The camera tracking system 200 may track markers in 6 degrees-of-freedom (“6DOF”) relative to three axes of a 3D coordinate system and rotational angles about each axis. The XR headsets 150 may also operate to track hand poses and gestures to enable gesture-based interactions with “virtual” buttons and interfaces displayed through the XR headsets 150 and can also interpret hand or finger pointing or gesturing as various defined commands. Additionally, the XR headsets 150 may have a 1-10x magnification digital color camera sensor called a digital loupe. In some embodiments, one or more of the XR headsets 150 are minimalistic XR headsets that display local or remote information but include fewer sensors and are therefore more lightweight.
  • An “outside-in” machine vision navigation bar supports the tracking cameras 204 and may include a color camera. The machine vision navigation bar generally has a more stable view of the environment because it does not move as often or as quickly as the XR headsets 150 while positioned on wearers’ heads. The patient reference array 116 (DRB) is generally rigidly attached to the patient with stable pitch and roll relative to gravity. This local rigid patient reference 116 can serve as a common reference for reference frames relative to other tracked arrays, such as a reference array on the end effector 112, instrument reference array 170, and reference arrays on the XR headsets 150.
  • During a surgical procedure using surgical navigation, the surveillance marker 500 is affixed to the patient to provide information on whether the patient reference array 116 has shifted. For example, during a spinal fusion procedure with planned placement of pedicle screw fixation, two small incisions are made over the posterior superior iliac spine bilaterally. The DRB and the surveillance marker are then affixed to the posterior superior iliac spine bilaterally. If the location of the surveillance marker 500 changes relative to the patient reference array 116, the camera tracking system 200 may display a meter indicating the amount of movement and/or may display a pop-up warning message to inform the user that the patient reference array may have been bumped. If the patient reference array has indeed been bumped, the registration of the patient reference array to the tracked coordinate system may be invalid and could result in erroneous navigation which is off target.
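  • The movement check described here can be as simple as monitoring the surveillance marker's position relative to the DRB, as sketched below; the 1 mm warning threshold is an assumption for illustration only.

```python
# Sketch: report how far the surveillance marker has moved relative to the DRB
# since registration, and whether that shift should trigger a warning.
import numpy as np


def surveillance_shift(marker_in_drb_now, marker_in_drb_at_registration,
                       warn_threshold_mm=1.0):
    shift = float(np.linalg.norm(np.asarray(marker_in_drb_now, float)
                                 - np.asarray(marker_in_drb_at_registration, float)))
    return shift, shift > warn_threshold_mm
```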
  • When present, the surgical robot (also “robot”) may be positioned near or next to patient 210. The robot 100 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the surgical procedure. The camera tracking system 200 may be separated from the robot system 100 and positioned at the foot of patient 210. This location allows the tracking camera 200 to have a direct visual line of sight to the surgical area 208. In the configuration shown, the surgeon 120 may be positioned across from the robot 100, but is still able to manipulate the end-effector 112 and the display 110. A surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110. If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. An anesthesiologist 122, nurse or scrub tech can operate equipment which may be connected to display information from the camera tracking system 200 on a display 34.
  • With respect to the other components of the robot 100, the display 110 can be attached to the surgical robot 100 or in a remote location. End-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor. In some embodiments, end-effector 112 can comprise a guide tube 114, which is configured to receive and orient a surgical instrument, tool, or implant used to perform a surgical procedure on the patient 210.
  • As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” The term “instrument” is used in a nonlimiting manner and can be used interchangeably with “tool” and “implant” to generally refer to any type of device that can be used during a surgical procedure in accordance with embodiments disclosed herein. Example instruments, tools, and implants include, without limitation, drills, screwdrivers, saws, dilators, retractors, probes, implant inserters, and implant devices such as screws, spacers, interbody fusion devices, plates, rods, etc. Although generally shown with a guide tube 114, it will be appreciated that the end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument in a desired manner.
  • The surgical robot 100 is operable to control the translation and orientation of the end-effector 112. The robot 100 may move the end-effector 112 under computer control along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis, and a Z Frame axis, such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively computer controlled. In some embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that utilize, for example, a 6DOF robot arm comprising only rotational axes. For example, the surgical robot 100 may be used to operate on patient 210, and robot arm 104 can be positioned above the body of patient 210, with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210.
  • In some example embodiments, the XR headsets 150 can be controlled to dynamically display an updated graphical indication of the pose of the surgical instrument so that the user can be aware of the pose of the surgical instrument at all times during the procedure.
  • In some further embodiments, surgical robot 100 can be operable to correct the path of a surgical instrument guided by the robot arm 104 if the surgical instrument strays from the selected, preplanned trajectory. The surgical robot 100 can be operable to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument. Thus, in use, a surgeon or other user can use the surgical robot 100 as part of computer assisted navigated surgery, and has the option to stop, modify, or manually control the autonomous or semi-autonomous movement of the end-effector 112 and/or the surgical instrument.
  • Reference arrays of markers can be formed on or connected to robot arms 102 and/or 104, the end-effector 112 (e.g., end-effector array 114 in FIG. 2 ), and/or a surgical instrument (e.g., instrument array 170) to track poses in 6DOF along 3 orthogonal axes and rotation about the axes. The reference arrays enable each of the marked objects (e.g., the end-effector 112, the patient 210, and the surgical instruments) to be tracked by the tracking camera 200, and the tracked poses can be used to provide navigated guidance during a surgical procedure and/or used to control movement of the surgical robot 100 for guiding the end-effector 112 and/or an instrument manipulated by the end-effector 112.
  • Referring to FIG. 3, the surgical robot 100 may include a display 110, upper arm 102, lower arm 104, end-effector 112, vertical column 312, casters 314, a table 318, and ring 324 which uses lights to indicate statuses and other information. Cabinet 106 may house electrical components of surgical robot 100 including, but not limited to, a battery, a power distribution module, a platform interface board module, and a computer. The camera tracking system 200 may include a display 36, tracking cameras 204, arm(s) 202, a computer housed in cabinet 330, and other components.
  • In computer-assisted navigated surgeries, perpendicular 2D scan slices, such as axial, sagittal, and/or coronal views, of patient anatomical structure are displayed to enable user visualization of the patient’s anatomy alongside the relative poses of surgical instruments. An XR headset or other display can be controlled to display one or more 2D scan slices of patient anatomy along with a 3D graphical model of anatomy. The 3D graphical model may be generated from a 3D scan of the patient, e.g., by a CT scan device, and/or may be generated based on a baseline model of anatomy which isn’t necessarily formed from a scan of the patient.
  • Example Surgical System
  • FIG. 4 illustrates a block diagram of a surgical system that includes an XR headset 150, a computer platform 400, imaging devices 420, and a surgical robot 100 which are configured to operate according to some embodiments.
  • The imaging devices 420 may include a C-arm imaging device, an O-arm imaging device, and/or a patient image database. The XR headset 150 provides an improved human interface for performing navigated surgical procedures. The XR headset 150 can be configured to provide functionalities, e.g., via the computer platform 400, that include without limitation any one or more of: identification of hand gesture based commands, display XR graphical objects on a display device 438 of the XR headset 150 and/or another display device. The display device 438 may include a video projector, flat panel display, etc. The user may view the XR graphical objects as an overlay anchored to particular real-world objects viewed through a see-through display screen. The XR headset 150 may additionally or alternatively be configured to display on the display device 438 video streams from cameras mounted to one or more XR headsets 150 and other cameras.
  • Electrical components of the XR headset 150 can include a plurality of cameras 430, a microphone 432, a gesture sensor 434, a pose sensor (e.g., inertial measurement unit (“IMU”)) 436, the display device 438, and a wireless/wired communication interface 440. The cameras 430 of the XR headset 150 may be visible light capturing cameras, near infrared capturing cameras, or a combination of both.
  • The cameras 430 may be configured to operate as the gesture sensor 434 by tracking for identification user hand gestures performed within the field of view of the camera(s) 430. Alternatively, the gesture sensor 434 may be a proximity sensor and/or a touch sensor that senses hand gestures performed proximately to the gesture sensor 434 and/or senses physical contact, e.g., tapping on the sensor 434 or its enclosure. The pose sensor 436, e.g., IMU, may include a multi-axis accelerometer, a tilt sensor, and/or another sensor that can sense rotation and/or acceleration of the XR headset 150 along one or more defined coordinate axes. Some or all of these electrical components may be contained in a head-worn component enclosure or may be contained in another enclosure configured to be worn elsewhere, such as on the hip or shoulder.
  • As explained above, a surgical system includes the camera tracking system 200 which may be connected to a computer platform 400 for operational processing and which may provide other operational functionality including a navigation controller 404 and/or an XR headset controller 410. The surgical system may include the surgical robot 100. The navigation controller 404 can be configured to provide visual navigation guidance to an operator for moving and positioning a surgical tool relative to patient anatomical structure based on a surgical plan, e.g., from a surgical planning function, defining where a surgical procedure is to be performed using the surgical tool on the anatomical structure and based on a pose of the anatomical structure determined by the camera tracking system 200. The navigation controller 404 may be further configured to generate steering information based on a target pose for a surgical tool, a pose of the anatomical structure, and a pose of the surgical tool and/or an end effector of the surgical robot 100, where the steering information is displayed through the display device 438 of the XR headset 150 and/or another display device to indicate where the surgical tool and/or the end effector of the surgical robot 100 should be moved to perform the surgical plan.
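  • As a purely illustrative aside (not part of the disclosed system), the steering computation described above can be sketched in a few lines of Python. The function below computes a translational offset and an axis-angle rotational offset between a tracked tool pose and a planned target pose, the kind of quantities a navigation controller might display as guidance; the names and the 4x4 homogeneous-transform convention are assumptions.

    import numpy as np

    def steering_offsets(T_target, T_tool):
        """Translation error (mm) and rotation error (degrees) between a planned
        target pose and the current tool pose. Both poses are 4x4 homogeneous
        transforms expressed in the same frame (e.g., the DRB frame)."""
        # Translational offset from the current tool origin to the target origin.
        dt = T_target[:3, 3] - T_tool[:3, 3]
        # Relative rotation needed to align the tool axes with the target axes.
        R_err = T_target[:3, :3] @ T_tool[:3, :3].T
        cos_angle = np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0)
        return dt, float(np.degrees(np.arccos(cos_angle)))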
  • The electrical components of the XR headset 150 can be operatively connected to the electrical components of the computer platform 400 through the wired/wireless interface 440. The electrical components of the XR headset 150 may be operatively connected, e.g., through the computer platform 400 or directly connected, to various imaging devices 420, e.g., the C-arm imaging device, the O-arm imaging device, the patient image database, and/or to other medical equipment through the wired/wireless interface 440.
  • The surgical system may include an XR headset controller 410 that may at least partially reside in the XR headset 150, the computer platform 400, and/or in another system component connected via wired cables and/or wireless communication links. Various functionality is provided by software executed by the XR headset controller 410. The XR headset controller 410 is configured to receive information from the camera tracking system 200 and the navigation controller 404, and to generate an XR image based on the information for display on the display device 438.
  • The XR headset controller 410 can be configured to operationally process frames of tracking data from the cameras 430 (tracking cameras), signals from the microphone 432, and/or information from the pose sensor 436 and the gesture sensor 434, to generate information for display as XR images on the display device 438 and/or for display on other display devices for user viewing. Thus, the XR headset controller 410 illustrated as a circuit block within the XR headset 150 is to be understood as being operationally connected to other illustrated components of the XR headset 150 but not necessarily residing within a common housing or being otherwise transportable by the user. For example, the XR headset controller 410 may reside within the computer platform 400 which, in turn, may reside within the cabinet 330 of the camera tracking system 200, the cabinet 106 of the surgical robot 100, etc.
  • Turning now to FIGS. 6A-6C, the surgical robot system 100 relies on accurate positioning of the end-effector 112, surgical instruments 608, and/or the patient 210 (e.g., patient reference array 116) relative to the desired surgical area. In the embodiments shown in FIGS. 6A-6C, the reference arrays include tracking markers 118, 804 which are rigidly attached to a portion of the instrument 608 and/or end-effector 112.
  • FIG. 6A depicts part of the surgical robot system 100 with the robot 102 including base 106, robot arm 104, and end-effector 112. The other elements, not illustrated, such as the display, marker tracking cameras, etc. may also be present as described herein. FIG. 6B depicts a close-up view of the end-effector 112 with guide tube 114 and a reference array that includes a plurality of tracking markers 118 rigidly affixed to the end-effector 112. In this embodiment, the plurality of tracking markers 118 are attached to the end-effector 112 configured as a guide tube. FIG. 6C depicts an instrument 608 (in this case, a probe) with a plurality of tracking markers 804 rigidly affixed to the instrument 608. As described elsewhere herein, the instrument 608 could include any suitable surgical instrument, such as, but not limited to, guide wire, cannula, a retractor, a drill, a reamer, a screwdriver, an insertion instrument, a removal instrument, or the like.
  • In FIG. 6C, the reference array 612 functions as the handle 620 of the instrument 608. Four markers 804 are attached to the handle 620 in a manner that is out of the way of the shaft 622 and tip 624. Stereophotogrammetric tracking by the tracking camera 200 of these four markers 804 allows the instrument 608 to be tracked as a rigid body and for the system 100 to precisely determine the location of the tip 624 and the orientation of the shaft 622 while the instrument 608 is moved within view of tracking camera 200.
  • To enable automatic tracking of one or more instruments 608, end-effector 112, or other object to be tracked in 3D (e.g., multiple rigid bodies), the markers 118, 804 on each instrument 608, end-effector 112, or the like, may be arranged asymmetrically with a known inter-marker spacing. The reason for asymmetric alignment is so that it is unambiguous which marker 118, 804 corresponds to a particular pose on the rigid body and whether markers 118, 804 are being viewed from the front or back, i.e., mirrored. For example, if the markers 118, 804 were arranged in a square on the instrument 608 or end-effector 112, it would be unclear to the system 100, 300, 600 which marker 118, 804 corresponded to which corner of the square. For example, for the instrument 608, it would be unclear which marker 804 was closest to the shaft 622. Thus, it would be unknown which way the shaft 622 was extending from the array 612. Accordingly, each array 612 and thus each instrument 608, end-effector 112, or other object to be tracked should have a unique marker pattern to allow it to be distinguished from other instruments 608 or other objects being tracked.
  • Asymmetry and unique marker patterns allow the tracking camera 200 and system 100 to detect individual markers 118, 804 and then to check the marker spacing against a stored template to determine which instrument 608, end-effector 112, or other object they represent. Detected markers 118, 804 can then be sorted automatically and assigned to each tracked object in the correct order. Without this information, rigid body calculations could not be performed to extract key geometric information, such as the position of the instrument tip 624 and the alignment of the shaft 622, unless the user manually specified which detected marker 118, 804 corresponded to which position on each rigid body.
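  • For illustration only, the template check described above might be realized by comparing the sorted pairwise distances of the detected markers against stored distance signatures for each known array, as in the Python sketch below. The template dictionary, tolerance, and function names are hypothetical, and the sketch only identifies which array is in view; assigning individual markers to positions on the rigid body would be a further step.

    import numpy as np
    from itertools import combinations

    def distance_signature(points):
        """Sorted pairwise distances (mm) between markers in an Nx3 array."""
        pts = np.asarray(points, dtype=float)
        return np.sort([np.linalg.norm(a - b) for a, b in combinations(pts, 2)])

    def identify_array(detected_markers, templates, tol_mm=1.0):
        """Return the name of the stored template whose inter-marker spacing best
        matches the detected markers, or None if nothing matches closely enough."""
        sig = distance_signature(detected_markers)
        best_name, best_err = None, np.inf
        for name, template_points in templates.items():
            ref = distance_signature(template_points)
            if len(ref) != len(sig):
                continue  # different marker count, cannot be this array
            err = float(np.max(np.abs(ref - sig)))
            if err < best_err:
                best_name, best_err = name, err
        return best_name if best_err <= tol_mm else None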
  • FIGS. 7A-B illustrate medical imaging systems 1304 that may be used in conjunction with robot system 100 and/or navigation systems to acquire pre-operative, intra-operative, post-operative, and/or real-time image data of patient 210. Any appropriate subject matter may be imaged for any appropriate procedure using the imaging system 1304. The imaging system 1304 may be any imaging device such as a C-arm 1308 device, an O-arm 1306 device, a fluoroscopy imaging device, a magnetic resonance imaging scanner, etc. It may be desirable to take x-rays of patient 210 from a number of different positions, without the need for frequent manual repositioning of patient 210 which may be required in an x-ray system. As illustrated in FIG. 7A, the imaging system 1304 may be in the form of a C-arm 1308 that includes an elongated C-shaped member terminating in opposing distal ends 1312 of the “C” shape. The C-shaped member 1130 may further comprise an x-ray source 1314 and an image receptor 1316. The space within the C-arm 1308 may provide room for the physician to attend to the patient substantially free of interference from x-ray support structure 1318. As illustrated in FIG. 7B, the imaging system 1304 may include an O-arm imaging device 1306 having a gantry housing 1324 attached to an imaging device support structure 1328, such as a wheeled mobile cart 1330 with wheels 1332, which may enclose an image capturing portion, not illustrated. The image capturing portion may include an x-ray source and/or emission portion and an x-ray receiving and/or image receiving portion, which may be disposed about one hundred and eighty degrees from each other and mounted on a rotor (not illustrated) relative to a track of the image capturing portion. The image capturing portion may be operable to rotate three hundred and sixty degrees during image acquisition. The image capturing portion may rotate around a central point and/or axis, allowing image data of patient 210 to be acquired from multiple directions or in multiple planes. Although certain imaging systems 1304 are exemplified herein, it will be appreciated that any suitable imaging system may be selected by one of ordinary skill in the art.
  • FIG. 8 illustrates a block diagram of components of a medical imaging system configured in accordance with some embodiments of the present disclosure. The medical imaging system includes a controller 3200, an imaging arm 3240 (e.g., a C-arm or an O-arm), and a linear actuator and/or rotary actuator 3250 connected to an X-ray beam emitter or collector 3260. The controller 3200 includes an image processor 3210, a general processor 3220, and an I/O interface 3230. The image processor 3210 performs image processing to combine sets of images to generate a three-dimensional image of the scanned volume. The general processor 3220 is used to perform various embodiments of the present disclosure. The I/O interface 3230 communicatively couples the controller 3200 to other components of the medical imaging system. The imaging arm 3240 includes motors 3245 used to move the collector and emitter along an arc, e.g., three hundred and sixty degrees, during image acquisition. The motors 3245 are controlled by the controller 3200. The controller 3200 can also control movement of the linear actuator and/or rotary actuator 3250.
  • FIG. 9 illustrates an example of an accuracy and calibration module 3300. The accuracy and calibration module 3300 can include an interface 3310, processing circuitry 3320, and a memory 3330. In some examples, the accuracy and calibration module is part of a system (e.g., an imaging system or a camera tracking system). The memory 3330 can include instructions stored therein that are executable by the processing circuitry 3320 to perform operations according to some embodiments herein.
  • Embodiments that include performing an accuracy check and/or calibrating of a tracked instrument based on contact with a touch sensor (e.g., a touchscreen of a display device) are described below.
  • In some embodiments, multiple points of contact (e.g., touch positions from the tip of a tracked instrument) can be detected by one or more touchpads that are themselves tracked by a navigation camera. The instruments and the touchpads can each have associated reference elements that are tracked by the navigation camera. In some examples, the touchpads are sensitive to pressure, capacitance, or resistance.
  • FIG. 11 illustrates an example of a set of touchpads 1110 coupled together to create an opening for accepting a tip of the tracked instrument. The associated reference element 1120 is coupled to the touchpads. In this example, the touchpads and reference arrays are securely housed in a supporting structure 1130 to reduce movement.
  • The touchpads 1110 can capture the locations of pressure points. Resistive touchpads are especially useful, since they do not rely on the capacitance of the object. When an instrument is brought into the wedge, it touches at least two points on the touchpads 1110. The touchpads 1110 then send the locations of the sensed points to the system. The system also receives the position and pose of the touchpads and instruments via their associated reference elements 1120. Thus, the system can calculate the theoretical position of the tip of the instrument under test. It can then compare the tip location to the locations reported by the three touchpads 1110.
  • Typically, the bottom touchpad would report the position of a sharp or semi-sharp instrument tip. For a broader instrument, such as an osteotome, there will be multiple touch points on the bottom touchpad while the side touchpads will report straight lines of touch points. The approximate position of the CAD model with respect to the touchpads is already known to the system based on the tracking information reported by the camera. Thus, the accuracy of the physical model can be calculated.
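  • A minimal sketch of the comparison described in the two preceding paragraphs follows. It assumes the camera reports 4x4 poses for the touchpad structure's reference element 1120 and for the instrument's reference element, and that fixed transforms from each reference element to the touchpad surface and to the CAD tip are known from design data; all function and variable names are hypothetical.

    import numpy as np

    def expected_tip_in_pad_frame(T_cam_pad_ref, T_pad_ref_pad,
                                  T_cam_inst_ref, tip_in_inst_ref):
        """Theoretical instrument tip location expressed in the touchpad frame.
        T_* are 4x4 homogeneous transforms; tip_in_inst_ref is the CAD tip
        position (3-vector) relative to the instrument's reference element."""
        T_cam_pad = T_cam_pad_ref @ T_pad_ref_pad            # camera -> pad frame
        tip_cam = T_cam_inst_ref @ np.append(tip_in_inst_ref, 1.0)
        tip_pad = np.linalg.inv(T_cam_pad) @ tip_cam         # express tip in pad frame
        return tip_pad[:3]

    def tip_error_mm(expected_tip_pad, reported_touch_points_pad):
        """Distance from the theoretical tip to the closest reported touch point."""
        pts = np.asarray(reported_touch_points_pad, dtype=float)
        return float(np.min(np.linalg.norm(pts - expected_tip_pad, axis=1)))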
  • FIG. 12 illustrates an example of a tip of a tracked instrument 1240 contacting the touchpads 1110. The wedge shape of the opening between the touchpads 1110 allows an accuracy check of instruments with tips that are too big to fit in a typical divot used in navigation arrays.
  • FIG. 13 illustrates an example of operations to perform an accuracy check and calibrate a tracked instrument based on contact between the tracked instrument and the touchpads. To calibrate an instrument, the reported touchpad points are compared against the theoretical model. First, the user touches the instrument tip to all three touchpads in a way that the reference elements of both the instrument and the touchpad structure are visible to the tracking camera. The theoretical position of the instrument tip with respect to the touchpads is then calculated. This serves as the initial position estimate of the instrument tip. Since the relative position of the three touchpads is known, the theoretical touchpoints of the CAD model for each touchpad are then calculated. The optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the theoretical touchpoints and the actual ones, as shown in the algorithm below.
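  • The algorithm referenced above appears in the figure and is not reproduced here. As a separate, purely illustrative sketch of how such a touchpoint-matching optimization could be set up, the Python fragment below perturbs the tracked instrument pose with a small rotation and translation and minimizes the mismatch between the CAD touchpoints and the measured ones; the parameterization and names are assumptions, not the disclosed algorithm.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def refine_instrument_pose(cad_touchpoints, measured_touchpoints, T_initial):
        """Find a small corrective pose so that the CAD touchpoints line up with the
        measured touchpad points. cad_touchpoints (Nx3) are design touch locations in
        the instrument (CAD) frame, measured_touchpoints (Nx3, same ordering) are the
        points reported by the touchpads in the touchpad frame, and T_initial is the
        tracked 4x4 instrument-to-touchpad transform (the initial estimate)."""
        cad = np.asarray(cad_touchpoints, dtype=float)
        meas = np.asarray(measured_touchpoints, dtype=float)

        def residuals(p):
            # p = [rx, ry, rz, tx, ty, tz]: corrective rotation vector + translation.
            T_corr = np.eye(4)
            T_corr[:3, :3] = Rotation.from_rotvec(p[:3]).as_matrix()
            T_corr[:3, 3] = p[3:]
            T = T_corr @ T_initial
            moved = (T[:3, :3] @ cad.T).T + T[:3, 3]
            return (moved - meas).ravel()

        sol = least_squares(residuals, x0=np.zeros(6))
        return sol.x  # corrective pose parameters minimizing the touchpoint mismatch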
  • In some embodiments, these operations improve accuracy checks for instruments without a sharp tip or instruments that are too wide to fit in a traditional divot. In additional or alternative embodiments, these operations allow re-calibration or correction of the theoretical instrument tip location based on actual measurements.
  • FIG. 18 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a point of contact between the tracked instrument and a touch sensor. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 1810, processing circuitry 3320 determines a virtual position of the touch sensor. In some examples, the term virtual position is used herein to describe a virtual location and a virtual pose of an object. In some embodiments, the system includes a camera. Determining the virtual position of the touch sensor includes: determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor; capturing, via the camera, an image of the reference element coupled to the touch sensor; determining a virtual position of the reference element coupled to the touch sensor relative to a dynamic reference base (“DRB”) based on the image of the reference element coupled to the touch sensor; and determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor.
  • At block 1820, processing circuitry 3320 determines a virtual position of the tracked instrument. In some embodiments, the virtual position of the touch sensor and the virtual position of the tracked instrument are within the same virtual space (e.g., relative to a common reference point).
  • In additional or alternative embodiments, the system includes a camera, and determining the virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument; and determining the virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument.
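  • Blocks 1810 and 1820 both reduce to composing a tracked reference-element pose with a fixed, design-derived transform, so that the touch sensor and the tracked instrument end up expressed in the same DRB frame. A minimal sketch of that composition is shown below; the transform names are assumptions made for illustration.

    import numpy as np

    def pose_in_drb(T_drb_reference_element, T_reference_element_to_object):
        """Compose the tracked 4x4 pose of a reference element (in the DRB frame)
        with the fixed transform from that reference element to the object
        (touch sensor or instrument) to obtain the object's virtual position."""
        return T_drb_reference_element @ T_reference_element_to_object

    # Hypothetical usage: both results live in the same DRB frame, so an expected
    # contact point can be computed directly between the two objects.
    # T_drb_sensor     = pose_in_drb(T_drb_sensor_ref, T_sensor_ref_to_sensor)
    # T_drb_instrument = pose_in_drb(T_drb_instr_ref,  T_instr_ref_to_instrument)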
  • At block 1830, processing circuitry 3320 determines a point of contact on a touch sensor between the tracked instrument and the touch sensor. In some embodiments, the system includes the touch sensor and the touch sensor includes a touchscreen (e.g., a pressure-sensitive, resistance-sensitive, or capacitance-sensitive touchscreen). In some examples, the touch sensor is part of a display device. Determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
  • In additional or alternative embodiments, the touch sensor includes a plurality of touch sensors coupled together to form an opening. Determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
  • At block 1840, processing circuitry 3320 determines an expected point of contact on the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • In some embodiments, information about the shape of the tracked instrument is determined, and the information includes an intended position of a tip of the tracked instrument relative to a reference element coupled to the tracked instrument. Determining the point of contact on the touch sensor can include determining a point of contact between the tip of the tracked instrument and the touch sensor. Determining the expected point of contact on the touch sensor can include determining an expected point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
  • At block 1850, processing circuitry 3320 displays an indication of the expected point of contact. In some embodiments, the system includes a display device that includes the touch sensor. Determining the point of contact on the touch sensor between the tracked instrument and the touch sensor includes receiving an indication of the point of contact on the touch sensor from a user in response to displaying the indication of the expected point of contact.
  • At block 1860, processing circuitry 3320 determines whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
  • In some embodiments, determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor. Determining the expected point of contact on the touch sensor includes determining a plurality of expected points of contact between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument. Determining whether the tracked instrument is accurate includes determining whether the tracked instrument is accurate based on a difference between the plurality of points of contact and the plurality of expected points of contact.
  • At block 1870, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate.
  • In some embodiments, determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
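  • For illustration, blocks 1860 and 1870 can be reduced to a comparison of matched contact points against a tolerance followed by an action. The sketch below uses an RMS error and a hypothetical 2 mm threshold; neither the numeric values nor the names come from the disclosure.

    import numpy as np

    def check_instrument_accuracy(points_of_contact, expected_points, threshold_mm=2.0):
        """Return (is_accurate, rms_error_mm) from matched actual/expected contacts,
        each given as an Nx3 array in the same coordinate frame."""
        actual = np.asarray(points_of_contact, dtype=float)
        expected = np.asarray(expected_points, dtype=float)
        rms = float(np.sqrt(np.mean(np.sum((actual - expected) ** 2, axis=1))))
        return rms <= threshold_mm, rms

    def perform_action(is_accurate, rms_error_mm):
        """Example action: warn the user or hand the residual to a calibration step."""
        if not is_accurate:
            print(f"Accuracy check failed (RMS {rms_error_mm:.2f} mm); "
                  "instrument not suitable for use.")
        else:
            print(f"Instrument within tolerance (RMS {rms_error_mm:.2f} mm).")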
  • Various operations of FIG. 18 may be optional. For example, blocks 1850 and 1870 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on an image taken by a tracked imaging device are described below.
  • In some embodiments, multiple x-ray views of one or more tracked instruments are taken with a fluoroscope that is tracked by a navigation camera using an attached registration fixture. Such registration fixtures are commonly used for surgical navigation using fluoroscopy.
  • FIG. 14 illustrates an example of an imaging device 1410 including an x-ray emitter 1420 and an x-ray detector 1430. The registration fixture 1440 is coupled to a predetermined portion of the imaging device 1410.
  • The registration fixture 1440 typically includes fiducials in two planes at known positions. These fiducials are detected in the captured x-ray images and, using their known positions, the position of the emitter 1420 relative to the fixture is computed. The position of the detector 1430 is tracked via a navigation camera using the attached registration fixture 1440. When an instrument tracked with a reference element is brought between the emitter and the detector, its relative position with respect to the registration fixture 1440 is calculated.
  • The CAD model of the associated instrument tip can then be projected on the fluoroscopy image to achieve navigation. Since the registration fixture can move after the x-ray image is captured, often a different reference element, called a DRB, is rigidly attached to the patient, so that all tracked positions are relative to the fixed DRB.
  • Since the rendered position of an instrument is only 2D, at least two views, roughly orthogonal to each other, are used to track the instrument and obtain pseudo-3D navigation.
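  • The projection described in the preceding paragraphs amounts to a pinhole model with the x-ray emitter as the center of projection. The fragment below sketches that projection under simplifying assumptions (an ideal flat detector, no distortion correction); the function and parameter names are hypothetical.

    import numpy as np

    def project_point_to_detector(p, emitter, det_origin, det_u, det_v):
        """Project a 3D point onto an ideal flat detector using the x-ray emitter as
        the center of projection. det_origin is a detector corner and det_u/det_v are
        orthonormal in-plane axes; all inputs are 3-vectors (numpy arrays) in the same
        frame. Returns (u, v) detector-plane coordinates in mm."""
        n = np.cross(det_u, det_v)            # detector plane normal
        ray = p - emitter                     # ray from the source through the point
        t = np.dot(det_origin - emitter, n) / np.dot(ray, n)
        hit = emitter + t * ray               # intersection with the detector plane
        return float(np.dot(hit - det_origin, det_u)), float(np.dot(hit - det_origin, det_v))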
  • FIGS. 15A-B illustrate an example in which a wedge-shaped tracked instrument is placed between the emitter 1420 and detector 1430, such that its views are captured by the fluoroscope in two positions. The corresponding images 1570a-b below the fluoroscope show the instrument profile in different angles. Note that most instruments are solid and are made up of metal, which absorbs most x-rays and shows up dark on an x-ray image.
  • Since the theoretical position of the tip of the instrument 1550 is known via the attached reference element 1560, the actual projection can be compared to the theoretical projection by detecting the dark instrument shape in the bright image. Thus, the accuracy can be calculated without needing a divot.
  • If multiple instruments can be placed within the field of view of the x-ray image, accuracy of all of them can be calculated simultaneously.
  • FIG. 16 illustrates an example of operations for performing an accuracy check and/or calibrating a tracked instrument using images of the tracked instrument. The x-ray views of an instrument are obtained as described above. The theoretical position of the instrument tip projected in the views is then calculated. This serves as the initial position estimate of the instrument tip. Using the projection matrix, the theoretical view of the CAD model in each x-ray is then calculated. The optimization tweaks the position and pose of the CAD model of the instrument to obtain a close match between the CAD view and the actual image, as shown in the algorithm below.
  • In some examples, this is the same problem as matching a CT scan to multiple fluoroscopy images in CT-fluoro registration, except in this case a CAD model is used instead of a CT scan to compute a dynamically rendered radiograph (“DRR”).
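  • Again for illustration only, each candidate pose of the CAD model can be turned into a binary silhouette on the detector and scored against the dark instrument region segmented from the actual x-ray, as in the sketch below. It reuses the hypothetical project_point_to_detector() helper from the earlier fragment and assumes the CAD model is available as a surface point cloud and the x-ray image is normalized to [0, 1].

    import numpy as np

    def cad_silhouette(cad_points, emitter, det_origin, det_u, det_v,
                       image_shape, pixel_mm):
        """Binary silhouette of a CAD point cloud projected onto the detector grid.
        Uses project_point_to_detector() from the earlier sketch."""
        mask = np.zeros(image_shape, dtype=bool)
        for p in np.asarray(cad_points, dtype=float):
            u, v = project_point_to_detector(p, emitter, det_origin, det_u, det_v)
            row, col = int(np.floor(v / pixel_mm)), int(np.floor(u / pixel_mm))
            if 0 <= row < image_shape[0] and 0 <= col < image_shape[1]:
                mask[row, col] = True
        return mask

    def silhouette_score(xray_image, cad_mask, dark_threshold=0.25):
        """Dice overlap between the dark (instrument) pixels of the x-ray and the
        projected CAD silhouette; 1.0 would be a perfect match."""
        instrument = np.asarray(xray_image) < dark_threshold
        inter = np.logical_and(instrument, cad_mask).sum()
        return 2.0 * inter / (instrument.sum() + cad_mask.sum() + 1e-9)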
  • In some embodiments, these operations do not rely on a sharp tipped instrument fitting snugly in a divot, and can be used for accuracy checks of all types of instrument tips.
  • In additional or alternative embodiments, these operations improve accuracy checks for instruments without a sharp or straight tip.
  • In additional or alternative embodiments, these operations allow re-calibration or correction of theoretical instrument tip location based on actual measurements.
  • In additional or alternative embodiments, these operations enable accuracy checks and re-calibration of multiple instruments simultaneously.
  • FIG. 19 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on a pair of images taken by an imaging device. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 1910, processing circuitry 3320 determines a first virtual position of an emitter. In some embodiments, the system includes a tracking camera and an imaging device including the emitter and a detector. Determining the first virtual position of the emitter includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a dynamic reference base (“DRB”)) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device. In additional or alternative embodiments, the virtual position of the emitter is determined based on predetermined information indicating a position of the emitter relative to the detector and a virtual position of the detector.
  • At block 1915, processing circuitry 3320 determines a first virtual position of a detector. In some embodiments, the system includes a tracking camera and an imaging device including the emitter and the detector. Determining the first virtual position of the detector includes: capturing, via the camera, an image of a reference element coupled to the imaging device; determining a virtual position of the reference element coupled to the imaging device (e.g., relative to a DRB) based on the image of the reference element coupled to the imaging device; and determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device.
  • At block 1920, processing circuitry 3320 determines a first virtual position of a tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector. In some embodiments, the system includes a tracking camera. Determining the first virtual position of the tracked instrument includes: determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument; capturing, via the camera, an image of the reference element coupled to the tracked instrument; determining a virtual position of the reference element coupled to the tracked instrument (e.g., relative to the DRB) based on the image of the reference element coupled to the tracked instrument; and determining the first virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument.
  • In additional or alternative embodiments, determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument.
  • At block 1925, processing circuitry 3320 determines a first expected image of the tracked instrument. In some embodiments, the first expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the first virtual position of the emitter, the first virtual position of the detector, the first virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • At block 1930, processing circuitry 3320 obtains a first image of the tracked instrument. In some embodiments, obtaining the first image of the tracked instrument includes receiving the first image from the imaging device.
  • At block 1935, processing circuitry 3320 rotates the imaging device (including the emitter and the detector). In some examples, the imaging device includes a C-arm or an O-arm imaging device.
  • At block 1940, processing circuitry 3320 determines a second virtual position of the emitter. In some embodiments, determining the second virtual position of the emitter includes receiving the second virtual position from a tracking system.
  • At block 1945, processing circuitry 3320 determines a second virtual position of the detector. In some embodiments, determining the second virtual position of the detector includes receiving the second virtual position from a tracking system.
  • At block 1950, processing circuitry 3320 determines a second virtual position of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector. In some embodiments, determining the second virtual position of the tracked instrument includes receiving the second virtual position from a tracking system.
  • In additional or alternative embodiments, the first virtual position of the tracked instrument is the second virtual position of the tracked instrument. For example, the imaging device can include at least one of a C-arm and an O-arm, and responsive to obtaining the first image, the imaging device can be rotated (block 1935) such that the second virtual position of the emitter is different than the first virtual position of the emitter and the second virtual position of the detector is different than the first virtual position of the detector. As a result, an image of the tracked instrument from a different perspective can be taken without moving the tracked instrument.
  • In additional or alternative embodiments, the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument, the first virtual position of the emitter is the second virtual position of the emitter, and the first virtual position of the detector is the second virtual position of the detector. For example, without rotating the imaging device, an image of the tracked instrument can be taken from a different perspective by moving the tracked instrument.
  • At block 1955, processing circuitry 3320 determines a second expected image of the tracked instrument. In some embodiments, the second expected image of the tracked instrument is determined by simulating operation of the emitter and the detector based on the second virtual position of the emitter, the second virtual position of the detector, the second virtual position of the tracked instrument, and a predetermined shape of the tracked instrument.
  • At block 1960, processing circuitry 3320 obtains a second image of the tracked instrument. In some embodiments, obtaining the second image of the tracked instrument includes receiving the second image from the imaging device.
  • At block 1965, processing circuitry 3320 determines whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image. In some embodiments, the first expected image, the second expected image, the first image, and the second image each include an image of the tip of the tracked instrument.
  • At block 1970, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate. In some embodiments, determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold. In some examples, performing the action includes outputting an indication that the tracked instrument is not suitable for use. In additional or alternative examples, performing the action includes calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
  • Various operations of FIG. 19 may be optional. For example, blocks 1935, 1940, 1945, and 1970 may be optional in some embodiments.
  • Embodiments that include performing an accuracy check and/or calibrating a tracked instrument based on comparison of an actual position with an expected position on a display device are described below.
  • In some embodiments, a display screen is available to show tracked instruments. In some examples, the display screen is near the surgical area and is already covered with a sterile drape. The screen may be large (e.g., 22 inches or larger). A reference element can be coupled to the display screen to allow it to be tracked by a navigation camera. A large reference element array can yield improved tracking accuracy and, in some examples, due to the large physical size, more than four optical markers can be used to improve the fidelity of tracking.
  • In additional or alternative embodiments, when a user brings a navigated instrument near the display screen, its position with respect to the reference element on the display screen is calculated. The theoretical position of the tracked tip of the instrument's CAD model is then shown on the display screen. The user can visually compare the accuracy of the physical position of the instrument tip with the position displayed on the screen. With the aid of a virtual measurement tool, the user can then assess the accuracy.
  • FIG. 17 illustrates an example of a display device 1710 displaying a theoretical position (front view 1730 and side view 1740) of the tip of a tracked instrument 1750. The display device 1710 has reference elements 1720 and the tracked instrument 1750 has reference elements 1760 for being tracked by a navigation camera.
  • In this example, the front view 1730 of the theoretical position of the tip of the tracked instrument 1750 is shown as a hollow triangle on the right half of the screen. The left half of the screen shows a side view 1740 of the theoretical position of the tip of the tracked instrument 1750, allowing assessment of the theoretical height of the tracked instrument 1750 above the screen.
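  • The on-screen rendering described above can be thought of as expressing the tracked tip in the display's own coordinate frame and then converting to pixels. The sketch below is hypothetical: it assumes a fixed transform from the display's reference element 1720 to the screen's corner and pixel axes is known, and that x and y lie in the plane of the glass with z the height above it.

    import numpy as np

    def tip_in_screen_frame(T_cam_screen_ref, T_screen_ref_screen,
                            T_cam_instr_ref, tip_in_instr_ref):
        """Theoretical tip position expressed in the screen frame (mm):
        x and y in the plane of the glass, z the height above the screen."""
        T_cam_screen = T_cam_screen_ref @ T_screen_ref_screen
        tip_cam = T_cam_instr_ref @ np.append(tip_in_instr_ref, 1.0)
        return (np.linalg.inv(T_cam_screen) @ tip_cam)[:3]

    def front_view_pixels(tip_screen_mm, pixels_per_mm):
        """Front-view pixel location; a side view would plot y versus z instead."""
        x_mm, y_mm, _height_mm = tip_screen_mm
        return int(round(x_mm * pixels_per_mm)), int(round(y_mm * pixels_per_mm))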
  • In some embodiments, the display device can be used for performing an accuracy check of any shape of tracked instrument tip. Even unconventional tips, such as a hook, can be easily visualized on the screen.
  • In additional or alternative embodiments, the same display screen can be used for an accuracy check of multiple instruments. In additional or alternative embodiments, the screen array is unlikely to be damaged during surgery due to splatter of blood or other smudges, since it is typically much farther from the surgical field compared to tracked instruments.
  • In additional or alternative embodiments, if the surface of the display screen can sense the touch of the instrument tip, the accuracy can be calculated as well instead of relying on visual assessment.
  • In some embodiments, using the display device to perform an accuracy check of a tracked instrument can improve the fidelity of the reference element array used for the accuracy check and the consistency of accuracy checks.
  • In additional or alternative embodiments, using the display device to perform an accuracy check of a tracked instrument can improve accuracy check workflow for instruments without a sharp, straight tip.
  • In additional or alternative embodiments, using the display device to perform an accuracy check of a tracked instrument can allow the user to visually inspect and assess the accuracy.
  • FIG. 20 illustrates an example of operations performed by a system to perform an accuracy check and/or calibration of a tracked instrument based on displaying a virtual position of the tracked instrument on a display device. Although the operations are described below as being performed by the accuracy and calibration module 3300, any suitable system (e.g., an imaging system or a tracking system) can perform these operations.
  • At block 2010, processing circuitry 3320 determines a virtual position of a tracked instrument relative to a display device.
  • At block 2020, processing circuitry 3320 displays an indication of the virtual position of the tracked instrument on the display device. In some embodiments, the processing circuitry determines an intended shape of the tracked instrument, for example, an accurate and/or undamaged shape of the tracked instrument. Displaying the indication of the virtual position of the tracked instrument includes: displaying, on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and displaying, on a second part of the display device, a second portion of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
  • At block 2030, processing circuitry 3320 receives an indication of an actual position of the tracked instrument relative to the display device. In some embodiments, receiving the actual position of the tracked instrument includes receiving an indication from a user.
  • At block 2040, processing circuitry 3320 determines whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
  • At block 2050, processing circuitry 3320 performs an action based on whether the tracked instrument is accurate. In some embodiments, performing the action includes, responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use. In additional or alternative embodiments, performing the action includes, responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
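  • One simple corrective calibration, sketched below purely for illustration, is to average the measured-minus-theoretical tip offsets expressed in the instrument's reference-element frame and fold that offset into the stored tip position. This is not the calibration procedure of the disclosure, and all names are assumptions.

    import numpy as np

    def corrected_tip_offset(T_cam_instr_ref_list, theoretical_tips_cam,
                             measured_tips_cam, tip_in_ref):
        """Average the per-sample tip error in the instrument reference-element frame
        and return an updated tip offset relative to that reference element.
        T_cam_instr_ref_list holds the 4x4 tracked poses at each sample; the tip
        lists hold matched 3-vectors in the camera frame."""
        errors = []
        for T, theo, meas in zip(T_cam_instr_ref_list, theoretical_tips_cam,
                                 measured_tips_cam):
            R_inv = T[:3, :3].T       # rotate the camera-frame error into the ref frame
            errors.append(R_inv @ (np.asarray(meas, float) - np.asarray(theo, float)))
        return np.asarray(tip_in_ref, float) + np.mean(errors, axis=0)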
  • Various operations of FIG. 20 may be optional. For example, block 2050 may be optional in some embodiments.
  • Further Definitions and Embodiments:
  • In the above description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
  • It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
  • As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
  • These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, microcode, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
  • It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
  • Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims (20)

What is claimed is:
1. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a virtual position within a virtual space of a touch sensor;
determining a virtual position within the virtual space of the tracked instrument;
determining a point of contact on the touch sensor between the tracked instrument and the touch sensor;
determining an expected point of contact on the touch sensor between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument; and
determining whether the tracked instrument is accurate based on a difference between the point of contact and the expected point of contact.
2. The method of claim 1, wherein determining the virtual position of the touch sensor includes:
determining information about a shape of the touch sensor relative to a reference element coupled to the touch sensor;
capturing an image of the reference element coupled to the touch sensor;
determining a virtual position of the reference element coupled to the touch sensor based on the image of the reference element coupled to the touch sensor, the virtual position of the reference element coupled to the touch sensor including a virtual location and a virtual pose of the reference element coupled to the touch sensor; and
determining the virtual position of the touch sensor based on the information about the shape of the touch sensor and the virtual position of the reference element coupled to the touch sensor, the virtual position of the touch sensor including a virtual location and a virtual pose of the touch sensor, and
wherein determining the virtual position of the tracked instrument includes:
determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument;
capturing an image of the reference element coupled to the tracked instrument;
determining a virtual position of the reference element coupled to the tracked instrument based on the image of the reference element coupled to the tracked instrument, the virtual position of the reference element coupled to the tracked instrument including a virtual location and a virtual pose of the reference element coupled to the tracked instrument; and
determining the virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument, the virtual position of the tracked instrument including a virtual location and a virtual pose of the tracked instrument.
3. The method of claim 2, wherein determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument,
wherein determining the point of contact on the touch sensor includes determining a point of contact between the tip of the tracked instrument and the touch sensor, and
wherein determining the expected point of contact on the touch sensor includes determining a point of contact between the tip of the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument.
4. The method of claim 1, wherein the touch sensor includes a touchscreen, and
wherein determining the point of contact includes detecting a location on the touchscreen that the tracked instrument is touching.
5. The method of claim 4, wherein the touch sensor includes a plurality of touch sensors coupled together to form an opening, and
wherein determining the point of contact on the touch sensor includes determining a plurality of points of contact, each point of contact between one of the touch sensors of the plurality of touch sensors and the tracked instrument while the tracked instrument is positioned in the opening.
6. The method of claim 1, wherein determining the point of contact on the touch sensor includes determining a plurality of points of contact between the tracked instrument and the touch sensor,
wherein determining the expected point of contact on the touch sensor includes determining a plurality of expected points of contact between the tracked instrument and the touch sensor based on the virtual position of the touch sensor and the virtual position of the tracked instrument, and
wherein determining whether the tracked instrument is accurate includes determining whether the tracked instrument is accurate based on a difference between the plurality of points of contact and the plurality of expected points of contact.
7. The method of claim 1, wherein determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold,
the method further comprising:
outputting an indication that the tracked instrument is not suitable for use.
8. The method of claim 1, wherein determining whether the tracked instrument is accurate includes determining that the difference exceeds a predetermined threshold,
the method further comprising:
calibrating a tracking system used to track the tracked instrument using at least one of the point of contact, the expected point of contact, and the difference.
9. The method of claim 1, wherein a display device includes the touch sensor,
the method further comprising:
displaying, via the display device, an indication of the expected point of contact,
wherein determining the point of contact on the display device between the tracked instrument and the display device includes receiving an indication of the point of contact on the display device from a user.
10. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a first virtual position within a virtual space of an emitter of an imaging device;
determining a first virtual position within the virtual space of a detector of the imaging device;
determining a first virtual position within the virtual space of the tracked instrument while the tracked instrument is at a first physical position between the emitter and the detector;
determining a first expected image of the tracked instrument based on the first virtual position of the emitter, the first virtual position of the detector, and the first virtual position of the tracked instrument;
obtaining a first image of the tracked instrument while it is positioned at the first physical position between the emitter and the detector;
determining a second virtual position within the virtual space of the emitter of the imaging device;
determining a second virtual position within the virtual space of the detector of the imaging device;
determining a second virtual position within the virtual space of the tracked instrument while the tracked instrument is at a second physical position between the emitter and the detector;
determining a second expected image of the tracked instrument based on the second virtual position of the emitter, the second virtual position of the detector, and the second virtual position of the tracked instrument;
obtaining a second image of the tracked instrument while it is positioned between the emitter and the detector, the second image being different than the first image; and determining whether the tracked instrument is accurate based on the first expected image, the second expected image, the first image, and the second image.
11. The method of claim 10, wherein the imaging device includes the emitter and the detector,
wherein determining the first virtual position of the detector includes:
capturing an image of a reference element coupled to the imaging device;
determining a virtual position of the reference element coupled to the imaging device based on the image of the reference element coupled to the imaging device, the virtual position of the reference element coupled to the imaging device including a virtual location and a virtual pose of the reference element coupled to the imaging device; and
determining the virtual position of the detector based on predetermined information indicating a position of the detector relative to the reference element coupled to the imaging device and the virtual position of the reference element coupled to the imaging device, the virtual position of the detector including a virtual location and a virtual pose of the detector,
wherein determining the first virtual position of the emitter includes:
determining the virtual position of the emitter based on predetermined information indicating a position of the emitter relative to the detector and the virtual position of the detector, the virtual position of the emitter including a virtual location and a virtual pose of the emitter, and
wherein determining the first virtual position of the tracked instrument includes:
determining information about a shape of the tracked instrument relative to a reference element coupled to the tracked instrument;
capturing an image of the reference element coupled to the tracked instrument;
determining a virtual position of the reference element coupled to the tracked instrument relative to the DRB based on the image of the reference element coupled to the tracked instrument, the virtual position of the reference element coupled to the tracked instrument including a virtual location and a virtual pose of the reference element coupled to the tracked instrument; and
determining the first virtual position of the tracked instrument based on the shape of the tracked instrument and the reference element coupled to the tracked instrument, the virtual position of the tracked instrument including a virtual location and a virtual pose of the tracked instrument.
12. The method of claim 11, wherein determining the information about the shape of the tracked instrument includes determining an intended position of a tip of the tracked instrument relative to the reference element coupled to the tracked instrument, and
wherein the first expected image, the second expected image, the first image, and the second image each include an image of the tip.
13. The method of claim 11, wherein the first virtual position of the tracked instrument is the second virtual position of the tracked instrument, and
wherein the imaging device includes at least one of a C-arm and an O-arm,
the method further comprising:
responsive to obtaining the first image, rotating the imaging device such that the second virtual position of the emitter is different than the first virtual position of the emitter and that the second virtual position of the detector is different than the first virtual position of the detector.
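For illustration only and not part of the claims: a minimal sketch of how the second emitter and detector positions of claim 13 could be derived when the imaging device is rotated by a known angle about its isocenter while the instrument stays put. The axis, angle, and isocenter values are hypothetical.

```python
# Illustrative sketch (not part of the claims); the gantry axis, angle, and isocenter are hypothetical.
import numpy as np

def rotation_about_axis(axis: np.ndarray, angle_rad: float, center: np.ndarray) -> np.ndarray:
    """4x4 transform rotating about an arbitrary axis through 'center' (Rodrigues formula)."""
    axis = axis / np.linalg.norm(axis)
    kx, ky, kz = axis
    K = np.array([[0, -kz, ky], [kz, 0, -kx], [-ky, kx, 0]])
    R = np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = center - R @ center
    return T

gantry_rotation = rotation_about_axis(np.array([0.0, 1.0, 0.0]),
                                      np.deg2rad(90.0),
                                      center=np.array([0.0, 0.0, 0.0]))
# Second-shot geometry is the rotation applied to the first-shot poses:
# T_cam_detector_2 = gantry_rotation @ T_cam_detector_1
# T_cam_emitter_2  = gantry_rotation @ T_cam_emitter_1
```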
14. The method of claim 10, wherein the first virtual position of the tracked instrument is different than the second virtual position of the tracked instrument,
wherein the first virtual position of the emitter is the second virtual position of the emitter, and
wherein the first virtual position of the detector is the second virtual position of the detector.
15. The method of claim 10, wherein determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold,
the method further comprising:
outputting an indication that the tracked instrument is not suitable for use.
16. The method of claim 10, wherein determining whether the tracked instrument is accurate includes determining that a difference between the first expected image and/or the second expected image and the first image and/or the second image exceeds a predetermined threshold,
the method further comprising:
calibrating a tracking system used to track the tracked instrument using at least one of the first expected image, the second expected image, the first image, and the second image.
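For illustration only and not part of the claims: one possible calibration step in the spirit of claim 16, under the assumption that the discrepancy is attributed to the stored tip offset. The tip detected in the two images is triangulated back into the virtual space and re-expressed in the instrument reference-element frame; the projection matrices and frame names are hypothetical.

```python
# Illustrative sketch (not part of the claims); assumes two shots with known projection matrices.
import numpy as np

def triangulate(P1: np.ndarray, px1, P2: np.ndarray, px2) -> np.ndarray:
    """Linear (DLT) triangulation of one 3-D point from two projections."""
    u1, v1 = px1
    u2, v2 = px2
    A = np.vstack([u1 * P1[2] - P1[0],
                   v1 * P1[2] - P1[1],
                   u2 * P2[2] - P2[0],
                   v2 * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

def recalibrated_tip_offset(P1, px1, P2, px2, T_cam_instref: np.ndarray) -> np.ndarray:
    """New tip offset expressed in the instrument reference-element frame."""
    tip_cam = np.append(triangulate(P1, px1, P2, px2), 1.0)
    return (np.linalg.inv(T_cam_instref) @ tip_cam)[:3]
```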
17. A method of performing an accuracy check of a tracked instrument, the method comprising:
determining a virtual position within a virtual space of the tracked instrument relative to a display device;
displaying an indication of the virtual position of the tracked instrument on the display device;
receiving an indication of an actual position of the tracked instrument relative to the display device; and
determining whether the tracked instrument is accurate based on the indication of the actual position relative to the virtual position of the tracked instrument.
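For illustration only and not part of the claims: a minimal sketch of the display-based check of claim 17, assuming a touch-capable display so that the user's touch supplies the indication of the actual tip position. The scale factor and threshold are hypothetical.

```python
# Illustrative sketch (not part of the claims); mm-per-pixel scale and threshold are hypothetical.
import numpy as np

def display_check(virtual_tip_px, touched_px, mm_per_px: float,
                  threshold_mm: float = 2.0) -> bool:
    """True when the touched position agrees with the displayed virtual tip position."""
    error_px = np.linalg.norm(np.asarray(virtual_tip_px, dtype=float) -
                              np.asarray(touched_px, dtype=float))
    return error_px * mm_per_px <= threshold_mm
```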
18. The method of claim 17, further comprising:
determining an intended shape of the tracked instrument,
wherein displaying the indication of the virtual position of the tracked instrument includes:
displaying on a first part of the display device, a first portion of the intended shape of the tracked instrument in a front view perspective based on the virtual position of the tracked instrument; and
displaying on a second part of the display device, a second portion of the tracked instrument in a side view perspective based on the virtual position of the tracked instrument.
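For illustration only and not part of the claims: a minimal sketch of the two-panel display of claim 18, rendering the intended shape in a front view on one part of the display and a side view on another, both from the same virtual position. The shape points and the plotting layout are hypothetical.

```python
# Illustrative sketch (not part of the claims); shape points and layout are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

shape_points = np.array([[0.0, 0.0, 0.0],     # reference-element origin
                         [0.0, 0.0, 150.0]])  # intended tip, 150 mm along the shaft

fig, (front, side) = plt.subplots(1, 2)
front.plot(shape_points[:, 0], shape_points[:, 1])  # front view: x-y plane
front.set_title("Front view")
side.plot(shape_points[:, 2], shape_points[:, 1])   # side view: z-y plane
side.set_title("Side view")
plt.show()
```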
19. The method of claim 17, wherein receiving the actual position of the tracked instrument includes receiving an indication from a user.
20. The method of claim 17, further comprising at least one of:
responsive to determining whether the tracked instrument is accurate, outputting an indication of whether the tracked instrument is suitable for use; and
responsive to determining whether the tracked instrument is accurate, calibrating a tracking system used to track the tracked instrument using at least one of the virtual position of the tracked instrument and the actual position of the tracked instrument.
US17/663,024 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments Pending US20230368418A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/663,024 US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/662,666 US12394086B2 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments
US17/663,024 US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/662,666 Continuation US12394086B2 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments

Publications (1)

Publication Number Publication Date
US20230368418A1 true US20230368418A1 (en) 2023-11-16

Family

ID=88699268

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/662,666 Active 2043-10-06 US12394086B2 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments
US17/663,024 Pending US20230368418A1 (en) 2022-05-10 2022-05-12 Accuracy check and automatic calibration of tracked instruments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/662,666 Active 2043-10-06 US12394086B2 (en) 2022-05-10 2022-05-10 Accuracy check and automatic calibration of tracked instruments

Country Status (1)

Country Link
US (2) US12394086B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230225797A1 (en) * 2022-01-18 2023-07-20 Stryker European Operations Limited Technique For Determining A Need For A Re-Registration Of A Patient Tracker

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4397270A1 (en) * 2023-01-04 2024-07-10 Stryker European Operations Limited Technique for determining an object marker arrangement

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140357989A1 (en) * 2012-01-03 2014-12-04 Koninklijke Philips N.V. Position determining apparatus
EP3628263A1 (en) * 2018-09-27 2020-04-01 Koninklijke Philips N.V. Guidance in lung intervention procedures
US11269406B1 (en) * 2019-10-24 2022-03-08 Facebook Technologies, Llc Systems and methods for calibrating eye tracking

Family Cites Families (554)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2614083B2 (en) 1976-04-01 1979-02-08 Siemens Ag, 1000 Berlin Und 8000 Muenchen X-ray film device for the production of transverse slice images
US5354314A (en) 1988-12-23 1994-10-11 Medical Instrumentation And Diagnostics Corporation Three-dimensional beam localization apparatus and microscope for stereotactic diagnoses or surgery mounted on robotic type arm
US5246010A (en) 1990-12-11 1993-09-21 Biotrine Corporation Method and apparatus for exhalation analysis
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5631973A (en) 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
US6963792B1 (en) 1992-01-21 2005-11-08 Sri International Surgical method
US5657429A (en) 1992-08-10 1997-08-12 Computer Motion, Inc. Automated endoscope system optimal positioning
US5397323A (en) 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
WO1996011624A2 (en) 1994-10-07 1996-04-25 St. Louis University Surgical navigation systems including reference and localization frames
DE69417229T2 (en) 1993-05-14 1999-07-08 Sri International, Menlo Park, Calif. SURGERY DEVICE
JP3378401B2 (en) 1994-08-30 2003-02-17 株式会社日立メディコ X-ray equipment
US6646541B1 (en) 1996-06-24 2003-11-11 Computer Motion, Inc. General purpose distributed operating room control system
US6978166B2 (en) 1994-10-07 2005-12-20 Saint Louis University System for use in displaying images of a body part
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
US5887121A (en) 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
US6122541A (en) 1995-05-04 2000-09-19 Radionics, Inc. Head band for frameless stereotactic registration
US5649956A (en) 1995-06-07 1997-07-22 Sri International System and method for releasably holding a surgical instrument
US5825982A (en) 1995-09-15 1998-10-20 Wright; James Head cursor control interface for an automated endoscope system for optimal positioning
US5772594A (en) 1995-10-17 1998-06-30 Barrick; Earl F. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US5855583A (en) 1996-02-20 1999-01-05 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
SG64340A1 (en) 1996-02-27 1999-04-27 Inst Of Systems Science Nation Curved surgical instruments and methods of mapping a curved path for stereotactic surgery
US6167145A (en) 1996-03-29 2000-12-26 Surgical Navigation Technologies, Inc. Bone navigation system
US5792135A (en) 1996-05-20 1998-08-11 Intuitive Surgical, Inc. Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
US6167296A (en) 1996-06-28 2000-12-26 The Board Of Trustees Of The Leland Stanford Junior University Method for volumetric image navigation
US7302288B1 (en) 1996-11-25 2007-11-27 Z-Kat, Inc. Tool position indicator
US8529582B2 (en) 1996-12-12 2013-09-10 Intuitive Surgical Operations, Inc. Instrument interface of a robotic surgical system
US7727244B2 (en) 1997-11-21 2010-06-01 Intuitive Surgical Operation, Inc. Sterile surgical drape
US6205411B1 (en) 1997-02-21 2001-03-20 Carnegie Mellon University Computer-assisted surgery planner and intra-operative guidance system
US6012216A (en) 1997-04-30 2000-01-11 Ethicon, Inc. Stand alone swage apparatus
US5820559A (en) 1997-03-20 1998-10-13 Ng; Wan Sing Computerized boundary estimation in medical images
US5911449A (en) 1997-04-30 1999-06-15 Ethicon, Inc. Semi-automated needle feed method and apparatus
US6231565B1 (en) 1997-06-18 2001-05-15 United States Surgical Corporation Robotic arm DLUs for performing surgical tasks
EP1015944B1 (en) 1997-09-19 2013-02-27 Massachusetts Institute Of Technology Surgical robotic apparatus
US6226548B1 (en) 1997-09-24 2001-05-01 Surgical Navigation Technologies, Inc. Percutaneous registration apparatus and method for use in computer-assisted surgical navigation
US5951475A (en) 1997-09-25 1999-09-14 International Business Machines Corporation Methods and apparatus for registering CT-scan data to multiple fluoroscopic images
US5987960A (en) 1997-09-26 1999-11-23 Picker International, Inc. Tool calibrator
US6212419B1 (en) 1997-11-12 2001-04-03 Walter M. Blume Method and apparatus using shaped field of repositionable magnet to guide implant
US6157853A (en) 1997-11-12 2000-12-05 Stereotaxis, Inc. Method and apparatus using shaped field of repositionable magnet to guide implant
US6031888A (en) 1997-11-26 2000-02-29 Picker International, Inc. Fluoro-assist feature for a diagnostic imaging device
US6165170A (en) 1998-01-29 2000-12-26 International Business Machines Corporation Laser dermablator and dermablation
US6949106B2 (en) 1998-02-24 2005-09-27 Endovia Medical, Inc. Surgical instrument
FR2779339B1 (en) 1998-06-09 2000-10-13 Integrated Surgical Systems Sa MATCHING METHOD AND APPARATUS FOR ROBOTIC SURGERY, AND MATCHING DEVICE COMPRISING APPLICATION
US6477400B1 (en) 1998-08-20 2002-11-05 Sofamor Danek Holdings, Inc. Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
DE19839825C1 (en) 1998-09-01 1999-10-07 Siemens Ag Diagnostic X=ray device
US6033415A (en) 1998-09-14 2000-03-07 Integrated Surgical Systems System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system
DE19842798C1 (en) 1998-09-18 2000-05-04 Howmedica Leibinger Gmbh & Co Calibration device
AU6421599A (en) 1998-10-09 2000-05-01 Surgical Navigation Technologies, Inc. Image guided vertebral distractor
US6659939B2 (en) 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US8527094B2 (en) 1998-11-20 2013-09-03 Intuitive Surgical Operations, Inc. Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures
US7125403B2 (en) 1998-12-08 2006-10-24 Intuitive Surgical In vivo accessories for minimally invasive robotic surgery
US6325808B1 (en) 1998-12-08 2001-12-04 Advanced Realtime Control Systems, Inc. Robotic system, docking station, and surgical tool for collaborative control in minimally invasive surgery
US6322567B1 (en) 1998-12-14 2001-11-27 Integrated Surgical Systems, Inc. Bone motion tracking system
US6451027B1 (en) 1998-12-16 2002-09-17 Intuitive Surgical, Inc. Devices and methods for moving an image capture device in telesurgical systems
US7016457B1 (en) 1998-12-31 2006-03-21 General Electric Company Multimode imaging system for generating high quality images
DE19905974A1 (en) 1999-02-12 2000-09-07 Siemens Ag Computer tomography scanning method using multi-line detector
US6560354B1 (en) 1999-02-16 2003-05-06 University Of Rochester Apparatus and method for registration of images to physical space using a weighted combination of points and surfaces
US6144875A (en) 1999-03-16 2000-11-07 Accuray Incorporated Apparatus and method for compensating for respiratory and patient motion during treatment
US6778850B1 (en) 1999-03-16 2004-08-17 Accuray, Inc. Frameless radiosurgery treatment system and method
US6501981B1 (en) 1999-03-16 2002-12-31 Accuray, Inc. Apparatus and method for compensating for respiratory and patient motions during treatment
US6470207B1 (en) 1999-03-23 2002-10-22 Surgical Navigation Technologies, Inc. Navigational guidance via computer-assisted fluoroscopic imaging
JP2000271110A (en) 1999-03-26 2000-10-03 Hitachi Medical Corp Medical x-ray system
US6594552B1 (en) 1999-04-07 2003-07-15 Intuitive Surgical, Inc. Grip strength with tactile feedback for robotic surgery
US6424885B1 (en) 1999-04-07 2002-07-23 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US6565554B1 (en) 1999-04-07 2003-05-20 Intuitive Surgical, Inc. Friction compensation in a minimally invasive surgical apparatus
US6301495B1 (en) 1999-04-27 2001-10-09 International Business Machines Corporation System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan
DE19927953A1 (en) 1999-06-18 2001-01-11 Siemens Ag X=ray diagnostic apparatus
US6314311B1 (en) 1999-07-28 2001-11-06 Picker International, Inc. Movable mirror laser registration system
US6788018B1 (en) 1999-08-03 2004-09-07 Intuitive Surgical, Inc. Ceiling and floor mounted surgical robot set-up arms
US7594912B2 (en) 2004-09-30 2009-09-29 Intuitive Surgical, Inc. Offset remote center manipulator for robotic surgery
US9492235B2 (en) 1999-09-17 2016-11-15 Intuitive Surgical Operations, Inc. Manipulator arm-to-patient collision avoidance using a null-space
US8271130B2 (en) 2009-03-09 2012-09-18 Intuitive Surgical Operations, Inc. Master controller having redundant degrees of freedom and added forces to create internal motion
US8004229B2 (en) 2005-05-19 2011-08-23 Intuitive Surgical Operations, Inc. Software center and highly configurable robotic systems for surgery and other uses
US6312435B1 (en) 1999-10-08 2001-11-06 Intuitive Surgical, Inc. Surgical instrument with extended reach for use in minimally invasive surgery
US7366562B2 (en) 2003-10-17 2008-04-29 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6499488B1 (en) 1999-10-28 2002-12-31 Winchester Development Associates Surgical sensor
US8644907B2 (en) 1999-10-28 2014-02-04 Medtronic Navigaton, Inc. Method and apparatus for surgical navigation
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
US8239001B2 (en) 2003-10-17 2012-08-07 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US6235038B1 (en) 1999-10-28 2001-05-22 Medtronic Surgical Navigation Technologies System for translation of electromagnetic and optical localization systems
US20010036302A1 (en) 1999-12-10 2001-11-01 Miller Michael I. Method and apparatus for cross modality image registration
US7635390B1 (en) 2000-01-14 2009-12-22 Marctec, Llc Joint replacement component having a modular articulating surface
US6377011B1 (en) 2000-01-26 2002-04-23 Massachusetts Institute Of Technology Force feedback user interface for minimally invasive surgical simulator and teleoperator and other similar apparatus
WO2001056007A1 (en) 2000-01-28 2001-08-02 Intersense, Inc. Self-referenced tracking
WO2001064124A1 (en) 2000-03-01 2001-09-07 Surgical Navigation Technologies, Inc. Multiple cannula image guided tool for image guided procedures
EP1265547A1 (en) 2000-03-15 2002-12-18 Orthosoft Inc. Automatic calibration system for computer-aided surgical instruments
US6535756B1 (en) 2000-04-07 2003-03-18 Surgical Navigation Technologies, Inc. Trajectory storage apparatus and method for surgical navigation system
US6490475B1 (en) 2000-04-28 2002-12-03 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6856827B2 (en) 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6856826B2 (en) 2000-04-28 2005-02-15 Ge Medical Systems Global Technology Company, Llc Fluoroscopic tracking and visualization system
US6614453B1 (en) 2000-05-05 2003-09-02 Koninklijke Philips Electronics, N.V. Method and apparatus for medical image display for surgical tool planning and navigation in clinical environments
US6645196B1 (en) 2000-06-16 2003-11-11 Intuitive Surgical, Inc. Guided tool change
US6782287B2 (en) 2000-06-27 2004-08-24 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for tracking a medical instrument based on image registration
US6837892B2 (en) 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
US6902560B1 (en) 2000-07-27 2005-06-07 Intuitive Surgical, Inc. Roll-pitch-roll surgical tool
DE10037491A1 (en) 2000-08-01 2002-02-14 Stryker Leibinger Gmbh & Co Kg Process for three-dimensional visualization of structures inside the body
US6823207B1 (en) 2000-08-26 2004-11-23 Ge Medical Systems Global Technology Company, Llc Integrated fluoroscopic surgical navigation and imaging workstation with command protocol
EP1323120B1 (en) 2000-09-25 2018-11-14 Z-Kat Inc. Fluoroscopic registration artifact with optical and/or magnetic markers
AU2002215822A1 (en) 2000-10-23 2002-05-06 Deutsches Krebsforschungszentrum Stiftung Des Offentlichen Rechts Method, device and navigation aid for navigation during medical interventions
US6718194B2 (en) 2000-11-17 2004-04-06 Ge Medical Systems Global Technology Company, Llc Computer assisted intramedullary rod surgery system with enhanced features
US6666579B2 (en) 2000-12-28 2003-12-23 Ge Medical Systems Global Technology Company, Llc Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system
US6840938B1 (en) 2000-12-29 2005-01-11 Intuitive Surgical, Inc. Bipolar cauterizing instrument
US7043961B2 (en) 2001-01-30 2006-05-16 Z-Kat, Inc. Tool calibrator and tracker system
US7220262B1 (en) 2001-03-16 2007-05-22 Sdgi Holdings, Inc. Spinal fixation system and related methods
FR2822674B1 (en) 2001-04-03 2003-06-27 Scient X STABILIZED INTERSOMATIC MELTING SYSTEM FOR VERTEBERS
WO2002083003A1 (en) 2001-04-11 2002-10-24 Clarke Dana S Tissue structure identification in advance of instrument
US6994708B2 (en) 2001-04-19 2006-02-07 Intuitive Surgical Robotic tool with monopolar electro-surgical scissors
US7824401B2 (en) 2004-10-08 2010-11-02 Intuitive Surgical Operations, Inc. Robotic tool with wristed monopolar electrosurgical end effectors
US8398634B2 (en) 2002-04-18 2013-03-19 Intuitive Surgical Operations, Inc. Wristed robotic surgical tool for pluggable end-effectors
US6783524B2 (en) 2001-04-19 2004-08-31 Intuitive Surgical, Inc. Robotic surgical tool with ultrasound cauterizing and cutting instrument
US6636757B1 (en) 2001-06-04 2003-10-21 Surgical Navigation Technologies, Inc. Method and apparatus for electromagnetic navigation of a surgical probe near a metal object
US7607440B2 (en) 2001-06-07 2009-10-27 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
WO2002100284A1 (en) 2001-06-13 2002-12-19 Volume Interactions Pte Ltd A guide system
US6584339B2 (en) 2001-06-27 2003-06-24 Vanderbilt University Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery
US7063705B2 (en) 2001-06-29 2006-06-20 Sdgi Holdings, Inc. Fluoroscopic locator and registration device
AU2002322374B2 (en) 2001-06-29 2006-10-26 Intuitive Surgical, Inc. Platform link wrist mechanism
US20040243147A1 (en) 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller
ITMI20011759A1 (en) 2001-08-09 2003-02-09 Nuovo Pignone Spa SCRAPER DEVICE FOR PISTON ROD OF ALTERNATIVE COMPRESSORS
US7708741B1 (en) 2001-08-28 2010-05-04 Marctec, Llc Method of preparing bones for knee replacement surgery
US6728599B2 (en) 2001-09-07 2004-04-27 Computer Motion, Inc. Modularity system for computer assisted surgery
US6587750B2 (en) 2001-09-25 2003-07-01 Intuitive Surgical, Inc. Removable infinite roll master grip handle and touch sensor for robotic surgery
US6619840B2 (en) 2001-10-15 2003-09-16 Koninklijke Philips Electronics N.V. Interventional volume scanner
US6839612B2 (en) 2001-12-07 2005-01-04 Institute Surgical, Inc. Microwrist system for surgical procedures
US6947786B2 (en) 2002-02-28 2005-09-20 Surgical Navigation Technologies, Inc. Method and apparatus for perspective inversion
US8996169B2 (en) 2011-12-29 2015-03-31 Mako Surgical Corp. Neural monitor-based dynamic haptics
EP1485697A2 (en) 2002-03-19 2004-12-15 Breakaway Imaging, Llc Computer tomograph with a detector following the movement of a pivotable x-ray source
WO2003086714A2 (en) 2002-04-05 2003-10-23 The Trustees Of Columbia University In The City Of New York Robotic scrub nurse
US7099428B2 (en) 2002-06-25 2006-08-29 The Regents Of The University Of Michigan High spatial resolution X-ray computed tomography (CT) system
US7248914B2 (en) 2002-06-28 2007-07-24 Stereotaxis, Inc. Method of navigating medical devices in the presence of radiopaque material
US7630752B2 (en) 2002-08-06 2009-12-08 Stereotaxis, Inc. Remote control of medical devices using a virtual device interface
US7231063B2 (en) 2002-08-09 2007-06-12 Intersense, Inc. Fiducial detection system
WO2004015369A2 (en) 2002-08-09 2004-02-19 Intersense, Inc. Motion tracking system and method
CA2437286C (en) 2002-08-13 2008-04-29 Garnette Roy Sutherland Microsurgical robot system
US6892090B2 (en) 2002-08-19 2005-05-10 Surgical Navigation Technologies, Inc. Method and apparatus for virtual endoscopy
US7331967B2 (en) 2002-09-09 2008-02-19 Hansen Medical, Inc. Surgical instrument coupling mechanism
ES2204322B1 (en) 2002-10-01 2005-07-16 Consejo Sup. De Invest. Cientificas FUNCTIONAL BROWSER.
JP3821435B2 (en) 2002-10-18 2006-09-13 松下電器産業株式会社 Ultrasonic probe
US7319897B2 (en) 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Localization device display method and apparatus
US7318827B2 (en) 2002-12-02 2008-01-15 Aesculap Ag & Co. Kg Osteotomy procedure
US8814793B2 (en) 2002-12-03 2014-08-26 Neorad As Respiration monitor
US7386365B2 (en) 2004-05-04 2008-06-10 Intuitive Surgical, Inc. Tool grip calibration for robotic surgery
US7945021B2 (en) 2002-12-18 2011-05-17 Varian Medical Systems, Inc. Multi-mode cone beam CT radiotherapy simulator and treatment machine with a flat panel imager
US7505809B2 (en) 2003-01-13 2009-03-17 Mediguide Ltd. Method and system for registering a first image with a second image relative to the body of a patient
US7660623B2 (en) 2003-01-30 2010-02-09 Medtronic Navigation, Inc. Six degree of freedom alignment display for medical procedures
US7542791B2 (en) 2003-01-30 2009-06-02 Medtronic Navigation, Inc. Method and apparatus for preplanning a surgical procedure
WO2004069040A2 (en) 2003-02-04 2004-08-19 Z-Kat, Inc. Method and apparatus for computer assistance with intramedullary nail procedure
US6988009B2 (en) 2003-02-04 2006-01-17 Zimmer Technology, Inc. Implant registration device for surgical navigation system
US7083615B2 (en) 2003-02-24 2006-08-01 Intuitive Surgical Inc Surgical tool having electrocautery energy supply conductor with inhibited current leakage
JP4163991B2 (en) 2003-04-30 2008-10-08 株式会社モリタ製作所 X-ray CT imaging apparatus and imaging method
US9060770B2 (en) 2003-05-20 2015-06-23 Ethicon Endo-Surgery, Inc. Robotically-driven surgical instrument with E-beam driver
US7194120B2 (en) 2003-05-29 2007-03-20 Board Of Regents, The University Of Texas System Methods and systems for image-guided placement of implants
US7171257B2 (en) 2003-06-11 2007-01-30 Accuray Incorporated Apparatus and method for radiosurgery
US9002518B2 (en) 2003-06-30 2015-04-07 Intuitive Surgical Operations, Inc. Maximum torque driving of robotic surgical tools in robotic surgical systems
US7960935B2 (en) 2003-07-08 2011-06-14 The Board Of Regents Of The University Of Nebraska Robotic devices with agent delivery components and related methods
US7042184B2 (en) 2003-07-08 2006-05-09 Board Of Regents Of The University Of Nebraska Microrobot for surgical applications
US7324623B2 (en) 2003-07-15 2008-01-29 Koninklijke Philips Electronics N. V. Computed tomography scanner with large gantry bore
US7313430B2 (en) 2003-08-28 2007-12-25 Medtronic Navigation, Inc. Method and apparatus for performing stereotactic surgery
US7835778B2 (en) 2003-10-16 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for surgical navigation of a multiple piece construct for implantation
US20050171558A1 (en) 2003-10-17 2005-08-04 Abovitz Rony A. Neurosurgery targeting and delivery system for brain structures
US7840253B2 (en) 2003-10-17 2010-11-23 Medtronic Navigation, Inc. Method and apparatus for surgical navigation
US20050096502A1 (en) 2003-10-29 2005-05-05 Khalili Theodore M. Robotic surgical device
US9393039B2 (en) 2003-12-17 2016-07-19 Brainlab Ag Universal instrument or instrument set for computer guided surgery
US7466303B2 (en) 2004-02-10 2008-12-16 Sunnybrook Health Sciences Center Device and process for manipulating real and virtual objects in three-dimensional space
US7974681B2 (en) 2004-03-05 2011-07-05 Hansen Medical, Inc. Robotic catheter system
WO2005086062A2 (en) 2004-03-05 2005-09-15 Depuy International Limited Registration methods and apparatus
US20080269596A1 (en) 2004-03-10 2008-10-30 Ian Revie Orthpaedic Monitoring Systems, Methods, Implants and Instruments
US7657298B2 (en) 2004-03-11 2010-02-02 Stryker Leibinger Gmbh & Co. Kg System, device, and method for determining a position of an object
US8475495B2 (en) 2004-04-08 2013-07-02 Globus Medical Polyaxial screw
WO2005112563A2 (en) 2004-04-13 2005-12-01 The University Of Georgia Research Foundation, Inc. Virtual surgical system and methods
KR100617974B1 (en) 2004-04-22 2006-08-31 한국과학기술원 Laparoscopic device capable of command following
US7567834B2 (en) 2004-05-03 2009-07-28 Medtronic Navigation, Inc. Method and apparatus for implantation between two vertebral bodies
US7379790B2 (en) 2004-05-04 2008-05-27 Intuitive Surgical, Inc. Tool memory-based software upgrades for robotic surgery
US7974674B2 (en) 2004-05-28 2011-07-05 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for surface modeling
US8528565B2 (en) 2004-05-28 2013-09-10 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system and method for automated therapy delivery
FR2871363B1 (en) 2004-06-15 2006-09-01 Medtech Sa ROBOTIZED GUIDING DEVICE FOR SURGICAL TOOL
US7327865B2 (en) 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
ITMI20041448A1 (en) 2004-07-20 2004-10-20 Milano Politecnico APPARATUS FOR THE MERGER AND NAVIGATION OF ECOGRAPHIC AND VOLUMETRIC IMAGES OF A PATIENT USING A COMBINATION OF ACTIVE AND PASSIVE OPTICAL MARKERS FOR THE LOCALIZATION OF ECHOGRAPHIC PROBES AND SURGICAL INSTRUMENTS COMPARED TO THE PATIENT
US7440793B2 (en) 2004-07-22 2008-10-21 Sunita Chauhan Apparatus and method for removing abnormal tissue
CA2513202C (en) 2004-07-23 2015-03-31 Mehran Anvari Multi-purpose robotic operating system and method
US9072535B2 (en) 2011-05-27 2015-07-07 Ethicon Endo-Surgery, Inc. Surgical stapling instruments with rotatable staple deployment arrangements
GB2422759B (en) 2004-08-05 2008-07-16 Elekta Ab Rotatable X-ray scan apparatus with cone beam offset
US7702379B2 (en) 2004-08-25 2010-04-20 General Electric Company System and method for hybrid tracking in surgical navigation
US7555331B2 (en) 2004-08-26 2009-06-30 Stereotaxis, Inc. Method for surgical navigation utilizing scale-invariant registration between a navigation system and a localization system
DE102004042489B4 (en) 2004-08-31 2012-03-29 Siemens Ag Medical examination or treatment facility with associated method
US7726171B2 (en) 2004-09-15 2010-06-01 Ao Technology Ag Device and process for calibrating geometrical measurements of surgical tools and orienting the same in space
WO2006038145A1 (en) 2004-10-06 2006-04-13 Philips Intellectual Property & Standards Gmbh Computed tomography method
US7831294B2 (en) 2004-10-07 2010-11-09 Stereotaxis, Inc. System and method of surgical imagining with anatomical overlay for navigation of surgical devices
US7983733B2 (en) 2004-10-26 2011-07-19 Stereotaxis, Inc. Surgical navigation using a three-dimensional user interface
US7062006B1 (en) 2005-01-19 2006-06-13 The Board Of Trustees Of The Leland Stanford Junior University Computed tomography with increased field of view
US7763015B2 (en) 2005-01-24 2010-07-27 Intuitive Surgical Operations, Inc. Modular manipulator support for robotic surgery
US7837674B2 (en) 2005-01-24 2010-11-23 Intuitive Surgical Operations, Inc. Compact counter balance for robotic surgical systems
US20060184396A1 (en) 2005-01-28 2006-08-17 Dennis Charles L System and method for surgical navigation
US7231014B2 (en) 2005-02-14 2007-06-12 Varian Medical Systems Technologies, Inc. Multiple mode flat panel X-ray imaging system
US7623902B2 (en) 2005-03-07 2009-11-24 Leucadia 6, Llc System and methods for improved access to vertebral bodies for kyphoplasty, vertebroplasty, vertebral body biopsy or screw placement
US8375808B2 (en) 2005-12-30 2013-02-19 Intuitive Surgical Operations, Inc. Force sensing for surgical instruments
US8465771B2 (en) 2005-03-30 2013-06-18 The University Of Western Ontario Anisotropic hydrogels
US8496647B2 (en) 2007-12-18 2013-07-30 Intuitive Surgical Operations, Inc. Ribbed force sensor
US7720523B2 (en) 2005-04-20 2010-05-18 General Electric Company System and method for managing power deactivation within a medical imaging system
US8208988B2 (en) 2005-05-13 2012-06-26 General Electric Company System and method for controlling a medical imaging device
US8398541B2 (en) 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
EP1887961B1 (en) 2005-06-06 2012-01-11 Intuitive Surgical Operations, Inc. Laparoscopic ultrasound robotic surgical system
JP2007000406A (en) 2005-06-24 2007-01-11 Ge Medical Systems Global Technology Co Llc X-ray ct method and x-ray ct apparatus
US7840256B2 (en) 2005-06-27 2010-11-23 Biomet Manufacturing Corporation Image guided tracking array and method
US20070005002A1 (en) 2005-06-30 2007-01-04 Intuitive Surgical Inc. Robotic surgical instruments for irrigation, aspiration, and blowing
US20070038059A1 (en) 2005-07-07 2007-02-15 Garrett Sheffer Implant and instrument morphing
US20080302950A1 (en) 2005-08-11 2008-12-11 The Brigham And Women's Hospital, Inc. System and Method for Performing Single Photon Emission Computed Tomography (Spect) with a Focal-Length Cone-Beam Collimation
US7787699B2 (en) 2005-08-17 2010-08-31 General Electric Company Real-time integration and recording of surgical image data
US8800838B2 (en) 2005-08-31 2014-08-12 Ethicon Endo-Surgery, Inc. Robotically-controlled cable-based surgical end effectors
US20070073133A1 (en) 2005-09-15 2007-03-29 Schoenefeld Ryan J Virtual mouse for use in surgical navigation
US7643862B2 (en) * 2005-09-15 2010-01-05 Biomet Manufacturing Corporation Virtual mouse for use in surgical navigation
US7835784B2 (en) 2005-09-21 2010-11-16 Medtronic Navigation, Inc. Method and apparatus for positioning a reference frame
US8079950B2 (en) 2005-09-29 2011-12-20 Intuitive Surgical Operations, Inc. Autofocus and/or autoscaling in telesurgery
WO2007044301A2 (en) 2005-10-04 2007-04-19 Intersense, Inc. Tracking objects with markers
WO2007061890A2 (en) 2005-11-17 2007-05-31 Calypso Medical Technologies, Inc. Apparatus and methods for using an electromagnetic transponder in orthopedic procedures
US7711406B2 (en) 2005-11-23 2010-05-04 General Electric Company System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging
DE602005007509D1 (en) 2005-11-24 2008-07-24 Brainlab Ag Medical referencing system with gamma camera
US7762825B2 (en) 2005-12-20 2010-07-27 Intuitive Surgical Operations, Inc. Electro-mechanical interfaces to mount robotic surgical arms
US8672922B2 (en) 2005-12-20 2014-03-18 Intuitive Surgical Operations, Inc. Wireless communication in a robotic surgical system
US7689320B2 (en) 2005-12-20 2010-03-30 Intuitive Surgical Operations, Inc. Robotic surgical system with joint motion controller adapted to reduce instrument tip vibrations
US7819859B2 (en) 2005-12-20 2010-10-26 Intuitive Surgical Operations, Inc. Control system for reducing internally generated frictional and inertial resistance to manual positioning of a surgical manipulator
US8182470B2 (en) 2005-12-20 2012-05-22 Intuitive Surgical Operations, Inc. Telescoping insertion axis of a robotic surgical system
US7955322B2 (en) 2005-12-20 2011-06-07 Intuitive Surgical Operations, Inc. Wireless communication in a robotic surgical system
US8054752B2 (en) 2005-12-22 2011-11-08 Intuitive Surgical Operations, Inc. Synchronous data communication
ES2292327B1 (en) 2005-12-26 2009-04-01 Consejo Superior Investigaciones Cientificas MINI CAMERA GAMMA AUTONOMA AND WITH LOCATION SYSTEM, FOR INTRACHIRURGICAL USE.
JP5152993B2 (en) 2005-12-30 2013-02-27 インテュイティブ サージカル インコーポレイテッド Modular force sensor
US7930065B2 (en) 2005-12-30 2011-04-19 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
US7907166B2 (en) 2005-12-30 2011-03-15 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US7533892B2 (en) 2006-01-05 2009-05-19 Intuitive Surgical, Inc. Steering system for heavy mobile medical equipment
KR100731052B1 (en) 2006-01-23 2007-06-22 한양대학교 산학협력단 Computer Integrated Surgery Support System for Microinvasive Surgery
US8142420B2 (en) 2006-01-25 2012-03-27 Intuitive Surgical Operations Inc. Robotic arm with five-bar spherical linkage
US8162926B2 (en) 2006-01-25 2012-04-24 Intuitive Surgical Operations Inc. Robotic arm with five-bar spherical linkage
US20110290856A1 (en) 2006-01-31 2011-12-01 Ethicon Endo-Surgery, Inc. Robotically-controlled surgical instrument with force-feedback capabilities
US7845537B2 (en) 2006-01-31 2010-12-07 Ethicon Endo-Surgery, Inc. Surgical instrument having recording capabilities
EP1815950A1 (en) 2006-02-03 2007-08-08 The European Atomic Energy Community (EURATOM), represented by the European Commission Robotic surgical system for performing minimally invasive medical procedures
US8219177B2 (en) 2006-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US8219178B2 (en) 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US8526688B2 (en) 2006-03-09 2013-09-03 General Electric Company Methods and systems for registration of surgical navigation data and image data
US8208708B2 (en) 2006-03-30 2012-06-26 Koninklijke Philips Electronics N.V. Targeting method, targeting device, computer readable medium and program element
US20070233238A1 (en) 2006-03-31 2007-10-04 Medtronic Vascular, Inc. Devices for Imaging and Navigation During Minimally Invasive Non-Bypass Cardiac Procedures
CN101466313B (en) 2006-04-14 2012-11-14 威廉博蒙特医院 Scanning slot cone beam computed tomography and scanning focused spot cone beam computed tomography
US8021310B2 (en) 2006-04-21 2011-09-20 Nellcor Puritan Bennett Llc Work of breathing display for a ventilation system
US8112292B2 (en) 2006-04-21 2012-02-07 Medtronic Navigation, Inc. Method and apparatus for optimizing a therapy
US7940999B2 (en) 2006-04-24 2011-05-10 Siemens Medical Solutions Usa, Inc. System and method for learning-based 2D/3D rigid registration for image-guided surgery using Jensen-Shannon divergence
DE112007001214T5 (en) 2006-05-16 2009-04-02 Surgiceye Gmbh Method and apparatus for 3D acquisition, 3D visualization and computer-guided operation with nuclear probes
US20080004523A1 (en) 2006-06-29 2008-01-03 General Electric Company Surgical tool guide
DE102006032127B4 (en) 2006-07-05 2008-04-30 Aesculap Ag & Co. Kg Calibration method and calibration device for a surgical referencing unit
US20080013809A1 (en) 2006-07-14 2008-01-17 Bracco Imaging, Spa Methods and apparatuses for registration in image guided surgery
EP1886640B1 (en) 2006-08-08 2009-11-18 BrainLAB AG Planning method and system for adjusting a free-shaped bone implant
WO2008021671A2 (en) 2006-08-17 2008-02-21 Koninklijke Philips Electronics N. V. Computed tomography image acquisition
US8442619B2 (en) * 2006-08-30 2013-05-14 General Electric Company System and method for detecting errors in position tracking systems used for medical applications
DE102006041033B4 (en) 2006-09-01 2017-01-19 Siemens Healthcare Gmbh Method for reconstructing a three-dimensional image volume
US8231610B2 (en) 2006-09-06 2012-07-31 National Cancer Center Robotic surgical system for laparoscopic surgery
US8532741B2 (en) 2006-09-08 2013-09-10 Medtronic, Inc. Method and apparatus to optimize electrode placement for neurological stimulation
US8150497B2 (en) 2006-09-08 2012-04-03 Medtronic, Inc. System for navigating a planned procedure within a body
WO2008031077A2 (en) 2006-09-08 2008-03-13 Hansen Medical, Inc. Robotic surgical system with forward-oriented field of view guide instrument navigation
US8150498B2 (en) 2006-09-08 2012-04-03 Medtronic, Inc. System for identification of anatomical landmarks
US8248413B2 (en) 2006-09-18 2012-08-21 Stryker Corporation Visual navigation system for endoscopic surgery
EP2074383B1 (en) 2006-09-25 2016-05-11 Mazor Robotics Ltd. C-arm computerized tomography
US8660635B2 (en) 2006-09-29 2014-02-25 Medtronic, Inc. Method and apparatus for optimizing a computer assisted surgical procedure
US8052688B2 (en) 2006-10-06 2011-11-08 Wolf Ii Erich Electromagnetic apparatus and method for nerve localization during spinal surgery
US20080144906A1 (en) 2006-10-09 2008-06-19 General Electric Company System and method for video capture for fluoroscopy and navigation
US20080109012A1 (en) 2006-11-03 2008-05-08 General Electric Company System, method and apparatus for tableside remote connections of medical instruments and systems using wireless communications
US8551114B2 (en) 2006-11-06 2013-10-08 Human Robotics S.A. De C.V. Robotic surgical device
US20080108912A1 (en) 2006-11-07 2008-05-08 General Electric Company System and method for measurement of clinical parameters of the knee for use during knee replacement surgery
US20080108991A1 (en) 2006-11-08 2008-05-08 General Electric Company Method and apparatus for performing pedicle screw fusion surgery
US8682413B2 (en) 2006-11-15 2014-03-25 General Electric Company Systems and methods for automated tracker-driven image selection
US7935130B2 (en) 2006-11-16 2011-05-03 Intuitive Surgical Operations, Inc. Two-piece end-effectors for robotic surgical tools
CA2670261A1 (en) 2006-11-16 2008-05-29 Vanderbilt University Apparatus and methods of compensating for organ deformation, registration of internal structures to images, and applications of same
US8727618B2 (en) 2006-11-22 2014-05-20 Siemens Aktiengesellschaft Robotic device and method for trauma patient diagnosis and therapy
US7835557B2 (en) 2006-11-28 2010-11-16 Medtronic Navigation, Inc. System and method for detecting status of imaging device
US8320991B2 (en) 2006-12-01 2012-11-27 Medtronic Navigation Inc. Portable electromagnetic navigation system
US7683331B2 (en) 2006-12-08 2010-03-23 Rush University Medical Center Single photon emission computed tomography (SPECT) system for cardiac imaging
US7683332B2 (en) 2006-12-08 2010-03-23 Rush University Medical Center Integrated single photon emission computed tomography (SPECT)/transmission computed tomography (TCT) system for cardiac imaging
US8556807B2 (en) 2006-12-21 2013-10-15 Intuitive Surgical Operations, Inc. Hermetically sealed distal sensor endoscope
DE102006061178A1 (en) 2006-12-22 2008-06-26 Siemens Ag Medical system for carrying out and monitoring a minimal invasive intrusion, especially for treating electro-physiological diseases, has X-ray equipment and a control/evaluation unit
US20080177203A1 (en) 2006-12-22 2008-07-24 General Electric Company Surgical navigation planning system and method for placement of percutaneous instrumentation and implants
US20080161680A1 (en) 2006-12-29 2008-07-03 General Electric Company System and method for surgical navigation of motion preservation prosthesis
US9220573B2 (en) 2007-01-02 2015-12-29 Medtronic Navigation, Inc. System and method for tracking positions of uniform marker geometries
US8684253B2 (en) 2007-01-10 2014-04-01 Ethicon Endo-Surgery, Inc. Surgical instrument with wireless communication between a control unit of a robotic system and remote sensor
US8374673B2 (en) 2007-01-25 2013-02-12 Warsaw Orthopedic, Inc. Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control
WO2008095166A1 (en) 2007-02-01 2008-08-07 Interactive Neuroscience Center, Llc Surgical navigation
WO2008097540A2 (en) 2007-02-02 2008-08-14 Hansen Medical, Inc. Robotic surgical instrument and methods using bragg fiber sensors
US8600478B2 (en) 2007-02-19 2013-12-03 Medtronic Navigation, Inc. Automatic identification of instruments used with a surgical navigation system
US8233963B2 (en) 2007-02-19 2012-07-31 Medtronic Navigation, Inc. Automatic identification of tracked surgical devices using an electromagnetic localization system
DE102007009017B3 (en) 2007-02-23 2008-09-25 Siemens Ag Arrangement for supporting a percutaneous procedure
US10039613B2 (en) 2007-03-01 2018-08-07 Surgical Navigation Technologies, Inc. Method for localizing an imaging device with a surgical navigation system
US8098914B2 (en) 2007-03-05 2012-01-17 Siemens Aktiengesellschaft Registration of CT volumes with fluoroscopic images
US20080228068A1 (en) 2007-03-13 2008-09-18 Viswanathan Raju R Automated Surgical Navigation with Electro-Anatomical and Pre-Operative Image Data
US8821511B2 (en) 2007-03-15 2014-09-02 General Electric Company Instrument guide for use with a surgical navigation system
US20080235052A1 (en) 2007-03-19 2008-09-25 General Electric Company System and method for sharing medical information between image-guided surgery systems
US8150494B2 (en) 2007-03-29 2012-04-03 Medtronic Navigation, Inc. Apparatus for registering a physical space to image space
US7879045B2 (en) 2007-04-10 2011-02-01 Medtronic, Inc. System for guiding instruments having different sizes
EP2142132B1 (en) 2007-04-16 2012-09-26 NeuroArm Surgical, Ltd. System for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis
JP2010524547A (en) 2007-04-16 2010-07-22 ニューロアーム サージカル リミテッド Method, apparatus, and system for automated motion for medical robots
US8311611B2 (en) 2007-04-24 2012-11-13 Medtronic, Inc. Method for performing multiple registrations in a navigated procedure
US8108025B2 (en) 2007-04-24 2012-01-31 Medtronic, Inc. Flexible array for use in navigated surgery
US8301226B2 (en) 2007-04-24 2012-10-30 Medtronic, Inc. Method and apparatus for performing a navigated procedure
US20090012509A1 (en) 2007-04-24 2009-01-08 Medtronic, Inc. Navigated Soft Tissue Penetrating Laser System
US8010177B2 (en) 2007-04-24 2011-08-30 Medtronic, Inc. Intraoperative image registration
US8062364B1 (en) 2007-04-27 2011-11-22 Knee Creations, Llc Osteoarthritis treatment and device
DE102007022122B4 (en) 2007-05-11 2019-07-11 Deutsches Zentrum für Luft- und Raumfahrt e.V. Gripping device for a surgery robot arrangement
US8057397B2 (en) 2007-05-16 2011-11-15 General Electric Company Navigation and imaging system sychronized with respiratory and/or cardiac activity
US20080287771A1 (en) 2007-05-17 2008-11-20 General Electric Company Surgical navigation system with electrostatic shield
US8934961B2 (en) 2007-05-18 2015-01-13 Biomet Manufacturing, Llc Trackable diagnostic scope apparatus and methods of use
US20080300477A1 (en) 2007-05-30 2008-12-04 General Electric Company System and method for correction of automated image registration
US20080300478A1 (en) 2007-05-30 2008-12-04 General Electric Company System and method for displaying real-time state of imaged anatomy during a surgical procedure
US9301807B2 (en) 2007-06-13 2016-04-05 Intuitive Surgical Operations, Inc. Surgical system counterbalance
US9468412B2 (en) 2007-06-22 2016-10-18 General Electric Company System and method for accuracy verification for image based surgical navigation
EP3673855B1 (en) 2007-07-12 2021-09-08 Board of Regents of the University of Nebraska Systems of actuation in robotic devices
US7834484B2 (en) 2007-07-16 2010-11-16 Tyco Healthcare Group Lp Connection cable and method for activating a voltage-controlled generator
JP2009045428A (en) 2007-07-25 2009-03-05 Terumo Corp Operating mechanism, medical manipulator and surgical robot system
WO2009018086A2 (en) 2007-07-27 2009-02-05 The Cleveland Clinic Foundation Oblique lumbar interbody fusion
US8035685B2 (en) 2007-07-30 2011-10-11 General Electric Company Systems and methods for communicating video data between a mobile imaging system and a fixed monitor system
US8328818B1 (en) 2007-08-31 2012-12-11 Globus Medical, Inc. Devices and methods for treating bone
EP2197548B1 (en) 2007-09-19 2012-11-14 Walter A. Roberts Direct visualization robotic intra-operative radiation therapy applicator device
US20090080737A1 (en) 2007-09-25 2009-03-26 General Electric Company System and Method for Use of Fluoroscope and Computed Tomography Registration for Sinuplasty Navigation
US8224484B2 (en) 2007-09-30 2012-07-17 Intuitive Surgical Operations, Inc. Methods of user interface with alternate tool mode for robotic surgical tools
US9522046B2 (en) 2010-08-23 2016-12-20 Gip Robotic surgery system
CN101848679B (en) 2007-11-06 2014-08-06 皇家飞利浦电子股份有限公司 Nuclear medicine SPECT-CT machine with integrated asymmetric flat panel cone-beam CT and SPECT system
DE102007055203A1 (en) 2007-11-19 2009-05-20 Kuka Roboter Gmbh A robotic device, medical workstation and method for registering an object
US8561473B2 (en) 2007-12-18 2013-10-22 Intuitive Surgical Operations, Inc. Force sensor temperature compensation
US20100274120A1 (en) 2007-12-21 2010-10-28 Koninklijke Philips Electronics N.V. Synchronous interventional scanner
US8400094B2 (en) 2007-12-21 2013-03-19 Intuitive Surgical Operations, Inc. Robotic surgical system with patient support
US8864798B2 (en) 2008-01-18 2014-10-21 Globus Medical, Inc. Transverse connector
CA2716121A1 (en) 2008-01-30 2009-08-06 The Trustees Of Columbia University In The City Of New York Systems, devices, and methods for robot-assisted micro-surgical stenting
US20090198121A1 (en) 2008-02-01 2009-08-06 Martin Hoheisel Method and apparatus for coordinating contrast agent injection and image acquisition in c-arm computed tomography
US8573465B2 (en) 2008-02-14 2013-11-05 Ethicon Endo-Surgery, Inc. Robotically-controlled surgical end effector system with rotary actuated closure systems
US8696458B2 (en) 2008-02-15 2014-04-15 Thales Visionix, Inc. Motion tracking system and method using camera and non-camera sensors
US7925653B2 (en) 2008-02-27 2011-04-12 General Electric Company Method and system for accessing a group of objects in an electronic document
US20090228019A1 (en) 2008-03-10 2009-09-10 Yosef Gross Robotic surgical system
US8282653B2 (en) 2008-03-24 2012-10-09 Board Of Regents Of The University Of Nebraska System and methods for controlling surgical tool elements
US8808164B2 (en) 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
BRPI0822423B1 (en) 2008-03-28 2020-09-24 Telefonaktiebolaget Lm Ericsson (Publ) METHODS TO ENABLE DETECTION AND DETECTION OF A BASE STATION, BASE STATION OF A COMMUNICATION NETWORK, AND, NUCLEUS NETWORK NODE
US8333755B2 (en) 2008-03-31 2012-12-18 Intuitive Surgical Operations, Inc. Coupler to transfer controller motion from a robotic manipulator to an attached instrument
US7886743B2 (en) 2008-03-31 2011-02-15 Intuitive Surgical Operations, Inc. Sterile drape interface for robotic surgical instrument
US7843158B2 (en) 2008-03-31 2010-11-30 Intuitive Surgical Operations, Inc. Medical robotic system adapted to inhibit motions resulting in excessive end effector forces
US9002076B2 (en) 2008-04-15 2015-04-07 Medtronic, Inc. Method and apparatus for optimal trajectory planning
US9345875B2 (en) 2008-04-17 2016-05-24 Medtronic, Inc. Method and apparatus for cannula fixation for an array insertion tube set
US8803955B2 (en) 2008-04-26 2014-08-12 Intuitive Surgical Operations, Inc. Augmented stereoscopic visualization for a surgical robot using a camera unit with a modified prism
WO2009134367A1 (en) 2008-04-30 2009-11-05 Nanosys, Inc. Non-fouling surfaces for reflective spheres
US9579161B2 (en) 2008-05-06 2017-02-28 Medtronic Navigation, Inc. Method and apparatus for tracking a patient
US20110022229A1 (en) 2008-06-09 2011-01-27 Bae Sang Jang Master interface and driving method of surgical robot
TW201004607A (en) 2008-07-25 2010-02-01 Been-Der Yang Image guided navigation system and method thereof
US8054184B2 (en) 2008-07-31 2011-11-08 Intuitive Surgical Operations, Inc. Identification of surgical instrument attached to surgical robot
US8771170B2 (en) 2008-08-01 2014-07-08 Microaccess, Inc. Methods and apparatus for transesophageal microaccess surgery
JP2010035984A (en) 2008-08-08 2010-02-18 Canon Inc X-ray imaging apparatus
ES2608820T3 (en) 2008-08-15 2017-04-17 Stryker European Holdings I, Llc System and method of visualization of the inside of a body
US8500728B2 (en) 2008-08-18 2013-08-06 Encision, Inc. Enhanced control systems including flexible shielding and support systems for electrosurgical applications
DE102008041813B4 (en) 2008-09-04 2013-06-20 Carl Zeiss Microscopy Gmbh Method for the depth analysis of an organic sample
US7900524B2 (en) 2008-09-09 2011-03-08 Intersense, Inc. Monitoring tools
US8165658B2 (en) 2008-09-26 2012-04-24 Medtronic, Inc. Method and apparatus for positioning a guide relative to a base
US8073335B2 (en) 2008-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Operator input device for a robotic surgical system
EP2331945B1 (en) 2008-10-10 2018-05-30 Koninklijke Philips N.V. Method and apparatus to improve ct image acquisition using a displaced geometry
KR100944412B1 (en) 2008-10-13 2010-02-25 (주)미래컴퍼니 Surgical slave robot
US8781630B2 (en) 2008-10-14 2014-07-15 University Of Florida Research Foundation, Inc. Imaging platform to provide integrated navigation capabilities for surgical guidance
WO2010048160A2 (en) 2008-10-20 2010-04-29 The Johns Hopkins University Environment property estimation and graphical display
EP2179703B1 (en) 2008-10-21 2012-03-28 BrainLAB AG Integration of surgical instrument and display device for supporting image-based surgery
US8798933B2 (en) 2008-10-31 2014-08-05 The Invention Science Fund I, Llc Frozen compositions and methods for piercing a substrate
KR101075363B1 (en) 2008-10-31 2011-10-19 정창욱 Surgical Robot System Having Tool for Minimally Invasive Surgery
US9033958B2 (en) 2008-11-11 2015-05-19 Perception Raisonnement Action En Medecine Surgical robotic system
TWI435705B (en) 2008-11-20 2014-05-01 Been Der Yang Surgical position device and image guided navigation system using the same
JP5384521B2 (en) 2008-11-27 2014-01-08 株式会社日立メディコ Radiation imaging device
US8483800B2 (en) 2008-11-29 2013-07-09 General Electric Company Surgical navigation enabled imaging table environment
CA2745210C (en) 2008-12-01 2018-03-13 Mazor Robotics Ltd Robot guided oblique spinal stabilization
ES2341079B1 (en) 2008-12-11 2011-07-13 Fundacio Clinic Per A La Recerca Biomedica EQUIPMENT FOR IMPROVED VISION BY INFRARED VASCULAR STRUCTURES, APPLICABLE TO ASSIST PHYTOSCOPIC, LAPAROSCOPIC AND ENDOSCOPIC INTERVENTIONS AND SIGNAL TREATMENT PROCESS TO IMPROVE SUCH VISION.
US8021393B2 (en) 2008-12-12 2011-09-20 Globus Medical, Inc. Lateral spinous process spacer with deployable wings
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US8594841B2 (en) 2008-12-31 2013-11-26 Intuitive Surgical Operations, Inc. Visual force feedback in a minimally invasive surgical procedure
US8374723B2 (en) 2008-12-31 2013-02-12 Intuitive Surgical Operations, Inc. Obtaining force information in a minimally invasive surgical procedure
US8184880B2 (en) 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
EP2586374B1 (en) 2009-01-21 2015-03-18 Koninklijke Philips N.V. Method and apparatus for large field of view imaging and detection and compensation of motion artifacts
WO2010086374A1 (en) 2009-01-29 2010-08-05 Imactis Method and device for navigation of a surgical tool
KR101038417B1 (en) 2009-02-11 2011-06-01 주식회사 이턴 Surgical Robot System and Its Control Method
US8418073B2 (en) 2009-03-09 2013-04-09 Intuitive Surgical Operations, Inc. User interfaces for electrosurgical tools in robotic surgical systems
US8918207B2 (en) 2009-03-09 2014-12-23 Intuitive Surgical Operations, Inc. Operator input device for a robotic surgical system
US9737235B2 (en) 2009-03-09 2017-08-22 Medtronic Navigation, Inc. System and method for image-guided navigation
US8120301B2 (en) 2009-03-09 2012-02-21 Intuitive Surgical Operations, Inc. Ergonomic surgeon control console in robotic surgical systems
CA2755036A1 (en) 2009-03-10 2010-09-16 Mcmaster University Mobile robotic surgical system
US8335552B2 (en) 2009-03-20 2012-12-18 Medtronic, Inc. Method and apparatus for instrument placement
CN107510506A (en) * 2009-03-24 2017-12-26 伊顿株式会社 Utilize the surgical robot system and its control method of augmented reality
US20100249571A1 (en) 2009-03-31 2010-09-30 General Electric Company Surgical navigation system with wireless magnetoresistance tracking sensors
US8882803B2 (en) 2009-04-01 2014-11-11 Globus Medical, Inc. Orthopedic clamp and extension rod
WO2010124285A1 (en) 2009-04-24 2010-10-28 Medtronic Inc. Electromagnetic navigation of medical instruments for cardiothoracic surgery
CA2876807C (en) 2009-05-18 2016-07-12 Teleflex Medical Incorporated Method and devices for performing minimally invasive surgery
ES2388029B1 (en) 2009-05-22 2013-08-13 Universitat Politècnica De Catalunya ROBOTIC SYSTEM FOR LAPAROSCOPIC SURGERY.
CN101897593B (en) 2009-05-26 2014-08-13 清华大学 A computer tomography device and method
US8121249B2 (en) 2009-06-04 2012-02-21 Virginia Tech Intellectual Properties, Inc. Multi-parameter X-ray computed tomography
WO2011013164A1 (en) 2009-07-27 2011-02-03 株式会社島津製作所 Radiographic apparatus
US9001963B2 (en) 2009-08-06 2015-04-07 Koninklijke Philips N.V. Method and apparatus for generating computed tomography images with offset detector geometries
WO2011021192A1 (en) 2009-08-17 2011-02-24 Mazor Surgical Technologies Ltd. Device for improving the accuracy of manual operations
US9844414B2 (en) 2009-08-31 2017-12-19 Gregory S. Fischer System and method for robotic surgical intervention in a magnetic resonance imager
EP2298223A1 (en) 2009-09-21 2011-03-23 Stryker Leibinger GmbH & Co. KG Technique for registering image data of an object
US8465476B2 (en) 2009-09-23 2013-06-18 Intuitive Surgical Operations, Inc. Cannula mounting fixture
EP2482745B1 (en) 2009-09-30 2013-12-18 Brainlab AG Two-part medical tracking marker
NL1037348C2 (en) 2009-10-02 2011-04-05 Univ Eindhoven Tech Surgical robot, instrument manipulator, combination of an operating table and a surgical robot, and master-slave operating system.
US8685098B2 (en) 2010-06-25 2014-04-01 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US8679183B2 (en) 2010-06-25 2014-03-25 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US8556979B2 (en) 2009-10-15 2013-10-15 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US8062375B2 (en) 2009-10-15 2011-11-22 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US20110098553A1 (en) 2009-10-28 2011-04-28 Steven Robbins Automatic registration of images for image guided surgery
USD631966S1 (en) 2009-11-10 2011-02-01 Globus Medical, Inc. Basilar invagination implant
US8521331B2 (en) 2009-11-13 2013-08-27 Intuitive Surgical Operations, Inc. Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument
US20110137152A1 (en) 2009-12-03 2011-06-09 General Electric Company System and method for cooling components of a surgical navigation system
US8277509B2 (en) 2009-12-07 2012-10-02 Globus Medical, Inc. Transforaminal prosthetic spinal disc apparatus
US9750465B2 (en) 2009-12-10 2017-09-05 Koninklijke Philips N.V. Scanning system for differential phase contrast imaging
US8694075B2 (en) 2009-12-21 2014-04-08 General Electric Company Intra-operative registration for navigated surgical procedures
US8353963B2 (en) 2010-01-12 2013-01-15 Globus Medical, Inc. Expandable spacer and method for use thereof
JP5795599B2 (en) 2010-01-13 2015-10-14 コーニンクレッカ フィリップス エヌ ヴェ Image integration based registration and navigation for endoscopic surgery
US9381045B2 (en) 2010-01-13 2016-07-05 Jcbd, Llc Sacroiliac joint implant and sacroiliac joint instrument for fusing a sacroiliac joint
EP2524289B1 (en) 2010-01-14 2016-12-07 Brainlab AG Controlling and/or operating a medical device by means of a light pointer
US9039769B2 (en) 2010-03-17 2015-05-26 Globus Medical, Inc. Intervertebral nucleus and annulus implants and method of use thereof
US20140330288A1 (en) 2010-03-25 2014-11-06 Precision Automation And Robotics India Ltd. Articulating Arm for a Robotic Surgical Instrument System
US20110238080A1 (en) 2010-03-25 2011-09-29 Date Ranjit Robotic Surgical Instrument System
IT1401669B1 (en) 2010-04-07 2013-08-02 Sofar Spa ROBOTIC SURGERY SYSTEM WITH IMPROVED CONTROL.
US8870880B2 (en) 2010-04-12 2014-10-28 Globus Medical, Inc. Angling inserter tool for expandable vertebral implant
IT1399603B1 (en) 2010-04-26 2013-04-26 Scuola Superiore Di Studi Universitari E Di Perfez ROBOTIC SYSTEM FOR MINIMALLY INVASIVE SURGERY INTERVENTIONS
US8717430B2 (en) 2010-04-26 2014-05-06 Medtronic Navigation, Inc. System and method for radio-frequency imaging, registration, and localization
WO2011134083A1 (en) 2010-04-28 2011-11-03 Ryerson University System and methods for intraoperative guidance feedback
JP2013530028A (en) 2010-05-04 2013-07-25 パスファインダー セラピューティクス,インコーポレイテッド System and method for abdominal surface matching using pseudo features
US8738115B2 (en) 2010-05-11 2014-05-27 Siemens Aktiengesellschaft Method and apparatus for selective internal radiation therapy planning and implementation
DE102010020284A1 (en) 2010-05-12 2011-11-17 Siemens Aktiengesellschaft Determination of 3D positions and orientations of surgical objects from 2D X-ray images
US8603077B2 (en) 2010-05-14 2013-12-10 Intuitive Surgical Operations, Inc. Force transmission for robotic surgical instrument
US8883210B1 (en) 2010-05-14 2014-11-11 Musculoskeletal Transplant Foundation Tissue-derived tissuegenic implants, and methods of fabricating and using same
KR101181569B1 (en) 2010-05-25 2012-09-10 정창욱 Surgical robot system capable of implementing both single-port surgery mode and multi-port surgery mode and method for controlling same
US20110295370A1 (en) 2010-06-01 2011-12-01 Sean Suh Spinal Implants and Methods of Use Thereof
DE102010026674B4 (en) 2010-07-09 2012-09-27 Siemens Aktiengesellschaft Imaging device and radiotherapy device
US8675939B2 (en) 2010-07-13 2014-03-18 Stryker Leibinger GmbH & Co. KG Registration of anatomical data sets
WO2012007036A1 (en) 2010-07-14 2012-01-19 Brainlab Ag Method and system for determining an imaging direction and calibration of an imaging apparatus
US20120035507A1 (en) 2010-07-22 2012-02-09 Ivan George Device and method for measuring anatomic geometries
US8740882B2 (en) 2010-07-30 2014-06-03 Lg Electronics Inc. Medical robotic system and method of controlling the same
US8696549B2 (en) 2010-08-20 2014-04-15 Veran Medical Technologies, Inc. Apparatus and method for four dimensional soft tissue navigation in endoscopic applications
JP2012045278A (en) 2010-08-30 2012-03-08 Fujifilm Corp X-ray imaging apparatus and x-ray imaging method
SG188303A1 (en) 2010-09-01 2013-04-30 Agency Science Tech & Res A robotic device for use in image-guided robot assisted surgical training
KR20120030174A (en) 2010-09-17 2012-03-28 삼성전자주식회사 Surgery robot system and surgery apparatus and method for providing tactile feedback
EP2431003B1 (en) 2010-09-21 2018-03-21 Medizinische Universität Innsbruck Registration device, system, kit and method for a patient registration
US8679125B2 (en) 2010-09-22 2014-03-25 Biomet Manufacturing, Llc Robotic guided femoral head reshaping
US8657809B2 (en) 2010-09-29 2014-02-25 Stryker Leibinger GmbH & Co. KG Surgical navigation system
US8718346B2 (en) 2011-10-05 2014-05-06 Saferay Spine Llc Imaging system and method for use in surgical and interventional medical procedures
US8526700B2 (en) 2010-10-06 2013-09-03 Robert E. Isaacs Imaging system and method for surgical and interventional medical procedures
US9913693B2 (en) 2010-10-29 2018-03-13 Medtronic, Inc. Error correction techniques in surgical navigation
US8876866B2 (en) 2010-12-13 2014-11-04 Globus Medical, Inc. Spinous process fusion devices and methods thereof
EP3649937A1 (en) 2010-12-13 2020-05-13 Statera Spine, Inc. Methods, systems and devices for clinical data reporting and surgical navigation
CN107126634B (en) 2010-12-22 2021-04-27 优瑞技术公司 System and recording medium for image guidance during medical procedures
WO2012095755A1 (en) 2011-01-13 2012-07-19 Koninklijke Philips Electronics N.V. Intraoperative camera calibration for endoscopic surgery
KR101181613B1 (en) 2011-02-21 2012-09-10 윤상진 Surgical robot system for performing surgery based on displacement information determined by user designation and control method therefor
US20120226145A1 (en) 2011-03-03 2012-09-06 National University Of Singapore Transcutaneous robot-assisted ablation-device insertion navigation system
US9026247B2 (en) 2011-03-30 2015-05-05 University of Washington through its Center for Commercialization Motion and video capture for tracking and evaluating robotic surgery and associated systems and methods
WO2012131660A1 (en) 2011-04-01 2012-10-04 Ecole Polytechnique Federale De Lausanne (Epfl) Robotic system for spinal and other surgeries
US20120256092A1 (en) 2011-04-06 2012-10-11 General Electric Company Ct system for use in multi-modality imaging system
US20150213633A1 (en) 2011-04-06 2015-07-30 The Trustees Of Columbia University In The City Of New York System, method and computer-accessible medium for providing a panoramic cone beam computed tomography (cbct)
US10426554B2 (en) 2011-04-29 2019-10-01 The Johns Hopkins University System and method for tracking and navigation
JPWO2012169642A1 (en) 2011-06-06 2015-02-23 希 松本 Registration template manufacturing method
US8498744B2 (en) 2011-06-30 2013-07-30 Mako Surgical Corporation Surgical robotic systems with manual and haptic and/or active control modes
EP3588217A1 (en) 2011-07-11 2020-01-01 Board of Regents of the University of Nebraska Robotic surgical devices, systems and related methods
US8818105B2 (en) 2011-07-14 2014-08-26 Accuray Incorporated Image registration for image-guided surgery
KR20130015146A (en) 2011-08-02 2013-02-13 삼성전자주식회사 Method and apparatus for processing medical image, robotic surgery system using image guidance
US10866783B2 (en) 2011-08-21 2020-12-15 Transenterix Europe S.A.R.L. Vocally activated surgical control system
US9427330B2 (en) 2011-09-06 2016-08-30 Globus Medical, Inc. Spinal plate
US8864833B2 (en) 2011-09-30 2014-10-21 Globus Medical, Inc. Expandable fusion device and method of installation thereof
US9060794B2 (en) 2011-10-18 2015-06-23 Mako Surgical Corp. System and method for robotic surgery
US8894688B2 (en) 2011-10-27 2014-11-25 Globus Medical, Inc. Adjustable rod devices and methods of using the same
DE102011054910B4 (en) 2011-10-28 2013-10-10 Ovesco Endoscopy Ag Magnetic end effector and means for guiding and positioning same
US8693730B2 (en) 2011-11-15 2014-04-08 Macdonald Dettwiler & Associates Inc. Method of real-time tracking of moving/flexible surfaces
FR2983059B1 (en) 2011-11-30 2014-11-28 Medtech ROBOTIC-ASSISTED METHOD OF POSITIONING A SURGICAL INSTRUMENT IN RELATION TO THE BODY OF A PATIENT AND DEVICE FOR CARRYING OUT SAID METHOD
WO2013084221A1 (en) 2011-12-05 2013-06-13 Mazor Robotics Ltd. Active bed mount for surgical robot
KR101901580B1 (en) 2011-12-23 2018-09-28 삼성전자주식회사 Surgical robot and control method thereof
WO2013101917A1 (en) 2011-12-30 2013-07-04 Mako Surgical Corp. System for image-based robotic surgery
US9265583B2 (en) 2011-12-30 2016-02-23 Mako Surgical Corp. Method for image-based robotic surgery
FR2985167A1 (en) 2011-12-30 2013-07-05 Medtech ROBOTIZED MEDICAL METHOD FOR MONITORING PATIENT BREATHING AND CORRECTION OF ROBOTIC TRAJECTORY.
KR20130080909A (en) 2012-01-06 2013-07-16 삼성전자주식회사 Surgical robot and method for controlling the same
US9138297B2 (en) 2012-02-02 2015-09-22 Intuitive Surgical Operations, Inc. Systems and methods for controlling a robotic surgical system
EP2816966B1 (en) 2012-02-22 2023-10-25 Veran Medical Technologies, Inc. Steerable surgical catheter comprising a biopsy device at the distal end portion thereof
US11207132B2 (en) 2012-03-12 2021-12-28 Nuvasive, Inc. Systems and methods for performing spinal surgery
US8855822B2 (en) 2012-03-23 2014-10-07 Innovative Surgical Solutions, Llc Robotic surgical system with mechanomyography feedback
KR101946000B1 (en) 2012-03-28 2019-02-08 삼성전자주식회사 Robot system for surgery and control method thereof
US8888821B2 (en) 2012-04-05 2014-11-18 Warsaw Orthopedic, Inc. Spinal implant measuring system and method
JP6338570B2 (en) 2012-04-16 2018-06-06 ニューロロジカ・コーポレーション Imaging system with fixedly mounted reference markers
US20130272488A1 (en) 2012-04-16 2013-10-17 Neurologica Corp. Wireless imaging system
US20140142591A1 (en) 2012-04-24 2014-05-22 Auris Surgical Robotics, Inc. Method, apparatus and a system for robotic assisted surgery
US10383765B2 (en) 2012-04-24 2019-08-20 Auris Health, Inc. Apparatus and method for a global coordinate system for use in robotic surgery
WO2013166098A1 (en) 2012-05-01 2013-11-07 The Johns Hopkins University Improved method and apparatus for robotically assisted cochlear implant surgery
WO2013163800A2 (en) 2012-05-02 2013-11-07 医百科技股份有限公司 Oral surgery auxiliary guidance method
US9125556B2 (en) 2012-05-14 2015-09-08 Mazor Robotics Ltd. Robotic guided endoscope
EP2849650A4 (en) 2012-05-18 2016-01-20 Carestream Health Inc Cone beam computed tomography volumetric imaging system
KR20130132109A (en) 2012-05-25 2013-12-04 삼성전자주식회사 Supporting device and surgical robot system adopting the same
JP6313290B2 (en) 2012-06-01 2018-04-18 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Multi-port surgical robot system structure
EP4234185A3 (en) 2012-06-22 2023-09-20 Board of Regents of the University of Nebraska Local control robotic surgical devices
US20130345757A1 (en) 2012-06-22 2013-12-26 Shawn D. Stad Image Guided Intra-Operative Contouring Aid
US20140005678A1 (en) 2012-06-28 2014-01-02 Ethicon Endo-Surgery, Inc. Rotary drive arrangements for surgical instruments
US8880223B2 (en) 2012-07-16 2014-11-04 Florida Institute for Human & Machine Cognition Anthro-centric multisensory interface for sensory augmentation of telesurgery
US20140031664A1 (en) 2012-07-30 2014-01-30 Mako Surgical Corp. Radiographic imaging device
KR101997566B1 (en) 2012-08-07 2019-07-08 삼성전자주식회사 Surgical robot system and control method thereof
US9770305B2 (en) 2012-08-08 2017-09-26 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems, and related methods
CA2880622C (en) 2012-08-08 2021-01-12 Board Of Regents Of The University Of Nebraska Robotic surgical devices, systems and related methods
US10110785B2 (en) 2012-08-10 2018-10-23 Karl Storz Imaging, Inc. Deployable imaging system equipped with solid state imager
JP6220877B2 (en) 2012-08-15 2017-10-25 インテュイティブ サージカル オペレーションズ, インコーポレイテッド System and method for joint motion cancellation using the null space
KR20150058250A (en) 2012-08-24 2015-05-28 유니버시티 오브 휴스턴 Robotic device and systems for image-guided and robot-assisted surgery
US20140080086A1 (en) 2012-09-20 2014-03-20 Roger Chen Image Navigation Integrated Dental Implant System
US8892259B2 (en) 2012-09-26 2014-11-18 Innovative Surgical Solutions, LLC. Robotic surgical system with mechanomyography feedback
US9757160B2 (en) 2012-09-28 2017-09-12 Globus Medical, Inc. Device and method for treatment of spinal deformity
KR102038632B1 (en) 2012-11-06 2019-10-30 삼성전자주식회사 Surgical instrument, supporting device, and surgical robot system adopting the same
JP2016502435A (en) 2012-11-14 2016-01-28 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Smart drape for collision prevention
KR102079945B1 (en) 2012-11-22 2020-02-21 삼성전자주식회사 Surgical robot and method for controlling the surgical robot
US9393361B2 (en) 2012-12-14 2016-07-19 Medtronic, Inc. Method to determine a material distribution
US9008752B2 (en) 2012-12-14 2015-04-14 Medtronic, Inc. Method to determine distribution of a material by an infused magnetic resonance image contrast agent
DE102012025101A1 (en) 2012-12-20 2014-06-26 avateramedical GmbH Active positioning device of a surgical instrument and a surgical robotic system comprising it
US20150005784A2 (en) 2012-12-20 2015-01-01 avateramedical GmbH Device for Supporting and Positioning of a Surgical Instrument and/or an Endoscope for Use in Minimal-Invasive Surgery and a Surgical Robotic System
US9001962B2 (en) 2012-12-20 2015-04-07 Triple Ring Technologies, Inc. Method and apparatus for multiple X-ray imaging applications
US9002437B2 (en) 2012-12-27 2015-04-07 General Electric Company Method and system for position orientation correction in navigation
WO2014106262A1 (en) 2012-12-31 2014-07-03 Mako Surgical Corp. System for image-based robotic surgery
KR20140090374A (en) 2013-01-08 2014-07-17 삼성전자주식회사 Single port surgical robot and control method thereof
CN103969269B (en) 2013-01-31 2018-09-18 Ge医疗系统环球技术有限公司 Method and apparatus for geometric calibration CT scanner
US20140221819A1 (en) 2013-02-01 2014-08-07 David SARMENT Apparatus, system and method for surgical navigation
ES2804681T3 (en) 2013-02-04 2021-02-09 Childrens Nat Medical Ct Hybrid Control Surgical Robotic System
KR20140102465A (en) 2013-02-14 2014-08-22 삼성전자주식회사 Surgical robot and method for controlling the same
KR102117270B1 (en) 2013-03-06 2020-06-01 삼성전자주식회사 Surgical robot system and method for controlling the same
KR20140110620A (en) 2013-03-08 2014-09-17 삼성전자주식회사 Surgical robot system and operating method thereof
KR20140110685A (en) 2013-03-08 2014-09-17 삼성전자주식회사 Method for controlling single port surgical robot
US9314308B2 (en) 2013-03-13 2016-04-19 Ethicon Endo-Surgery, Llc Robotic ultrasonic surgical device with articulating end effector
KR102119534B1 (en) 2013-03-13 2020-06-05 삼성전자주식회사 Surgical robot and method for controlling the same
KR20140112207A (en) 2013-03-13 2014-09-23 삼성전자주식회사 Augmented reality imaging display system and surgical robot system comprising the same
CA2905948C (en) 2013-03-14 2022-01-11 Board Of Regents Of The University Of Nebraska Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers
US9629595B2 (en) 2013-03-15 2017-04-25 Hansen Medical, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US10667883B2 (en) 2013-03-15 2020-06-02 Virtual Incision Corporation Robotic surgical devices, systems, and related methods
KR102117273B1 (en) 2013-03-21 2020-06-01 삼성전자주식회사 Surgical robot system and method for controlling the same
KR20140121581A (en) 2013-04-08 2014-10-16 삼성전자주식회사 Surgical robot system
KR20140123122A (en) 2013-04-10 2014-10-22 삼성전자주식회사 Surgical robot and control method thereof
US9414859B2 (en) 2013-04-19 2016-08-16 Warsaw Orthopedic, Inc. Surgical rod measuring system and method
US8964934B2 (en) 2013-04-25 2015-02-24 Moshe Ein-Gal Cone beam CT scanning
KR20140129702A (en) 2013-04-30 2014-11-07 삼성전자주식회사 Surgical robot system and method for controlling the same
US20140364720A1 (en) 2013-06-10 2014-12-11 General Electric Company Systems and methods for interactive magnetic resonance imaging
DE102013012397B4 (en) 2013-07-26 2018-05-24 Rg Mechatronics Gmbh Surgical robot system
US10786283B2 (en) 2013-08-01 2020-09-29 Musc Foundation For Research Development Skeletal bone fixation mechanism
US20150085970A1 (en) 2013-09-23 2015-03-26 General Electric Company Systems and methods for hybrid scanning
JP6581973B2 (en) 2013-10-07 2019-09-25 テクニオン リサーチ アンド ディベロップメント ファンデーション リミテッド System for needle insertion and steering
WO2015054543A1 (en) 2013-10-09 2015-04-16 Nuvasive, Inc. Surgical spinal correction
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
ITBO20130599A1 (en) 2013-10-31 2015-05-01 Cefla Coop METHOD AND APPARATUS TO INCREASE THE FIELD OF VIEW IN A COMPUTERIZED TOMOGRAPHIC ACQUISITION WITH CONE-BEAM TECHNIQUE
US20150146847A1 (en) 2013-11-26 2015-05-28 General Electric Company Systems and methods for providing an x-ray imaging system with nearly continuous zooming capability
EP3682837B1 (en) 2014-03-17 2023-09-27 Intuitive Surgical Operations, Inc. System and method for breakaway clutching in an articulated arm
JP2017519562A (en) 2014-06-17 2017-07-20 ニューヴェイジヴ,インコーポレイテッド System and method for planning, performing, and evaluating spinal correction during surgery
EP3193768A4 (en) 2014-09-17 2018-05-09 Intuitive Surgical Operations, Inc. Systems and methods for utilizing augmented jacobian to control manipulator joint movement
WO2016088130A1 (en) 2014-12-04 2016-06-09 Mazor Robotics Ltd. Shaper for vertebral fixation rods
US20160166329A1 (en) 2014-12-15 2016-06-16 General Electric Company Tomographic imaging for interventional tool guidance
CN107645924B (en) 2015-04-15 2021-04-20 莫比乌斯成像公司 Integrated medical imaging and surgical robotic system
US10180404B2 (en) 2015-04-30 2019-01-15 Shimadzu Corporation X-ray analysis device
US20170143284A1 (en) 2015-11-25 2017-05-25 Carestream Health, Inc. Method to detect a retained surgical object
US10070939B2 (en) 2015-12-04 2018-09-11 Zaki G. Ibrahim Methods for performing minimally invasive transforaminal lumbar interbody fusion using guidance
JP6894441B2 (en) 2016-01-22 2021-06-30 ニューヴェイジヴ,インコーポレイテッド Systems and methods to facilitate spinal surgery
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US9962133B2 (en) 2016-03-09 2018-05-08 Medtronic Navigation, Inc. Transformable imaging system
US9931025B1 (en) 2016-09-30 2018-04-03 Auris Surgical Robotics, Inc. Automated calibration of endoscopes with pull wires
US10798339B2 (en) * 2017-06-14 2020-10-06 Roborep Inc. Telepresence management

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140357989A1 (en) * 2012-01-03 2014-12-04 Koninklijke Philips N.V. Position determining apparatus
EP3628263A1 (en) * 2018-09-27 2020-04-01 Koninklijke Philips N.V. Guidance in lung intervention procedures
US11269406B1 (en) * 2019-10-24 2022-03-08 Facebook Technologies, Llc Systems and methods for calibrating eye tracking

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230225797A1 (en) * 2022-01-18 2023-07-20 Stryker European Operations Limited Technique For Determining A Need For A Re-Registration Of A Patient Tracker

Also Published As

Publication number Publication date
US20230363827A1 (en) 2023-11-16
US12394086B2 (en) 2025-08-19

Similar Documents

Publication Publication Date Title
US20250228624A1 (en) Extended reality systems with three-dimensional visualizations of medical image scan slices
EP3711700B1 (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
CN113558762B (en) Registering a surgical tool with a reference array tracked by a camera of an augmented reality headset for assisted navigation during surgery
US11737831B2 (en) Surgical object tracking template generation for computer assisted navigation during surgical procedure
CN115211962B (en) Surgical system for computer-aided navigation during surgery
US12213745B2 (en) Extended reality systems for visualizing and controlling operating room equipment
JP2021194538A (en) Subject tracking and synthetic imaging in visible light surgery via reference seed
US12318150B2 (en) Camera tracking system for computer assisted surgery navigation
US12394086B2 (en) Accuracy check and automatic calibration of tracked instruments
CN110638526B (en) Method for adjusting virtual implants and associated surgical navigation system
JP7323489B2 (en) Systems and associated methods and apparatus for robotic guidance of a guided biopsy needle trajectory
US20240335240A1 (en) Camera tracking system identifying phantom markers during computer assisted surgery navigation
US20240164844A1 (en) Bone landmarks extraction by bone surface palpation using ball tip stylus for computer assisted surgery navigation
US20200297451A1 (en) System for robotic trajectory guidance for navigated biopsy needle, and related methods and devices
EP4595915A1 (en) Computer assisted pelvic surgery navigation
HK40027812A (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
HK40027812B (en) System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
HK40029958A (en) System for robotic trajectory guidance for navigated biopsy needle

Legal Events

Date Code Title Description
AS Assignment

Owner name: GLOBUS MEDICAL, INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOSHI, SANJAY M.;REEL/FRAME:059954/0789

Effective date: 20220510

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED