WO2025158390A1 - System and method for imaging and registration for navigation - Google Patents
System and method for imaging and registration for navigation
- Publication number
- WO2025158390A1 (PCT/IB2025/050834)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- image
- feature
- subject
- instrument
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/06—Measuring blood flow
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/12—Arrangements for detecting or locating foreign bodies
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
- A61B8/085—Clinical applications involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/488—Diagnostic techniques involving Doppler signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10081—Computed x-ray tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
- G06T2207/10136—3D ultrasound image
Definitions
- the subject disclosure is related generally to a tracking and navigation system, and particularly to imaging and registering selected coordinate systems.
- An instrument can be navigated relative to a subject for performing various procedures.
- the subject can include a patient on which a surgical procedure is being performed.
- an instrument can be tracked in an object or subject space.
- the subject space can be a patient space defined by a patient. The location of the instrument that is tracked can be displayed on a display device relative to an image of the patient.
- the position of the patient can be determined with a tracking system.
- a patient is registered to the image by tracking an instrument relative to the patient to generate a translation map between the subject or object space (e.g., patient space) and the image space. This may occur by identifying one or more points in the subject space and correlating them to, often identical, points in the image space.
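The point-correspondence registration described above can be sketched as a rigid (rotation plus translation) fit between matched point sets. The following is a minimal illustrative example, not the patented method itself; the Kabsch/SVD solution and the sample fiducial coordinates are assumptions for demonstration.

```python
# Hypothetical sketch: estimate a rigid transform (rotation + translation) that maps
# points identified in subject (patient) space to the corresponding points in image
# space, using the Kabsch/SVD method. Names and example data are illustrative only.
import numpy as np

def estimate_rigid_transform(subject_pts, image_pts):
    """Return (R, t) such that image_pts ~= subject_pts @ R.T + t."""
    subject_pts = np.asarray(subject_pts, dtype=float)
    image_pts = np.asarray(image_pts, dtype=float)
    c_s = subject_pts.mean(axis=0)          # centroid in subject space
    c_i = image_pts.mean(axis=0)            # centroid in image space
    H = (subject_pts - c_s).T @ (image_pts - c_i)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_i - R @ c_s
    return R, t

# Example: three fiducial points touched in patient space and picked in image space.
subject = [[0, 0, 0], [50, 0, 0], [0, 40, 10]]
image = [[12, 5, 3], [62, 5, 3], [12, 45, 13]]   # here just a pure translation
R, t = estimate_rigid_transform(subject, image)
print(np.round(R, 3), np.round(t, 3))
```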
- the position of the instrument can be appropriately displayed on the display device while tracking the instrument.
- the position of the instrument relative to the subject can be displayed as a graphical representation, sometimes referred to as an icon on the display device.
- an imaging system may be used to acquire image data of a subject.
- the imaging system may include an ultrasound imaging system that includes an ultrasound (US) probe that generally includes an ultrasound transducer to emit and receive ultrasound frequencies. It is understood, however, that the imaging system may include separate components that emit and receive ultrasound frequencies.
- Two or more images may be registered to one another.
- the two images may be intraoperative or pre- and intra-operative or any selected images. Further, more than two images may be registered.
- the images may be a real time image and a prior acquired image.
- the registration may be in real time to illustrate a current pose of a portion of the subject relative to a prior acquired image. Both the real time image and the prior acquired image may be selectively segmented.
- a current image which may be a real time image or any appropriate later acquired image, may be displayed relative to and/or superimposed on a prior acquired image.
- a segmented portion may be removed from the prior acquired image.
- the current image may be used to superimpose a current pose of the removed portion in the prior acquired image.
- the US probe may be operated to acquire real time or near real time image data.
- the image data may be used to generate images that are displayed on a display device.
- Prior and/or alternative image data may also be acquired and/or displayed as an image.
- the US probe images and the prior acquired images may be registered. The registration may allow for illustration of the sonograms to be displayed relative to the prior acquired images for various purposes, such as for inpainting, planned positions, pre-identified anatomical features, etc.
- the images may also include image data or images of instruments. Later acquired image data of a similar and/or identical position may be imaged and a position of an instrument may be illustrated in the new or current image. This may allow for an instrument to be positioned in an identical or nearly identical position at a later time after a first image data acquisition.
- FIG. 1 is a diagrammatic view illustrating an overview of a robotic system and a navigation system, according to various embodiments;
- FIG. 2 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 3 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 4A is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 4B is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 5 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 6 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 7 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 8 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 9 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 10 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 11 is a flowchart of a process to acquire image data and selectively generate a composite image of a subject, according to various embodiments;
- FIG. 12 is a diagrammatic view illustrating an overview of a surgical suite and display device, according to various embodiments;
- FIG. 13A is a diagrammatic view illustrating an overview of a surgical suite and an image acquisition system acquiring image data in a first orientation relative to a subject, according to various embodiments;
- FIG. 13B is a diagrammatic view illustrating an overview of a surgical suite and an image acquisition system acquiring image data in a second orientation relative to a subject, according to various embodiments;
- FIG. 14 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 15 is a schematic view of a display of an image of a selected portion of a subject, according to various embodiments;
- FIG. 16 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 17 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 18 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 19 is a schematic view of a display of a composite image of a selected portion of a subject, according to various embodiments;
- FIG. 20 is a diagrammatic view illustrating an overview of a surgical suite and an image acquisition system acquiring image data and vibration data relative to a subject, according to various embodiments.
- the subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the system and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects.
- the systems may be used to, for example, image and register coordinate systems between two systems for use on manufacturing systems, maintenance systems, and the like.
- automotive assembly may use one or more robotic systems including individual coordinate systems that may be registered together for coordinated or concerted actions. Accordingly, the exemplary illustration of a surgical procedure herein is not intended to limit the scope of the appended claims.
- a tracking system may be incorporated into a navigation system to allow tracking and navigation of one or more instruments (which may be the members) that may be tracked relative to the subject.
- the subject may also be tracked.
- the navigation system may include one or more tracking systems that track various portions, such as tracking devices, associated with instruments.
- the tracking system may include a localizer that is configured to determine the position of the tracking device in a navigation system coordinate system. Determination of the navigation system coordinate system may include those described at various references including U.S. Pat. No. 8,737,708; U.S. Pat. No. 9,737,235; U.S. Pat. No. 8,503,745; U.S. Pat. No.
- a localizer may be able to track an object within a volume relative to the subject.
- the navigation volume in which a device may be tracked, may include or be referred to as the navigation coordinate system or navigation space.
- a determination or correlation between two coordinate systems may allow for or also be referred to as a registration between two coordinate systems.
- the first coordinate system may be a robotic coordinate system and a second coordinate system may be a navigation coordinate system. Accordingly, coordinates in one coordinate system may then be transformed to a different or second coordinate system due to a registration.
- Registration may allow for the use of two coordinate systems and/or the switching between two coordinate systems. For example, during a procedure, a first coordinate system may be used for a first portion or a selected portion of a procedure and a second coordinate system may be used during a second portion of a procedure. Further, two coordinate systems may be used to perform or track a single portion of a procedure, such as for verification and/or collection of additional information.
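As a hedged illustration of switching between two registered coordinate systems, the registration can be treated as a 4x4 homogeneous transform whose inverse maps points in the opposite direction. The transform values below are invented for the example.

```python
# Hedged sketch: once a registration between a robot coordinate system and a navigation
# coordinate system is known as a 4x4 homogeneous transform T_nav_from_robot, a point
# expressed in either system can be transformed into the other. All names are illustrative.
import numpy as np

def to_homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed registration result: robot frame rotated 90 deg about z and offset from nav frame.
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
T_nav_from_robot = to_homogeneous(Rz, [100.0, 20.0, -5.0])
T_robot_from_nav = np.linalg.inv(T_nav_from_robot)   # switch direction of the mapping

p_robot = np.array([10.0, 0.0, 0.0, 1.0])             # a point known in robot coordinates
p_nav = T_nav_from_robot @ p_robot                     # same point in navigation coordinates
p_back = T_robot_from_nav @ p_nav                      # round-trip back to robot coordinates
print(p_nav[:3], p_back[:3])
```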
- images may be acquired of selected portions of a subject.
- the images may be displayed for viewing by a user, such as a surgeon.
- the images may have superimposed on a portion of the image a graphical representation of a tracked portion or member, such as an instrument.
- the graphical representation may be superimposed on the image at an appropriate position due to registration of an image space (also referred to as an image coordinate system) to a subject space.
- a method to register a subject space defined by a subject to an image space may include those disclosed in U.S. Pat. Nos. 8,737,708; 9,737,235; 8,503,745; and 8,175,681; all incorporated herein by reference.
- the first coordinate system may be registered to the subject space or subject coordinate system due to a selected procedure, such as imaging of the subject.
- the first coordinate system may be registered to the subject by imaging the subject with a fiducial portion that is fixed relative to the first member or system, such as a robotic system or other instrument.
- the known position of the fiducial relative to any portion, such as the robotic system or the subject may be used to register the subject space relative to any coordinate system in which the fiducial may be determined (e.g., by imaging or detecting (e.g., touching)).
- registration of a second coordinate system may allow for tracking of additional elements not fixed to a first portion, such as a robot that has a known coordinate system.
- the tracking of an instrument during a procedure allows for navigation of a procedure.
- the navigation may be used to determine a pose of one or more portions, such as an instrument.
- the pose may include any number of degrees of freedom, such as a three-dimensional location (e.g., x, y, z) and an orientation (e.g., yaw, pitch, and roll).
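A pose with six degrees of freedom is often packed into a single homogeneous transform. The sketch below assumes a Z-Y-X (yaw-pitch-roll) Euler convention, which is one common choice rather than anything mandated by the disclosure.

```python
# Illustrative sketch only: a pose with six degrees of freedom (x, y, z location plus
# yaw, pitch, roll orientation) represented as a single 4x4 homogeneous transform.
import numpy as np

def pose_to_matrix(x, y, z, yaw, pitch, roll):
    """Z-Y-X (yaw-pitch-roll) convention, angles in radians; one common choice."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx      # orientation part
    T[:3, 3] = [x, y, z]          # location part
    return T

print(np.round(pose_to_matrix(10, 20, 5, np.pi / 2, 0, 0), 3))
```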
- when image data is used to define an image space, it can be correlated or registered to a physical space defined by a subject, such as a patient.
- the patient defines a patient space in which an instrument can be tracked and navigated.
- the image space defined by the image data can be registered to the patient space defined by the patient.
- the registration can occur with the use of fiducials that can be identified in the image data and in the patient space.
- Fig. 1 is a diagrammatic view illustrating an overview of a procedure room or arena.
- the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures.
- the robotic system 20 may include a Mazor X™ robotic guidance system, sold by Medtronic, Inc.
- the robotic system 20 may be used to assist in guiding a selected instrument, such as drills, screws, etc. relative to a subject 30.
- the robotic system 20 may hold and/or move various instruments, such as an imaging system that may be an ultrasound (US) probe 33.
- the robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30.
- the robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44.
- the end effector may be any appropriate portion, such as a tube, guide, or passage member. Affixed to and/or in place of the end effector may be the imaging system that may be the US probe 33.
- the US probe 33 may be moved relative to a subject, such as by a user and/or with a robotic system.
- the robotic system may include an appropriate robotic system, such as a Mazor X™ Robotic Guidance System, sold by Mazor Robotics Ltd. having a place of business in Israel and/or Medtronic, Inc. having a place of business in Minnesota, USA and/or as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference.
- the US probe may be moved to achieve acquisition of selected image data.
- the end effector 44 may be moved relative to the base 38 with one or more motors.
- the position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20.
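The encoder-based pose determination can be illustrated by chaining one transform per joint from the base to the end effector. The two-joint planar geometry and link lengths below are purely illustrative assumptions, not the kinematics of the actual robotic system.

```python
# A minimal, hypothetical sketch of how joint encoder readings could be chained into a
# pose of the end effector relative to the robot base; the two-joint planar geometry
# here is an assumption for illustration, not the actual robot kinematics.
import numpy as np

def planar_link(angle_rad, length):
    """Homogeneous 2D transform for one revolute joint followed by a rigid link."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, length * c],
                     [s,  c, length * s],
                     [0,  0, 1.0]])

# Assumed encoder readings (radians) and link lengths (mm) for an elbow and a wrist joint.
T_base_to_effector = planar_link(np.deg2rad(30), 300.0) @ planar_link(np.deg2rad(-45), 250.0)
print("end effector position relative to base:", np.round(T_base_to_effector[:2, 2], 1))
```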
- a robotic processor module 53 may be used to control (e.g., execute instructions) to move and determine a pose of the end effector, such as relative to the base 34.
- the robotic system 20 including the various portions may be operated to move each portion relative to the base 34.
- the pose of the base 34 may be known in a coordinate system, such as the patient space of the patient 30 and/or the image coordinate system due to a registration as discussed above and exemplary disclosed in U.S. Pat. No.
- the navigation system 26 can be used to track the location of one or more tracking devices; the tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, a tool tracking device 66, and/or a US probe tracking device 81. All or one of the tracking devices may be generally referred to as a tracking device herein.
- US probe 33 may be used to acquire US image data of the subject 30 as may an imaging system 80, as discussed herein.
- a tool or moveable member 68 may be any appropriate tool such as a drill, forceps, or other tool operated by a user 72.
- the tool 68 may also include an implant, such as a spinal implant or orthopedic implant.
- the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc.
- the instruments may be used to navigate or map any region of the body.
- the navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
- the imaging device or system 80 may be an additional or alternative imaging system that may be used to acquire pre-, intra-, or post-operative or realtime image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject.
- the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA.
- the imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed.
- the imaging device 80 can include those disclosed in U.S. Pat. Nos.
- the imaging device 80 may include in addition or alternatively a fluoroscopic C-arm.
- Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc.
- Other appropriate imaging devices can also include MRI, CT, ultrasound, etc.
- the position of the imaging system 33, 80, and/or portions therein such as the image capturing portion can be precisely known relative to any other portion of the imaging device 33, 80.
- the imaging device 33, 80 can know and/or recall precise coordinates relative to a fixed or selected coordinate system.
- the robotic system 20 may know or determine its position and position the US probe 33 at a selected pose.
- the imaging system 80 may also position the imaging portions at a selected pose. This can allow the imaging system 80 to know its position relative to the patient 30 or other references.
- the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30.
- reference to the imaging system 33 may refer to any appropriate imaging system, unless stated otherwise.
- the US probe 33 as the imaging system is merely exemplary regarding the subject disclosure.
- the US probe 33 may emit a US wave in a plane and receive an echo relative to any portions engaged by the wave.
- the received echo at the US probe 33 or other appropriate receiver may be used to generate image data and may be used to generate a US image also referred to as a sonogram.
- the pose (e.g., distance from a selected portion of the US probe 33 and/or the tracking device 81) may be determined or predetermined and saved for recall with a calibration process and/or jig, such as that disclosed in U.S. Pat. Nos. 7,831,082; 8,320,653; and 9,138,204, all incorporated herein by reference.
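One way to picture the calibration result is as a fixed transform from the probe tracking device to the ultrasound image plane, chained with the tracked pose reported by the localizer. The sketch below uses made-up matrix values and an assumed pixel spacing.

```python
# Hedged sketch of the transform chain implied by a probe calibration: the tracking
# system reports the pose of the probe tracking device, and a calibration (determined
# once with a jig) gives the fixed offset from that tracking device to the ultrasound
# image plane. Matrix values here are made up for illustration.
import numpy as np

def homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Tracked pose of the probe's tracking device in navigation space (from the localizer).
T_nav_from_tracker = homogeneous(np.eye(3), [200.0, 50.0, 0.0])
# Calibration result: image-plane origin sits 30 mm below the tracking device.
T_tracker_from_image = homogeneous(np.eye(3), [0.0, 0.0, -30.0])

# A pixel in the ultrasound image, converted to mm in the image plane (assumed spacing).
pixel_uv, mm_per_pixel = np.array([128.0, 300.0]), 0.2
p_image = np.array([pixel_uv[0] * mm_per_pixel, pixel_uv[1] * mm_per_pixel, 0.0, 1.0])

# Chain the transforms to express that pixel in navigation coordinates.
p_nav = T_nav_from_tracker @ T_tracker_from_image @ p_image
print(np.round(p_nav[:3], 1))
```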
- the imaging device 80 can be tracked with a tracking device 62. Also, the tracking device 81 can be associated directly with the US probe 33. The US probe 33 may, therefore, be directly tracked with a navigation system as discussed herein. In addition or alternatively, the US probe 33 may be positioned and tracked with the robotic system 20. Regardless, image data defining an image space acquired of the patient 30 can, according to various embodiments, be registered (e.g., manually, inherently, or automatically) relative to an object space.
- the object space can be the space defined by a patient 30 in the navigation system 26.
- the patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58.
- the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration.
- registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data.
- a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84.
- An additional and/or alternative display device 84’ may also be present to display an image.
- Various tracking systems such as one including an optical localizer 88 or an electromagnetic (EM) localizer 92 can be used to track the instrument 68.
- More than one tracking system can be used to track the instrument 68 in the navigation system 26.
- these tracking systems can include an electromagnetic tracking (EM) system having the EM localizer 94, an optical tracking system having the optical localizer 88 and/or other appropriate tracking systems not illustrated such as an ultrasound tracking system, or other appropriate tracking systems.
- One or more of the tracking systems can be used to track selected tracking devices, as discussed herein, sequentially or simultaneously. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system.
- a tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
- the position of the patient 30 relative to the imaging device 33 can be determined by the navigation system 26.
- the position of the imaging system 33 may be determined, as discussed herein.
- the patient 30 can be tracked with the dynamic reference frame 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 33 can be determined.
- Image data acquired from the imaging system 33 can be acquired at and/or forwarded from an image device controller 96, that may include a processor module, to a navigation computer and/or processor system 102 that can be a part of a controller or work station 98 having the display 84 and a user interface 106.
- the processor system 102 may be a processor module, as discussed herein, including integral memory or a communication system to access external memory for executing instructions and/or operated as a specific integrated circuit (e.g., an ASIC). It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98.
- the work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data.
- the user interface 106 which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84.
- the work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
- the navigation system 26 can further include any one or more tracking system, such as the tracking system including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88.
- the tracking systems may include a controller and interface portion 110.
- the controller 110 can be connected to the processor portion 102, which can include a processor included within a computer.
- the EM tracking system may include the STEALTHSTATION® AxiEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Patent Application Serial No. 10/941,782, filed Sept.
- the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, which may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado.
- Other tracking systems include acoustic, radiation, radar, etc. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except when needed to clarify selected operation of the subject disclosure.
- Wired or physical connections can interconnect the tracking systems, imaging device 80, etc.
- various portions such as the instrument 68 may employ a wireless communications channel, such as that disclosed in U.S. Patent No. 6,474,341, entitled “Surgical Communication Power System,” issued November 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110.
- the tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
- the instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device.
- the instrument 68 can include a graspable or manipulable portion at a proximal end and the tracking devices may be fixed near the manipulable portion of the instrument 68.
- the navigation system 26 may be a hybrid system that includes components from various tracking systems.
- the navigation system 26 can be used to track the instrument 68 relative to the patient 30.
- the instrument 68 can be tracked with the tracking system, as discussed above.
- Image data of the patient 30, or an appropriate subject can be used to assist the user 72 in guiding the instrument 68.
- the image data which may include one or more image data or images, may be registered to the patient 30.
- the image data defines an image space that is registered to the patient space defined by the patient 30. The registration can be performed as discussed herein, automatically, manually, or combinations thereof.
- registration allows a translation map to be generated of the physical location of the instrument 68 relative to the image space of the image data.
- the translation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image 108.
- a graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
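Drawing the icon at the correct place amounts to pushing the tracked instrument tip through the translation map and converting millimeters to voxel indices. The registration offset and image spacing in this sketch are assumptions for illustration.

```python
# Illustrative sketch: with a registration (translation map) from patient space to image
# space in hand, the tracked tip of an instrument can be converted to image voxel indices
# so a graphical representation (icon) can be drawn at the right place. Values are assumed.
import numpy as np

T_image_from_patient = np.eye(4)
T_image_from_patient[:3, 3] = [120.0, 80.0, 40.0]      # assumed registration result (mm)
voxel_spacing = np.array([0.5, 0.5, 1.0])              # assumed image spacing (mm per voxel)

tip_patient = np.array([10.0, 25.0, 60.0, 1.0])        # tracked tip in patient space (mm)
tip_image_mm = (T_image_from_patient @ tip_patient)[:3]
tip_voxel = np.round(tip_image_mm / voxel_spacing).astype(int)
print("draw instrument icon at voxel index:", tip_voxel)
```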
- a subject registration system or method can use the tracking device 58.
- the tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly.
- the fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58.
- the fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy.
- the fiducial assembly 120 can be interconnected with a portion of a spine 126 such as a spinous process 130.
- the fixation portion 124 can be interconnected with a spinous process 130 in any appropriate manner.
- a pin or a screw can be driven into the spinous process 130.
- Fiducial portions may also include one or more portions of the subject that may be imaged, such as boney portions.
- the US probe 33 may be positioned relative to the subject 30, such as by the robotic system 20. As discussed herein, therefore, the robotic system 20 may move the US probe 33 to a selected position relative to the subject 30. According to various embodiments, the US probe 33 may be positioned relative to the subject in any appropriate manner. For example, the user 72 or a second user may move and/or hold the US probe 33.
- the ultrasound probe may emit or transmit ultrasound waves in a selected pattern or plane.
- the plane may be a shape as is understood by one skilled in the art.
- the plane is generally able to acquire data in a field of view to generate images, also referred to as sonograms when images are generated based on ultrasound data.
- the image data collection plane (304; Fig. 10) and/or its pose may be determined or known relative to the US tracking device 81, as discussed above.
- image data or images may be acquired of the subject 30.
- MRI image data and/or X-ray image data may be used to generate a soft tissue or vascular image 130.
- the vascular image 130 may be generated with one or more image data collected with a selected imaging system, including the imaging system 80 discussed above.
- the vascular image 130 may include various portions that may be segmented, such as the superior vena cava 134, and various branches thereof, one or more chambers of a heart 138 and other soft tissue portions, such as a kidney 142.
- the anatomical portions of the subject 30 may be segmented in the image 130 with any appropriate method or system.
- an edge detection may include a threshold brightness or contrast factor to determine edges and structures in the image.
- the segmented portions may be identified, such as by a human user, shape matching to a database, etc.
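A minimal sketch of the threshold-and-edge segmentation idea follows; the synthetic image, threshold values, and gradient-magnitude test are illustrative choices rather than the specific method used by the system.

```python
# A minimal sketch of threshold-plus-edge segmentation of the kind described above:
# bright structures are separated with an intensity threshold, and edges are taken where
# the local gradient magnitude is high. Pure numpy; the synthetic image is illustrative.
import numpy as np

img = np.zeros((64, 64))
img[20:44, 24:40] = 1.0                      # a bright rectangular "structure"
img += 0.05 * np.random.default_rng(0).standard_normal(img.shape)

mask = img > 0.5                             # threshold on brightness
gy, gx = np.gradient(img)
edges = np.hypot(gx, gy) > 0.25              # high-contrast edge pixels

print("segmented pixels:", int(mask.sum()), "edge pixels:", int(edges.sum()))
```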
- the segmented image may be used for various purposes, such as displaying on the display device 84 as the image 108.
- the image 130 may be used to develop a plan for a procedure, such as an ablation of a tumor, placement of an implant (e.g., stent), or other appropriate procedures. Accordingly, the image 130 may be used during a procedure to assist in a procedure on the subject 30. Further, the image 130 may be a first image, such as a pre-operative image.
- the user 72 may also select to acquire image data during the procedure.
- the US probe 33 may be used to acquire an ultrasound image or sonogram of the subject 30.
- a sonogram or ultrasound image 150 also illustrated in Fig. 2, may be acquired of the subject 30, at an appropriate time such as during a procedure.
- the sonogram 150 may be generated with image data acquired by the US probe 33.
- the pose of the US probe 33 may be tracked relative to the subject 30 with the US probe tracking device 81.
- the sonogram 150 may be registered to the first image 130 or otherwise acquired image data of the subject. The registration may occur due to various techniques, such as matching of the anatomy between the first image 130 and the sonogram image 150.
- the sonogram 150 may include doppler imaging that is able to generate doppler portions, including a first doppler portion 154 and a second doppler portion 158.
- the sonogram 150 may also be segmented, such as identifying the two doppler portions, 154, 158 in the sonogram 150. This may allow for identification or understanding of an anatomical structure, such as the superior vena cava 134, a chamber of the heart 138, or other appropriate anatomical portions. Thus, the segmented and identified portions of the image 130 may be compared and registered to the segmented and identified portions in the sonogram 150. Again, the segmented portions may be identified in an appropriate manner, such as manually by the user 72 or automatically such as by comparison to an atlas or prior acquired images and structure definitions.
- the registration of the two images 130, 150 may allow for a determination of a translation map between the two images 130, 150. This may then register the two images, 130, 150.
- the registration of the two images 130, 150 may allow for tracking of an instrument relative to the subject in substantially real time as the sonogram 150 may be generated in substantially real time of the subject 30.
- the first image 130 may be used to augment or assist in identifying features that are not imageable or clearly imaged in the sonogram 150.
- the image 130 may be an image based on preacquired image data and/or a predetermined image.
- the image 150 may be based on new or later acquired image data.
- Both of the images 130, 150 may be updated with data from the other.
- the image 130 may be updated with the later acquired image data from the image 150.
- One or either of the images 130, 150 may be displayed based on only the image data acquired for each and/or based on updated image data.
- images generated from image data may be two-dimensional (2D), three-dimensional (3D), four dimensional (4D, i.e., images over time), or combinations thereof.
- the registration may occur between any selected one of these as well.
- the number of points and/or other registration portions (e.g., features or structures) required or selected for registration is generally understood by one skilled in the art.
- the registration may be multimodal such that a plurality of types or data, features, etc. may be used to generate a registration between at least two images including the related image data.
- image data and related or generated images may be collected and displayed in any appropriate manner.
- image data may be collected and an image may be displayed based thereon in substantially real time.
- Real time may include a time difference between collection of image data and a display of an image based thereon of less than five seconds, including less than 4, 3, 2, 1 seconds or any appropriate increment of time including about 0.1 seconds to about 10 seconds.
- the US probe 33 may be positioned to image the subject and the image (e.g., sonogram) 150 displayed in less than five seconds from when the image data is collected.
- Real time may include a time differential which allows the user to understand the current condition of the subject even if the condition is changing, such as imaging a heart beat, flow of blood in vessels, etc.
- an augmented image may be generated at a same rate.
- preacquired image data may be inpainted onto a real time image or an augmented image may be generated including pre-acquired or alternative image data and real time image data.
- the sonogram 170 may be acquired with the US probe 33 and may be displayed on various display devices, such as the display device 84.
- the sonogram image 170 may include data, such as echo data regarding various structures, including structures within the subject 30.
- the structures in the sonogram 170 may have various characteristics, such as a selected resolution or artifact based upon the area being imaged with the US probe 33.
- the US probe 33 may also acquire doppler information. Doppler information may be used to identify areas of movement and relative velocities. Movement may be illustrated relative to the heart of the subject 30. Further, doppler information may be acquired regarding a vasculature of the subject 30. A vasculature of a subject may be used to identify structures in this subject and allow for registration or illustration relative to other image data. As illustrated in Fig. 3, the image 170 may include doppler information of movement of blood through vasculature.
- the doppler information may include a first doppler region 174 and a second doppler region 180. The two doppler regions may have boundaries that are segmented, such as a first boundary 178 of the first doppler region 174, and the second doppler region 180 may have a segmented second boundary 182.
- the various segmented boundaries 178, 182 may be segmented based upon identifying or determining a boundary of the doppler region.
- the boundary may, therefore, be automatically determined by the processor assembly or module executing instructions to identify a contrast or edge of the doppler region.
- the user 72 may also identify a boundary.
- the boundaries may be understood to be structural boundaries, including, for example, veins or arteries that define the vasculature of the subject 30.
- the boundaries of the vasculature of the subject may, therefore, be used to map to boundaries in other images, such as in the vasculature or anatomy image 130 discussed above.
- other image data may also be acquired and registered to two other images, such as preacquired or otherwise acquired image data. Therefore, the use of the doppler image data may also be used to define boundaries or structures for registration.
- the boundaries that are identified in the US image may be used for registration to other image data.
- the other image data may include MRI.
- the MRI and the real time US image may be registered, also in real time.
- image data or features in the MRI image may also be displayed relative to the real time US image.
- this augmented image may be displayed for the user 72, such as with the display 84.
- the user 72 may then view an augmented image with image data from a plurality of sources or image data types.
- the US image 170 may include other structural image data such as of the tumor or clot 186.
- the clot 186 may also be identified in other image data, such as in an arteriogram or angiogram.
- the various or multiple types of images or image data may be registered to one another. Therefore, the US probe 33 may be used to acquire substantially real time images of the subject 30 that may be registered to pre-acquired images or alternatively acquired images.
- the sonogram 200 may exemplary be of a gallbladder 204 that may be identified in the sonogram 200.
- the gallbladder 204 may be identified in any appropriate manner such as by the user 72 or substantially automatically, as discussed above.
- shadows or artifacts may appear.
- an artifact or shadow such as a first shadow boundary 208a and a second shadow boundary 208b may appear in the sonogram 200a.
- the shadow boundaries 208a, 208b may appear in the sonogram 200a due to gallstones that are present in the gallbladder 204a.
- Due to the gallstones in the anatomy, certain structures of the gallbladder that are "past" the gallstones may be in the shadow and thus be indiscernible or only mildly discernible.
- the shadows 208a, 208b may be present due to the structure and density of the gallstones relative to the other structures nearby and their density.
- the characteristics may be high impedance boundaries, such as the gallstones.
- the relative characteristics of the anatomy that is past the gallstones relative to a source of the US probe 33 that is generally at the top of the sonogram 200a may cause the shadows due to a difference in density.
- the shadows 208a’, 208b’ are graphically illustrated relative to two gallstones 210, 212.
- the graphical illustrations of the gallstones 210, 212 and the respective shadows 208a’, 208b’ are for clarity of the sonogram 200b.
- the two sonograms may be the same sonogram where the sonogram 200b has been enhanced with a graphical illustration (e.g., superimposed graphic) to illustrate the gallstones and to identify the shadows due to the presence of the gallstones in the gallbladder 204. This may be performed by identification of the gallstones in the sonogram, such as by shape and/or position.
- the shadows may be identified due to lack of clarity or image data.
- the sonogram may be registered to alternative image data, such as x-ray image data (e.g., CT image data) or MRI, to identify features in the sonogram based on the alternative image data.
- the sonogram 200a that may be acquired in substantially real time may include the artifacts or shadows 208a, 208b. Due to the shadowing effect of the gallstone, however, the image content or anatomical content in the shadows may be difficult to discern.
- an image 220 may be an MRI image of the subject 30.
- the MRI image 220 may include the gallbladder 204 that is identified as 204c in the MRI image 220.
- the gallbladder 204c in the MRI image 220 may be segmented according to various techniques, including those discussed above.
- the gallbladder in the sonogram 200a may also be segmented, as discussed above. Therefore, the two images including the sonogram 200a and the MRI image 220 may be registered and mapped to one another by identifying the same or similar features in both, such as the segmented gallbladder boundary.
- image data from the MRI image 220 may be used to enhance or augment the sonogram image 200a, such as by inpainting selected features.
- inpainting features into an image (e.g., a 2D or 3D image) or series of images may include adding additional features or clarity (e.g., sharpness).
- the inpainting may be a graphical illustration (e.g., adding generated pixels), including portions from alternative image data (e.g., inpainting into a sonogram an MRI portion), other appropriate clarifications, or combinations thereof.
- the boundary of the gallbladder may be clearly determined in the MRI data due to the lack of shadowing. This boundary may be inpainted into the sonogram to clarify the boundary of the gallbladder in the sonogram. This may be done in real time, such as in time or speed with generating the sonogram image with acquired US data.
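The inpainting step can be pictured as filling the shadowed pixels of the sonogram from the already-registered MRI-derived image while leaving the rest of the real-time data untouched. The function name, shadow mask, and blending factor below are illustrative assumptions.

```python
# Hedged sketch of the inpainting idea: pixels inside a shadow mask of the sonogram are
# filled from the already-registered MRI-derived image, while pixels outside the mask keep
# the real-time ultrasound data. Resampling the MRI into the sonogram grid is assumed done.
import numpy as np

def inpaint_from_registered(us_img, mri_img_resampled, shadow_mask, blend=1.0):
    """Replace (or blend) shadowed ultrasound pixels with co-registered MRI intensities."""
    out = us_img.astype(float).copy()
    out[shadow_mask] = (blend * mri_img_resampled[shadow_mask]
                        + (1.0 - blend) * out[shadow_mask])
    return out

us = np.random.default_rng(1).random((128, 128))   # stand-in for the real-time sonogram
mri = np.full((128, 128), 0.8)                     # stand-in for the registered MRI slice
shadow = np.zeros((128, 128), dtype=bool)
shadow[60:128, 50:70] = True                       # acoustic shadow distal to a gallstone
augmented = inpaint_from_registered(us, mri, shadow)
print("shadow pixels replaced:", int(shadow.sum()))
```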
- the augmented image 230 may include an augmented view that may include image data from the sonogram 200a and from the MRI image 220.
- the augmentation may include inpainting, as noted above. Therefore, the augmented image 230 may include a display or outline 234 of the gallbladder and a secondary or otherwise graphical illustration 236 (e.g., via inpainting) of another anatomical structure, such as a kidney, liver portion, or the like. Nevertheless, the augmented image 230 may illustrate image data or a combination of image data from the sonogram 200a and the MRI 220.
- the augmented image may include an image of portions that are not included in at least one of the images or image data, such as the shadow portions in the sonogram not including image data that is completed or clarified with the MRI data.
- the inpainted portions may include those noted above and/or additional structures such as at least one of a nerve root, vasculature, tissue boundary, or combinations thereof.
- the augmented image 230 can, therefore, be displayed on the display device, such as the display 84, 84’, for viewing by the user 72.
- the augmented image 230 may be prepared due to the registration of the real time US image 200a and the secondary or alternative image data, such as the MRI 220.
- the alternative image data may also be other sonogram image data acquired at a different orientation.
- the various orientation views may be registered together and allow for the inpainting or augmenting of views with other image data from other orientations.
- the registration may occur in any appropriate manner.
- anatomical features may be identified in the sonogram image, such as a vasculature.
- the same vasculature may be identified in the alternative image data (e.g., MRI image).
- This identification of the same vasculature in both image data may allow for a registration of the two image data.
- the identification of the vasculature in the MRI image data may be based on known atlas data, user-assisted segmentation, edge and shape detection, or other appropriate techniques.
- any appropriate portion may be identified for registration, such as boney portions, implants (e.g., screws or plates), or other selected features in the image data. That is, the real time image may be an augmented image including both substantially real time image data and prior acquired image data.
- the US image 250 may include various data, such as a portion of a hard tissue, such as an ulna bone 254.
- the US image 250 may include image data that may be segmented, such as to define a boundary or identify a boundary 256 of the bone 254.
- the US image 250 may include or be able to identify various features of the bone 254, such as a fracture or break 260.
- the break 260 may be used or assist in identifying or performing a procedure on the subject 30.
- an implant may be positioned to fix the bone 254 in a selected configuration, as is understood in the art.
- a plate, screw, or the like may be used to fix and hold the bone 254 in position during a procedure.
- the US image 250 may further include various additional information, such as doppler information or regions 264.
- the doppler portion 264 may illustrate flow of a liquid, such as blood, relative to a selected region of the subject, such as near the bone 254.
- the doppler information may be useful for various purposes, as is understood by one skilled in the art.
- the alternative image 270 may include an x-ray image of the subject 30, such as of the same ulna bone 254.
- the x-ray image 270 may include the bone 254 referred to as 254’.
- the fracture 260 may also be illustrated in the image 270 and referred to as 260’.
- the x-ray image 270 therefore, may be used to illustrate selected portions of the bone 254.
- the x-ray image 270 may allow for reviewing or identifying a clearer edge or boundary at the bone 254.
- the image 270 may assist in understanding the image 250.
- the US image 250 may be augmented with the image 270, or vice versa.
- the bone 254 appears in both of the images and it may be used, such as by defining or segmenting edges or boundaries thereof, to register the US image 250 with the x-ray image 270. Therefore, information included in one image may be mapped to an appropriate pose in the second image to assist the user 72 in performing a procedure on the subject 30.
- the information may include feature edges, identified targets (e.g., tumor or ablation position), etc.
- the US image 250 may be a real time image that includes various information, such as the doppler information.
- the x-ray image 270 may have additional information, such as a sharper or more specific identification or segmentation of a boundary of the bone 254, that may be more readily understood therein as opposed to in the US image 250.
- Registration may also occur according to identification of various features that are not anatomical features in the subject 30.
- registration may be based on a previously positioned fiducial member or implant 272, which may be any appropriate implant such as a plate, screw, fiducial implant, or the like.
- the implant 272 will have a known geometry including known materials, known dimensions, and the like.
- the known features of the implant 272 may also be referred to as known components.
- the implant 272 may be identified in a plurality of image data and used to assist in registration.
- the implant 272 may be identifiable in the x-ray image 270 and identifiable in the US image 250.
- the implant 272 may have known components that may be stored and saved for recall, such as during a registration.
- the geometry of the implant 272, once determined, may be identified or confirmed in the plurality of images.
- the geometry of the implant that may be identified based upon the prior known components may be used to assist in performing a registration and/or confirming a registration.
- an augmented image 280 may be generated.
- the augmented image 280 may include a graphical representation and/or overlay of the segmented bone 254’ represented as 254” in the image 280.
- the augmented image 280 may include the doppler information 264 that may be overlaid on the image or in the image 280 and/or be represented as a graphical illustration thereof.
- the augmented image 280 may include information from both the US image 250 and the x-ray image 270. Any of the images, including all of the images or any selected number thereof, may be displayed on the display 84.
- the implant 272 may also be positioned in the composite or augmented image 280.
- the composite or augmented image 280 may include the implant 272 to again confirm or assist in performing a registration or identifying information to the user 72 for performing a procedure.
- the position of the implant 272 may assist in a current procedure based upon a prior procedure that included or positioned the implant 272.
- the augmented image may include an inpainted representation of the implant 272 and/or improved clarity thereof via inpainting. The inpainting may be assisted with the predetermined known components of the implant, as noted above.
- the user may be able to perform a procedure with the image information from a plurality of the image data acquisition techniques.
- the plurality of images may be registered to one another.
- the registration may occur at least by identifying features in each of the images, such as anatomical portions and boundaries thereof, to assist in the registration.
- the features may be determined automatically such as by image segmentation and identifying the segmented portions (e.g., by comparison to a predetermined database).
- the identified features may also include at least some manual input, such as the user identifying a seed point for segmentation and/or an identification of one or more segmented portions. Therefore, the augmented image 280, or any appropriate augmented image, may be displayed alone or along with one or more other images to illustrate the anatomical portion of the subject for viewing by the user 72.
- the augmented or compound image 280 may include any appropriate image, such as an image 300.
- the compound image may include information from the non-US image, such as types of tissue therein.
- the non-US image may include an MRI image 130.
- the MRI image may have identified therein or segmented therein various tissue types, such as fat or adipose tissue, cancellous bone tissue, muscle tissue, or other appropriate tissues.
- the US image may also be augmented based upon this predetermined information. For example, the speed of ultrasound through adipose tissue may differ from that through muscle tissue.
- the position of various features or portions in the US image may be augmented or determined based upon the type of tissue through which the ultrasound signal is traveling.
- the pre-acquired image including the segmented portions such as the adipose and muscle tissue may be used to inform the rendering of the US image that is displayed for the user.
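- as a minimal sketch of the speed-of-sound consideration noted above, the following illustrates how a depth displayed by a scanner that assumes a single propagation speed may be offset when the registered, pre-segmented image indicates layered adipose and muscle tissue along the beam path. The speed values are nominal textbook figures used only for illustration and are not calibration data.

```python
ASSUMED_SPEED = 1.54  # mm/us; a common single-speed assumption used by scanners

# Nominal speeds of sound (mm/us); illustrative values only.
TISSUE_SPEED = {"adipose": 1.45, "muscle": 1.58}

def apparent_depth(layers):
    """Given tissue layers (name, true_thickness_mm) along the beam path, taken from
    the registered prior image, return the depth the scanner would display for an
    echo returning from the bottom of the last layer."""
    one_way_time_us = sum(thickness / TISSUE_SPEED[name] for name, thickness in layers)
    return one_way_time_us * ASSUMED_SPEED

true_depth_mm = 30.0 + 20.0                         # 30 mm adipose over 20 mm muscle
shown_mm = apparent_depth([("adipose", 30.0), ("muscle", 20.0)])
offset_mm = shown_mm - true_depth_mm                # positional error that the prior
                                                    # segmentation can help correct
```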
- the illustration may be exemplary of various displays.
- the illustration of Fig. 10 may include an augmented reality display where the user 72 is viewing the subject 30 through an augmented reality display system 302 such as that disclosed in US Pat. No. 11,839,433 or U.S. Pat. App. Pub. No. 2019/0175059, all incorporated herein by reference, and/or other augmented reality display systems such as the Hololens® 2 sold by Microsoft Corporation having a place of business in Washington USA.
- the display 300 may include at least portions that are illustrated on the display device 84 such as at least an ultrasound image, alternative image such as an MRI image, or an augmented or composite image as discussed above.
- the user 72 may hold the US probe 33 relative to the subject 30.
- the US probe 33 may emit an ultrasound signal in a plane represented by plane 304.
- the plane 304 may be used to generate an image that may also be illustrated by the image 304, such as on the display 84 or with an augmented reality or virtual reality system, as noted above.
- image information in addition or alternative to that generated with the US probe 33 may include other image information which may also be used to augment or be displayed separately or additively, such as the image portion 308.
- the image portion 308 may include various features such as a bone portion 310, a soft tissue portion, such as a portion of a heart 312, or other predefined features that may include a target, such as a tumor or lesion 316.
- the augmented image or composite image may include the tumor 316 that may be predetermined in image data, such as image data acquired with the subject 30 prior to a procedure.
- the tumor 316 may be displayed relative to the ultrasound image, such as on a respective display due to registration of the pre-acquired image with the substantially real time image, such as generated with the US probe 33. Therefore, the tumor 316 may be illustrated with the pre-acquired image and/or with the image generated with the US probe 33.
- the tumor 316 may be imaged with the US probe 33 due to properties thereof.
- the display 300 may be an augmented reality display where the user views the image portions including the US image 304 and the other image portions 308 separately, together, or as a composite image of the subject 30.
- the composite image may include graphical representations or graphical renderings that are displayed (e.g., superimposed) on either or both of the image portions, such as the ultrasound image. Therefore, the image viewed by the user 72 may include information from a plurality of image acquisition systems or modalities to provide an optimal or selected view to the user 72. Further, the various images may be registered to one another, as discussed above.
- the ultrasound image, or any appropriate image may be registered to the subject 30, as also discussed above.
- the US probe 33 may be tracked with the tracking device 81.
- the subject 30 may be tracked with an appropriate tracking device such as the subject tracking device 58. Therefore, the pose of the subject may be tracked with the select tracking system and the pose of the US probe 33 may be tracked with the select tracking system. This allows the pose of the image generated with the US probe 33, such as with the US plane 304, to be tracked relative to the subject via the tracker 58. As also noted above, the US plane 304 may be calibrated relative to the US tracker 81 such that the pose of the plane 304 may be known or determined relative to the subject 30 during a procedure.
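- the tracking and calibration chain described above may be illustrated as a composition of homogeneous transforms that maps a pixel in the US image plane into the subject coordinate frame. The transform names, frames, and values below are hypothetical and serve only as a sketch of one possible implementation.

```python
import numpy as np

def compose(*transforms):
    """Multiply 4x4 homogeneous transforms from left to right."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def us_pixel_to_subject(pixel_uv, T_subject_from_tracker, T_tracker_from_probe,
                        T_probe_from_image, pixel_spacing_mm):
    """Map a 2D pixel of the US image plane into subject coordinates.

    T_subject_from_tracker : pose of the probe tracker (e.g., tracker 81) relative to
                             the subject tracker (e.g., tracker 58), from the tracking system.
    T_tracker_from_probe   : fixed mounting of the tracker on the probe housing.
    T_probe_from_image     : calibration of the image plane to the probe housing.
    All are 4x4 homogeneous matrices; the names are illustrative only."""
    u, v = pixel_uv
    # Image pixels lie in the x-y plane of the image frame; z = 0 on the scan plane.
    p_image = np.array([u * pixel_spacing_mm, v * pixel_spacing_mm, 0.0, 1.0])
    T = compose(T_subject_from_tracker, T_tracker_from_probe, T_probe_from_image)
    return (T @ p_image)[:3]
```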
- images of the subject 30 and/or portions relative there to may be imaged with the US probe 33 substantially in real time.
- image portions may also be known relative to the subject in the tracking system. Therefore, the instrument 68 may be moved relative to the subject 30.
- the instrument 68 may be tracked with an appropriate tracking device, such as the tracking device 66.
- the instrument 68 may include a selected portion, such as the tip or extending portion 320 that may be known due to its imaging with the US probe 33 and/or geometry relative to a handle 322.
- the display may include the composite image or portion 300 to display other features.
- a predetermined anatomical or warning zone 330 may be identified.
- the anatomical warning zone 330 may be displayed on the display for the user 72 during the procedure.
- the predetermined zone 330 may be identified in any appropriate image, such as in the prior acquired MRI image.
- the predetermined zone may be displayed with, or a portion thereof superimposed on, a real time image, such as the US image 304, to assist in the procedure.
- the user 72 may understand the zone 330 to be a sensitive anatomical zone that should be avoided, near which cautious motion should be taken, or relative to which specific instruments may be needed to perform a procedure.
- a predetermined zone 330 may also be displayed on the display, such as the display 84 or an augmented reality display that may be displayed with the augmented reality system 302.
- the pre-acquired image may be used to identify a selected feature, such as the tumor 316.
- the tumor 316 may also be identified including a first or initial boundary 316a in the real time image that may be acquired with the US probe 33.
- the real time image may be used to identify in substantially real time a current boundary of the tumor 316 such as a boundary 316b. Therefore, a progression of a procedure may be tracked with real time imaging, such as with the US probe 33. Due to the registration of the image acquired with the US probe 33 and the preacquired image that may be displayed, the procedure may be visualized substantially in real time.
- a size of the tumor, an ablation portion, or other feature may be illustrated on the display, such as the display 84 or other appropriate display system.
- an implant 272 may be positioned in the patient, and a secondary or additional implant may also be positioned.
- a prior acquired image may include a visualization of the fracture 260’ as illustrated in the image 270, while the real time image, such as the US image, may illustrate the positioning of a selected member, such as an implant including a plate or a screw.
- various graphical representations can be made of features on the display or image portion 300. These may include graphic representations, such as real portions, segmented image portions superimposed on other images, graphical representations based upon image data, or other appropriate representations such as of the tumor 316 or selected anatomical region 330.
- various other information may also be provided to the user.
- the display portion or image 300 may include numerical or character data portion 340.
- the numerical or character data may identify various information, such as the distance from the instrument tip 320 to the selected anatomical region 330. As illustrated in Fig. 10, the “vessel at 3mm” may represent the distance between the tip 320 and the selected anatomical region 330. Therefore, the user may be provided various types of information with the display 300.
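- a readout such as “vessel at 3mm” could, in one illustrative implementation, be computed as the minimum Euclidean distance from the tracked tip position to a sampled surface of the selected anatomical region 330. The coordinate values below are hypothetical.

```python
import numpy as np

def distance_to_zone(tip_xyz, zone_points):
    """Minimum distance (same units as the inputs, e.g., mm) from the tracked
    instrument tip to a sampled surface of the warning zone."""
    diffs = np.asarray(zone_points, dtype=float) - np.asarray(tip_xyz, dtype=float)
    return float(np.min(np.linalg.norm(diffs, axis=1)))

# Hypothetical tip position and vessel surface samples (mm).
tip = [12.0, 40.5, -8.0]
vessel_surface = [[12.0, 43.5, -8.0], [13.0, 44.0, -7.0], [11.0, 44.5, -9.0]]
print(f"vessel at {distance_to_zone(tip, vessel_surface):.0f}mm")  # -> vessel at 3mm
```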
- a selected display 300 may be provided to the user 72.
- the display may be the display panel or monitor 84 that is positioned away from the subject 30.
- the display may also include an augmented reality display, such as projected with an augmented reality system 302. Therefore, the subject 30 may be viewable through the augmented reality system 302 or directly by the user 72 while also viewing or switching attention between the display 84 and the subject 30.
- image data may be provided to the user 72, such as real time image data, prior acquired image data, or composite images.
- the composite image may include various types of image data, such as US image data generated by the US probe 33 having various portions, such as inpainted or additional graphical representations generated from prior acquired image data included with the real time US image.
- a prior acquired image may be augmented with US image data that may be collected in real time.
- the ability to generate the composite image and/or display the image relative to the subject 30 may be based upon the tracking of the US probe 33 with the US probe tracking device 81.
- the various images may be registered to each other, as discussed above. The registration may allow for inpainting and/or augmenting one or more of the images based upon the other image data from a registered image. Therefore, the user 72 may be provided image data and/or additional data to assist in performing a procedure according to various embodiments.
- the registration of the ultrasound image generated with US probe 33 to other image data may assist in determining a selected (e.g., optimal) position to position the US probe 33 to acquire selected image data, including optimal image data.
- the prior acquired image such as an MRI or CT image may be registered to the subject 30.
- the known imageable region of the US probe based upon the calibration of the US plane and image portion 304 to the US probe 33 allows for a selection or determination of an optimal image acquired with the US probe 33.
- the US tracker 81 may be used to track a pose of the US probe 33 to direct or determine a selected or optimal pose of the US probe 33 relative to the subject 30.
- the selected or optimal pose of the US probe 33 may assist in acquiring a selected image of the subject 30 for performing a procedure.
- a rib 308 may be between the US probe 33 and a selected portion of the subject, such as a vasculature or organ (e.g., the heart).
- the navigation system and/or other appropriate processor system module may be useful in determining a proposed or selected pose of the US probe 33 and/or may direct movement of the US probe 33 to achieve a selected pose.
- the rib 308 may include a first rib 308a and a second rib 308b and the selected pose of the US probe 33 may be between the two ribs 308a, 308b. Further, it may be selected to have the US probe positioned at a selected angle relative to the subject 30.
- the determination of the selected pose of the US probe 33 may be predetermined based upon the prior acquired image.
- the instructions provided or determined for movement of the US probe 33 may also be predetermined and saved for use during the procedure. This may allow a procedure to be performed efficiently with a predetermined positioning of the US probe 33. Further, based upon the prior acquired image, the predetermined pose of the US probe 33 for the procedure may be different from or contrary to a standard positioning of the US probe 33. Therefore, instructions may be provided to assist in positioning the US probe 33 during a procedure.
- the prior acquired or predetermined positioning may also be used to instruct the robotic system 20 to move the US probe 33 to a selected positioning for imaging the subject 30.
- the US probe 33 may be used to acquire substantially real time image data of the subject 30. Therefore, positioning of the US probe 33 relative to the subject to acquire selected or optimal image data may allow for a substantially efficient procedure and/or viewing of a real time portion of the subject 30. Further, the US image data may be analyzed and registered to any prior acquired image. The registration of the two images may assist in determining or planning a procedure. Further, the registration and comparison may assist in determining whether additional image data may be required or useful in determining or acquiring image data to assist in the procedure and/or confirming a procedure, or other appropriate purposes. Various image based metrics can be established to estimate the quality of the registration and determine if additional data may be needed to improve upon the registration. For example, similarity measures between two or more images or image data may include Euclidean Distance or Normalized Cross-Correlation as metrics for registration quality and/or confirmation.
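- as a sketch of the registration-quality metrics mentioned above, normalized cross-correlation between a US patch and the corresponding resampled patch of the registered prior image may be computed as follows. The acceptance threshold is a placeholder assumption, not a validated value.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """NCC between two equally sized image arrays; ranges from -1 to 1, with values
    near 1 suggesting good agreement between the registered images."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def registration_acceptable(us_patch, prior_patch, threshold=0.6):
    """Illustrative check; the threshold would need to be established empirically."""
    return normalized_cross_correlation(us_patch, prior_patch) >= threshold
```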
- the systems described above may be used for various purposes, such as performing a procedure on the subject.
- the image data acquired with the various systems may be registered to one another and displayed, as discussed above, to assist in the procedure.
- a process or procedure 350, as illustrated in Fig. 11, may be used to acquire and select image data for displaying for the user 72.
- the process 350 may begin in Start block 352.
- the process may enter a first sub-phase of process 356 that may relate to acquiring a first image data in block 358 of the subject 30.
- the first image data may be a pre-acquired image data, such as acquired prior to a procedure, or acquired at any appropriate time.
- the first image data may be an image data acquired at any appropriate time and may be referred to as first image data other than a second image data, as discussed herein.
- the first image data therefore, may be acquired in substantially real time such as with the imaging system 80.
- the first image data may also be acquired at an appropriate time such as with an MRI system, a CT system, or the like.
- the image data that is acquired as the first image data in block 358 may be two-dimensional image data, three-dimensional image data, or image data that is acquired over time.
- the image data acquired with the subject 30 may be acquired for various purposes, such as assisting in planning a procedure on the subject, or identifying features in the subject, such as anatomical structures or targets (e.g., tumors).
- the acquired first image data may be segmented in block 362. Segmenting the image data in block 362 may be performed in an appropriate manner. For example, the image data may be segmented to identify various anatomical structures in the image data. Further, the image data may be segmented to identify various features or elements, such as fiducials, prior positioned implants, targets for a procedure (e.g., a tumor for ablation) or other selected features. Segmentation may occur in an appropriate manner such as through threshold techniques to identify edges or define edges in the image data. Other appropriate segmentation techniques, however, may also be used such as clustering methods, region growing, machine learning (e.g., deep learning) methods, or combinations thereof.
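- a minimal sketch of threshold-based segmentation and boundary extraction is shown below; a real system may instead use the clustering, region growing, or machine learning methods noted above, and the helper names and array handling are illustrative assumptions.

```python
import numpy as np

def threshold_segment(image, threshold):
    """Binary mask of pixels at or above an intensity threshold, e.g., to isolate
    bright bone-surface echoes in a B-mode frame."""
    return np.asarray(image, dtype=float) >= threshold

def boundary_mask(mask):
    """Pixels inside the mask with at least one 4-neighbor outside it, approximating
    a segmented boundary (such as boundary 256 of the bone 254)."""
    m = np.asarray(mask, dtype=bool)
    padded = np.pad(m, 1, constant_values=False)
    all_neighbors_inside = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                            padded[1:-1, :-2] & padded[1:-1, 2:])
    return m & ~all_neighbors_inside
```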
- Features may also be identified in the first image data in block 366.
- the identified features may include at least one or more of the segmented features from block 362, if performed. It is understood, however, that the user may define a feature in an image data without segmentation being performed in block 362.
- the user 72 may view an image and identify and define a feature, such as a fiducial, bone fracture, anatomical element, or the like without segmentation.
- the user 72 may define an edge or a boundary of the identified feature in the image data in block 366 in a substantially manual process.
- An automatic process may also be used to identify features in the first image data.
- the automatic process may include a system comparing an image to a database of images, or at least image features to identify portions in the first image data.
- a bone portion, a fiducial, an implant, a standard tumor model, or the like may be used to identify segmented features in the image data in this substantially automatic process. Therefore, the first image data may have identified therein in a substantially automatic manner various features such as fiducials, anatomical features, or the like. Further the first image data may have features identified therein in both the manual and automatic manner or a combination thereof.
- the user 72 may define at least a portion of a selected feature and the system may complete or augment the manual determination of a feature with the above noted processes. Accordingly, features may be identified in the image data for various purposes, such as assisting in performing a procedure, identifying fiducial portions for later registration, or the like.
- the process 350 may include also acquiring a second image data at a selected pose in block 370.
- the selection of acquiring a second image data in block 370 may be performed based upon the first image data and/or features identified in the first image data.
- a feature may include a tumor in the first image data.
- a second image data may be acquired.
- the pose to acquire the second image data may be based upon acquiring the second image data with the US probe 33.
- a selection of a pose for acquiring the second image data may include a position and/or angle relative to the subject 30 to acquire an ultrasound image. The selection may be based upon acquiring a selected or desired image which may be an optimal image for assisting a selected procedure.
- the selected pose in block 370 may include an anterior or posterior positioning of the US probe, positioning the probe such that the transducer or the image plane is between selected bone structures, or other selected image acquisition poses.
- the selected pose may be stored for later recall, such as in the memory system.
- the pose selection may be based upon a determination of a plane projected with the US probe 33 for acquiring image data and ensuring that the selected anatomical feature or identified feature from block 366 is within the plane.
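- the check that an identified feature lies within the plane projected by the US probe may be sketched geometrically as follows. The plane is described by an origin, a unit normal, in-plane unit axes, and a field-of-view extent; the parameter names and the slice-thickness tolerance are assumptions for illustration.

```python
import numpy as np

def feature_in_us_plane(feature_xyz, plane_origin, plane_normal, axial_dir, lateral_dir,
                        depth_mm, width_mm, thickness_mm=3.0):
    """Return True if a feature point (subject coordinates, mm) lies within the imaged
    plane: close to the plane along its normal and inside the depth/width extent.
    plane_normal, axial_dir, and lateral_dir are assumed to be unit vectors;
    thickness_mm approximates the elevational slice thickness (illustrative)."""
    p = np.asarray(feature_xyz, dtype=float) - np.asarray(plane_origin, dtype=float)
    off_plane = abs(p @ np.asarray(plane_normal, dtype=float))
    axial = p @ np.asarray(axial_dir, dtype=float)      # depth from the transducer face
    lateral = p @ np.asarray(lateral_dir, dtype=float)  # sideways position in the image
    return (off_plane <= thickness_mm / 2.0
            and 0.0 <= axial <= depth_mm
            and abs(lateral) <= width_mm / 2.0)
```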
- the second image data may be acquired in an acquisition subprocess 380.
- the second image data may be acquired after selecting a pose in block 370 and/or at any appropriate pose, such as the one selected by the user 72. Accordingly, the second image data may be acquired in the sub-process 380, as discussed herein.
- the second image data may be acquired in block 384.
- the acquisition of the second image data may be with the US probe 33, as discussed above. In acquiring the image data with the US probe 33, it may be moved such that the US plane 304 moves relative to the subject 30.
- the image data may be acquired with the US probe 33 and forwarded or processed within the appropriate processing system, including those discussed above.
- the second image data may be segmented in block 388.
- the image data may be segmented in a manner similar to the first image data.
- the US image data may also include doppler information to allow for segmenting portions of the anatomy based upon the doppler information. Other segmentation may also occur, such as edge detection or shape detection. Accordingly, the second image data may be segmented in any appropriate manner, as discussed above in block 388.
- a feature may be identified in block 392.
- the feature identified in block 392 may be the same feature identified in block 366.
- the feature may allow for performing a procedure, for example if the feature is identified as the tumor 316.
- the feature may also assist in registration if the feature includes an anatomical fiducial landmark, an implanted fiducial, a landmark, or other appropriate feature. Further, the identification may be performed manually, automatically, or a combination thereof, as also noted above.
- the first image data and the second image data may then be registered to one another in block 400.
- the registration of the first image data to the second image data, or vice versa allows for various displays or analysis or processing, as discussed above. Therefore, the first and second image data may be registered in block 400, according to any appropriate process, including those discussed above.
- the image data may then be analyzed in block 404.
- the analysis may include determining whether the feature is imaged appropriately in block 404.
- the feature may be identified in block 392 and may be registered to the first image data in block 400. Therefore, an analysis of whether the feature is imaged appropriately may be made in block 404.
- the analysis may include whether the feature is imaged with enough clarity or with enough information, in its full dimension or an appropriate dimension thereof, at a selected angle, or with enough contrast or spatial resolution to conduct or confirm the procedure, or other appropriate analysis determinations or combinations thereof.
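- one illustrative form of the block 404 analysis could combine simple checks such as coverage of the registered feature by the US field of view and contrast of the feature against its background; the thresholds below are placeholders and not clinically validated.

```python
import numpy as np

def feature_imaged_appropriately(image, feature_mask, fov_mask,
                                 min_coverage=0.95, min_contrast=0.2):
    """Illustrative imaging-adequacy check.

    feature_mask: pixels where the registered feature (e.g., tumor 316) is expected.
    fov_mask:     pixels actually covered by the US field of view."""
    img = np.asarray(image, dtype=float)
    feat = np.asarray(feature_mask, dtype=bool)
    fov = np.asarray(fov_mask, dtype=bool)

    coverage = (feat & fov).sum() / max(feat.sum(), 1)
    inside, outside = img[feat & fov], img[~feat & fov]
    if inside.size == 0 or outside.size == 0:
        return False
    span = img[fov].max() - img[fov].min() + 1e-9
    contrast = abs(inside.mean() - outside.mean()) / span
    return coverage >= min_coverage and contrast >= min_contrast
```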
- if the feature is not determined to be imaged appropriately in block 404, a NO path 412 may be followed.
- the NO path may include or move to recalling a selected pose to acquire the second image data in block 416.
- the selected pose may be based upon the determination block 370.
- the pose of acquiring the second image data may be saved for later recall. Accordingly, if the acquisition of the second image data in block 380 does not allow for a determination that the feature is imaged appropriately in block 404, a recall of a selected pose may be made in block 416. The recalled pose may then be output in block 418.
- Outputting the recalled pose in block 418 may include providing a display on the display device, such as the display 84, to instruct the user 72 on a possible positioning or movement of the US probe 33.
- the information may include an absolute position, such as a pose or angle of the US probe or a possible movement of the probe (e.g., “move superiorly three centimeters”).
- the output may further include or alternatively include instructions to the robotic system 20. As discussed above, the robotic system 20 may engage or have the US probe 33 at the end effector. Therefore, the robotic system 20 may move the US probe to the selected pose based upon the output from block 416.
- the robotic system 20 may move the US probe 33 to a selected pose.
- the robotic system 20 may move the US probe in the navigation coordinate system due to the registration of the robotic system coordinate system to the subject coordinate system and the registration of the first and second image data, in a first instance.
- the tracking system may be used to determine a current pose of the US probe 33 and an updated or possibly updated pose of the US probe 33. Therefore, the registration of the various coordinate systems may allow for the robotic system 20 to move the US probe 33 to the output recalled pose in block 418.
- the process 350 may then loop to allow for the acquisition of second image data again in the sub-process 380.
- the loop or process may allow for the acquisition of an appropriate or selected image data such that a feature may be appropriately imaged.
- the appropriate imaging may allow for an appropriate amount of data or required amount of data to assist in performing a procedure or in generating a selected image.
- if the feature is determined to be imaged appropriately in block 404, a YES path 430 may be followed.
- the YES path may be followed to generate a composite image, optionally, in block 434.
- the composite image may be generated with information from at least two image data acquisitions and/or other appropriate information, such as analysis information.
- the display such as the display 300, may be a composite image that includes entirely generated graphics (e.g., computer generated image portions).
- the generated image portions may include anatomical features, selected targets or information regarding these targets, or other appropriate information.
- the tumor 316 may be graphically generated and displayed relative to other anatomical portions that are also displayed.
- one type of composite information that can be included is inpainting into the first image data.
- the first image data may include an MRI while the second image data may include the US image data.
- Information, such as real time doppler information or segmentation information, may be inpainted into the first image data.
- Inpainting may refer to adding information to an image, such as “cut and pasting” from one image to another, adding a graphical representation based on additional information, etc. Thus, inpainting may refer to information added to an image that is not present in the raw or base image data of the image. In a similar manner, image information may be inpainted into the second image.
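- the “cut and pasting” form of inpainting noted above can be sketched as copying a segmented region of one registered image into the other through a pixel mapping derived from the registration; nearest-neighbor sampling is used for brevity and the function names are hypothetical.

```python
import numpy as np

def inpaint_region(target, source, source_mask, map_target_to_source):
    """Copy a segmented region of `source` (e.g., doppler-segmented vasculature in the
    US image) into `target` (e.g., a registered MRI slice).

    map_target_to_source(row, col) -> (row, col) applies the registration mapping
    from target pixel coordinates to source pixel coordinates."""
    out = np.array(target, dtype=float, copy=True)
    src = np.asarray(source, dtype=float)
    msk = np.asarray(source_mask, dtype=bool)
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            sr, sc = map_target_to_source(r, c)
            sr, sc = int(round(sr)), int(round(sc))
            if 0 <= sr < src.shape[0] and 0 <= sc < src.shape[1] and msk[sr, sc]:
                out[r, c] = src[sr, sc]   # inpaint the registered source pixel
    return out
```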
- the inpainted information may include the target or various anatomical features, such as organs or bones.
- the first and second image data may be displayed as generated and may be augmented or composited with inpainting of information from the other image data, whether the first or the second.
- the second image data may have information inpainted therein (e.g., segmented and copied bone portions) and/or the first image data may have information (e.g., segmented doppler information) inpainted or superimposed thereon.
- the process 350 may then allow for outputting of the selected image in block 440.
- the output selected image in block 440 may include either the first or second image data, both the first and second image data, and/or the composite image from block 434.
- the output selected image may be saved for various purposes, including those as generally understood by one skilled in the art and/or for display of the image in block 444. Displaying of the image may include that discussed above, such as displaying on the display device 84 and/or with the augmented reality system 302.
- the process 350 may generally relate to processes for acquiring image data, analyzing the same, and generating a display. It is understood that various other procedures or processes may occur relative to the process 350 that are not specifically identified in the process 350.
- the user may perform a procedure on the subject 30, such as an ablation, implantation of an implant, or other appropriate procedure.
- the user 72 may perform a procedure on the subject 30 with the image data as discussed above.
- the display may include various information displayed to the user 72.
- the display may display the image from the ultrasound either alone or in combination with other information.
- the ultrasound may illustrate segmentation of selected anatomical features such as vessels, nerves, patient organs, or other appropriate anatomical features.
- the display may include a segmentation of the images, including in substantially real time, to illustrate the various features to the user 72.
- the doppler information may be used to assist in segmenting vasculature of the subject.
- the ultrasound image data may be collected in real time and displayed real time and also be registered to alternative image data, including prior acquired image data.
- the alternative image data may be pre-segmented and identified.
- the registration of the real time ultrasound image data and the prior acquired image data may allow for the composite image of the ultrasound image data and the alternative image data (in various embodiments the alternative image data is non-ultrasound image data) to be displayed on the display device 84 or other appropriate display device.
- the composite display may display to the user 72 various information including segmented features that are not only segmented in the ultrasound image data, but may be displayed in the real time image, such as superimposed on the ultrasound image, due to the registration of the ultrasound image to the prior image and information acquired in the prior image.
- the prior image may be any appropriate image and it may be acquired at any appropriate time.
- an MRI image may be acquired substantially simultaneously with the ultrasound image and various features of the subject 30 may be segmented therein, such as soft tissues.
- image data may be displayed on the display device 84 or other appropriate display, such as in the augmented reality display 302. Accordingly, discussion herein regarding any display may relate to any appropriate display.
- the display 84 may be provided for use in a procedure to display various information, such as image data acquired with the US probe 33 for use by the user 72.
- the display 84 may be provided with various inputs such as various hard or soft buttons 470 that may be accessed with an instrument, touch screen, or other appropriate inputs.
- the display may display image data, such as image data acquired with the subject 30 such as with the US probe 33.
- the display device may include a processing module or be associated with a processing module, such as the processor 102 or other appropriate image processing module, such as an additional or incorporated processing module 474. The image processing module 474 may also be associated with a memory module 478.
- the processing and memory modules 474, 478 may be interconnected with the US probe 33 and with the display 84. It is understood that the connections may be any appropriate system to allow a data or information transfer and may be wired, wireless, or combinations thereof.
- the image data acquired with the US probe 33 may be displayed on the display device 84 and/or stored in the memory module 478.
- the processor 474 and the memory module 478 may be provided separately or may be incorporated with other processor and memory modules, as is understood by one skilled in the art.
- the user 72 may position the instrument 68 relative to the subject.
- the instrument 68 may be a first instrument that may be used during a selected portion or procedure.
- the instrument 68 for example, may be a resection instrument, probe, or other appropriate instrument.
- the instrument 68 may be positioned within the subject 30 in an appropriate manner, such as through an incision.
- the US probe 33 may acquire image data of the instrument 68 and the subject 30.
- the user 72 may use a second instrument 484.
- the second instrument 484 may be different than the first instrument 68. It is understood, however, that the second instrument 484 may also be the same or similar instrument as the instrument 68, but only used at a different time during a procedure.
- the instrument 484 may be a selected instrument, such as a suction device, resection tool, or the like.
- the instrument 484 may include various portions such as a handle portion 488 and a working or distal tip portion 492.
- the distal tip portion 492 may be positioned relative to the subject 30, such as in substantially similar portions or areas as a portion of the instrument 68, such as the tip 320 as discussed above. The instrument 484 may be tracked or navigated with the tracking or navigation system 26, such as with the use or assistance of an instrument tracking device 496.
- the instrument tracking device 496 may be similar or identical to the tracking devices, such as the instrument tracking device 66. Therefore, a pose of the instrument 484, such as of the tip 492 may be determined.
- the pose of the instrument may be displayed on an image, such as on the display device 84 similar to illustrating a pose of the instrument 68.
- the user 72 may also acquire and/or view image data acquired with the various imaging systems, such as the US probe 33.
- the US probe 33 may be positioned relative to the subject 30 in an appropriate pose.
- the US probe 33 may be positioned relative to the subject in a predetermined or any appropriate selected manner.
- the US probe 33 may be positioned to acquire anterior-to-posterior image data in an anterior-to-posterior orientation 33ap.
- the US probe 33 may also be positioned to acquire image data in the lateral orientation, such as illustrated for the US probe 33lat in Fig. 13B.
- the two different image acquisition orientations may be acquired simultaneously or sequentially.
- the image data may be captured and displayed on the display device 84.
- the image data may be processed in the processor module 474, such as including segmentation thereof, and various portions may be stored in the memory 478.
- a sequential or series of images may be captured that may illustrate movement of the US probe 33 and/or movement of a portion of the subject 30 such as a beating of a heart or movement of a cavity due to breathing. Again, all of the images may be acquired and stored for display at an appropriate time.
- the image data acquired with the US probe 33 may be registered to the subject such as by being tracked relative to the subject tracker 58.
- the registration may allow a display of a registered image to the subject 30 and an appropriate or tracked pose of the respective instrument 68, 484 relative to the subject 30 and for display on the display device 84.
- the display device 84 may display various information including that discussed above.
- the display device 84 may display an image, such as a real time ultrasound image or composite image 500.
- the image 500 may be based upon image data acquired with the US probe 33 at a selected pose, such as the AP pose. Regardless, the image 500 may be a real time image acquired during a selected portion of a procedure.
- the display 84 may also display the graphical representation 68i of the instrument 68.
- the graphical representation 68i may be illustrated in any appropriate manner, such as being superimposed on the image 500 as a three-dimensional representation, two-dimensional representation, or the like.
- the instrument 68 may also be imaged and appear as an imaged portion 68p in the image 500.
- the US probe 33 emits a plane of ultrasound and therefore image data for images are acquired in a single plane.
- a plurality of images may be acquired individually for display sequentially (e.g., as a movie) or for generation of three-dimensional images.
- the image data of the instrument therefore, may be a substantially two-dimensional representation of the instrument 68.
- the representation in the image of the instrument may appear to be a cross-sectional outline of the instrument 68.
- the image displayed with the display device 84 may display the real time image 500, and a real time image of the instrument 68p and/or it may have the portion superimposed thereon such as that of the instrument 68i. As also discussed above, other portions may also be identified or segmented, such as the anatomical target, for example the tumor 316.
- the image 500 may be collected in substantially real time with the US probe 33.
- the image may be processed, such as having portions thereof segmented or identified by executing instructions with the processor module 474. Further, the image 500 may be registered to the subject 30 and/or other portions, such as other acquired image data. Therefore, the other image data may also be used to assist in generating a composite image as discussed above. Any or all of these image portions may be saved in the memory module 478. The memory module 478 may then be accessed to recall or redisplay any of the acquired images or generated images.
- the US probe 33 may acquire a single or a sequence of images.
- the sequence of images may be generated, such as by the US probe 33 continually collecting images or image data for generation of images over a time.
- the acquired images may be displayed sequentially such as in a loop or movie. Therefore, the user may select, such as with an appropriate selection or input system 504, to illustrate or have illustrated a single still image or a loop or movie image.
- the image 500 may be a single image that is displayed on the display device 84. Further, the user may select to display the image 500 to be substantially continuously updated based upon information acquired with the US probe 33. This may allow for a real time illustration of the instrument 68, such as with the display 68p, and/or acquiring substantially real time information regarding the subject 30. Again each of the images or series of images may be stored for later recall.
- the instrument 68 may be positioned relative to a selected portion, such as the tumor or target 316.
- the image data may be stored with the pose of the subject 30 and/or the pose of the instrument 68 therewith.
- the stored images or image data may then be recalled at a selected time.
- the second instrument 484 may be moved relative to the subject 30 at a selected time.
- the instrument 484 may be used after the usage of the instrument 68 and the acquisition of the image data as illustrated in Fig. 14.
- the instrument 484 may be moved relative to the subject.
- the pose of the instrument 484 may be displayed as a graphical representation 484i.
- the display of the instrument as the graphical representation 484i may be displayed relative to the image 500 and the image 500 displayed on the display device, as illustrated in Fig. 15, may be displayed when the instrument 484 is positioned relative to the subject 30 at a selected position, such as near the target or tumor 316.
- the instrument 484 may be tracked to identify when the instrument, including the distal portion of 492 thereof, is at a selected pose relative to the target or tumor 316.
- the user may select or the system may automatically select to display the image 500 that was collected when the instrument 68 was near or at the same pose relative to the subject 30. [00130] This may allow the US probe 33 to not be required to acquire substantially real time image data continuously during an entire procedure.
- a later second portion of the procedure may be performed without the use of the US probe 33. This may be done particularly if the first and second portions are performed near in time, such as minutes apart.
- the image data may be recalled manually or automatically to be displayed on the display device 84 based upon a tracked pose of the instrument 484. It is understood, however, that the selection of the image for display may also be based upon any appropriate selection, such as by the user 72.
- a CT image 508 may also be displayed on the display device 84.
- the various image data may be registered to one another such that the US image 500 is displayed relative to the second or additional image data 508 at a registered pose or position such that the two image data are substantially melded.
- the user may again select the type of display such that the US image 500 may be a still image or may be a loop or movie of images that were taken over a period of time.
- the user 72 may understand the pose of the instrument 484 relative to the subject in a manner similar to that as if the display 84 displayed real time US image data.
- the image 500 displayed during the use of the second instrument 484 may not be real time image data if selected by the user but rather the recalled image data with a graphical representation of the instrument 484.
- the graphical representation may be, for example, superimposed on the recalled image at a pose determined by the tracking of the instrument 484.
- the display device 84 may display image data from any appropriate imaging system.
- the US probe 33 may be an internal US probe or other appropriate imaging system, such as an endoscope.
- the image data acquired therefore, may be real time image data, but may include a visual image reconstruction or display, which may be based upon a visual light acquisition by an appropriate sensor, or three-dimensional images that are based upon acquired US image data that is generated into a three-dimensional image for display.
- the display 84 may display image data in an appropriate manner or type, such as a surface or internal image 520. The image data may be acquired from within the subject 30.
- again, as illustrated in Fig. 16, the instrument 68 may be positioned relative to the subject when performing at least a first portion of a procedure.
- the instrument 68 may be represented relative to the image 520 as a graphical representation 68i. It is understood, however, that the instrument may also be imaged directly in the image and displayed in the image 520. For example, if the image 520 is acquired with a light endoscope, the representation of the instrument 68 may be a light image of the instrument 68 rather than a graphical representation thereof.
- Fig. 16 may illustrate a real time display of the subject 30 and a real time display of the instrument 68 rather than a graphic representation thereof.
- the image 520 may be registered to the subject such as through viewing or identification of fiducials, tracking of the imaging device that acquires the image data, or other appropriate techniques. Further, the pose of the instrument 68 may be known and registered to the subject such as with the tracking device 66 associated with the instrument 68. Further, the user 72 may select the style of image to be displayed, which may include a still image or a loop or movie image as discussed above.
- the second instrument 484 may also be used relative to the subject 30.
- the instrument 484 may also be tracked relative to the subject.
- the second instrument 484 may be tracked relative to various portions of the subject 30, such as a target 316, which may include a tumor.
- the image 520 may be selected and displayed on the display device 84.
- the selected pose may be determined by tracking the second instrument 484.
- being within a selected pose may include being within a threshold pose relative to the pose of the first instrument 68 when the image was acquired. The threshold may be that the tip is within a selected distance, angle, etc.
- the threshold distance may be less than 1 millimeter (mm), less than about 2 mm, less than about 5 mm, less than 10 mm, or any appropriate distance, including increments within the ranges noted above.
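- recalling a previously acquired image when the second instrument comes within the threshold pose described above may be sketched as a nearest-pose lookup over stored frames; the data layout, threshold values, and angle test are illustrative assumptions only.

```python
import numpy as np

def recall_frame(current_tip_xyz, current_axis, stored_frames,
                 max_distance_mm=5.0, max_angle_deg=15.0):
    """stored_frames: list of dicts with keys 'tip', 'axis', and 'image', recorded
    while the first instrument was tracked during image acquisition. Returns the
    stored image whose recorded tip pose is closest to the current pose and within
    both thresholds, or None if no frame qualifies."""
    best_image, best_d = None, float("inf")
    for frame in stored_frames:
        d = float(np.linalg.norm(np.asarray(frame["tip"]) - np.asarray(current_tip_xyz)))
        cos_a = float(np.clip(np.dot(frame["axis"], current_axis), -1.0, 1.0))
        angle = float(np.degrees(np.arccos(cos_a)))
        if d <= max_distance_mm and angle <= max_angle_deg and d < best_d:
            best_image, best_d = frame["image"], d
    return best_image
```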
- the selection of the image 520 may be substantially automatic, manual, or combinations thereof for display on the display device 84.
- the selected image may be a specific image or series of images (e.g., movie) based on the tracked current pose of the instrument 484.
- the graphical representation 484i may also be displayed on the display device such as superimposed on the image 520. [00135] Again, the image 520 may not be a real time image but may be close in time.
- the close in time image that is selected may be acquired with the imaging system, such as the US probe 33, during a different portion of the procedure such as during use of the instrument 68.
- the image 520 may be a pre-acquired image that is displayed when the instrument 484 is at a selected pose relative to the subject 30. This may allow for the display of an image and representation of the instrument relative thereto without a continuous use of an imaging system, such as the US probe 33, during a procedure.
- the image displayed 520 may be a close in time image, such as an intraoperative image, but may not be a real time image when using the instrument 484. Again, the user may also display or select the style for display with the selection 504.
- the image data displayed for the user 72 may include various types of images. As discussed above, real time images may be acquired and displayed at select times. Prior acquired images may also be displayed based upon a selection of the user or the system (e.g., automatically) such as based on a tracked pose of an instrument at a later portion of a procedure. The user may select or the image may be displayed as a single image (such as only an ultrasound image) or a composite image which may include image data from other image modalities or various information, such as segmented and/or highlighted portions of the image. This may allow the display 84 to display an appropriate or selected information for use by the user 72.
- the image displayed may be a close in time image (e.g., acquired during the usage of a first instrument, such as for a first portion or procedure) rather than a real time image. Nevertheless, the image may be closer in time than a prior acquired image, such as an image acquired of the subject 30 during a planning or diagnosis stage. Thus, the user 72 may have selected image data for viewing during performing a procedure.
- the display 84 may display a plurality of types, or angles of image data.
- the US probe may be moved relative to the subject to acquire various views such as an AP image 530 and a lateral image 534. Both of the images 530, 534 may be displayed on the display device 84 individually or simultaneously.
- the graphic representation 68i of the instrument may be displayed relative to the respective images 530, 534 at an appropriate pose relative to the images 530, 534.
- the US probe 33 may be tracked relative to the subject, therefore, the pose of the image relative to the subject and the respective pose of the instrument 68 may be known and appropriately illustrated relative to the images 530, 534. Additionally, as discussed above, if the instrument 68 passes through the plane of the US probe 33, an image of the instrument may also be displayed in the respective images 530, 534.
- the display device 84 may also display the images 530, 534 either alone on the display device and/or registered to and displayed with a second image data 538.
- the second image data may be any appropriate image data and may be registered to the images 530, 534 in a manner as discussed above.
- a graphical representation 484i of the instrument may also be displayed relative to the images 530, 534, 538 in an appropriate manner such as superimposed thereon. Again, the graphical representations 484i may be displayed relative to the images 530, 534 based upon the tracked pose of the instrument 484.
- the images 530, 534 may be selected for display by the display device 84 in the manner as discussed above.
- the images may be selected automatically, manually, or combinations thereof.
- the selection may be based on the tracked current pose of the instrument 484.
- This system and process allows the display in Fig. 18 to illustrate or show real time images relative to the instrument 68 while the display in Fig. 19 may display the images 530, 534 based upon a recall of the stored images, including those discussed above.
- the two images 530, 534 may provide a plurality of information to the user including based upon or due to different orientations or different poses of the imaging device, such as the US probe 33, relative to the subject 30.
- the display of the images 530, 534 in Fig. 19 may be similar to the selection discussed above but include the multiple views as illustrated in Fig. 19. It is also understood that regardless of the display, the user may also select the image style, such as being a still or loop or movie display, as also discussed above.
- acquisition and display of real time images may be had. Further the images that are acquired may be saved for later viewing, such as for when a second instrument or later instrument is moved relative to the subject in a selected pose.
- the selected pose may be a previous pose of a previous instrument during an image data acquisition. This may allow for the use of real time image data and/or close in time image data to be displayed while not requiring continuous use of an imaging device, such as the US probe 33, during an entire procedure. It is understood, however, that image data may be acquired of the subject 30 at any appropriate time, such as during a second phase or portion of the procedure, such as when a second instrument or second usage of an instrument is occurring.
- Close in time image data may include image data that is acquired during a same procedure or a different procedure than one during use of the second instrument 484.
- the close in time image data is image data acquired during use of the first instrument 68, and the image acquired during use of the first instrument may be displayed during a later use of the second instrument 484.
- the US probe 33 may include one or more receivers or transducers to receive a vibration signal, such as an ultrasound vibration signal.
- US probe 33 may include a transducer that may receive a reflection of an ultrasound signal emitted by the US probe 33.
- the transducer may also receive a vibration signal that comes from a source other than the US probe 33.
- the instrument 68 may generate a vibration, such as within the subject 30, which may be sensed by the transducer of the US probe 33.
- the instrument 68 may interact with the subject 30 directly and cause at least a portion of the subject 30, such as a bone (e.g., vertebra) of the subject 30 to vibrate and also transmit a vibration signal to the US probe 33. Therefore, the US probe 33 may receive a signal from various portions or items that are not directly related to the US probe 33 itself.
- the US probe 33 may receive a vibration signal from one or more of the instruments, such as the instrument 68 or the instrument 484, for example as illustrated and discussed above in Figs. 18 and 19.
- the instrument 68 may be positioned relative to the subject 30, such as through an incision or the like into an interior of the subject 30.
- the instrument 68 may be an ablation instrument, a drill, or the like instrument that may be positioned relative to the subject 30.
- the US probe 33 may be used to generate an image of an interior of the subject, and displayed on the display device 84.
- the instrument 68 may be positioned relative to soft tissue, such as a liver of the subject, for various purposes such as ablation, tissue resection (e.g., biopsy), or other appropriate purposes.
- Other instruments, and/or an additional instrument or alternative instrument may include a drill or reamer 574.
- the alternative or additional instrument 574 may be positioned according to or relative to a selected portion of the subject 30, such as a femur 578 of the subject 30.
- the instrument 574 may be used to ream an acetabulum, drill a bore for insertion and/or to insert a screw to fix a plate or intramedullary rod, or other appropriate procedure.
- the instruments 68, 484, 574 may be positioned relative to the subject 30.
- the instruments may include distal portions, such as a working end 570, that are positioned within the subject 30.
- the distal ends may vibrate according to various purposes or mechanisms.
- an ablation instrument may have an appropriate or selective vibration to assist in ablation or removal of tissue.
- the drill or inserter may rotate a drill tip or bit to assist in forming a bore or for insertion of an instrument or an implant. Therefore, the instruments 68, 574 may have portions, such as distal tips that may move and cause vibrations or noise that may be sensed by the transducer of the US probe 33.
- the US probe 33 may include a transducer that may receive the vibrations through the subject 30. Therefore, a pose of a portion of the instrument may be determined.
- the image generated by the US probe 33 due to the US probe imaging plane may have a boundary 584.
- the boundary 584 may include an extent or boundary of a plane of the US signal to which the US probe 33 collects data or that data may be generated as a display image.
- the instrument 68 may be exterior to the area that is imaged with the US probe 33. Nevertheless, due to the vibrations of a portion of the instrument, such as a distal tip, a determined pose of the instrument may be displayed as a graphical representation 68i relative to the image.
- the determination or position of the instrument may be based upon a received or transmitted vibration 590 from the instrument. The vibration may not be displayed on the display device and the vibration representation 590 is merely for illustration.
- the vibrations for the instrument 68 may be known, such as in a predetermination phase.
- the instrument 68 may be placed in a medium that mimics tissue of the subject to determine a vibration frequency and/or transmission speed and type to assist in determining a pose of the instrument 68 relative to the US probe 33.
- the instrument 68 may be tracked with the tracking device at an initial pose of the instrument 68.
- the vibrations from the instrument may be sensed then within the subject 30 and the tracked pose of both the instrument 68 and the US probe 33 may be determined. Thereafter, the pose of the tracking device may be confirmed and/or updated based upon the vibrations received from the instrument 68.
- the pose of the instrument 68 may be determined outside of the imaging region of the US probe 33. Nevertheless, the pose of the instrument 68 may be illustrated outside of the boundary 584 of the image displayed on the display device 84. Thus, the pose of the instrument 68 may be determined and displayed as the graphical representation 68i even if it is not within the imaging plane of the US probe 33 due to the sensed vibrations from the instrument 68 at the US probe 33.
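- one generic way a vibration source could be localized from signals received at the probe's transducer elements is time-difference-of-arrival estimation; the heavily simplified 2D grid search below is a hypothetical sketch only and is not the specific method of this disclosure.

```python
import numpy as np

def locate_vibration_source(element_xy, arrival_times_us, speed_mm_per_us,
                            extent_mm=100.0, step_mm=1.0):
    """Estimate a 2D source position (mm) from relative arrival times at transducer
    element positions (mm) by brute-force grid search over candidate locations.
    The unknown emission time is removed by comparing mean-subtracted delays."""
    elems = np.asarray(element_xy, dtype=float)
    times = np.asarray(arrival_times_us, dtype=float)
    best_xy, best_err = None, float("inf")
    for x in np.arange(-extent_mm, extent_mm + step_mm, step_mm):
        for y in np.arange(0.0, extent_mm + step_mm, step_mm):
            dists = np.linalg.norm(elems - np.array([x, y]), axis=1)
            predicted = dists / speed_mm_per_us
            resid = (predicted - predicted.mean()) - (times - times.mean())
            err = float(resid @ resid)
            if err < best_err:
                best_xy, best_err = (x, y), err
    return best_xy
```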
- the vibrations may be used to define a characteristic of an instrument or member.
- the characteristic may include a mass, type of movement, speed of movement, density of an object, etc. Characteristics may also be used to assess the performance or failure of the instrument.
- the pose of various portions may also be displayed.
- the femur 578 may vibrate.
- a vibration echo or noise 594 may be emitted from the femur 578, allowing a graphical representation 578r to be displayed on the display device 84.
- the US probe 33 may image the bone, but the vibration may also be used to assist in determining a geometry and status of the femur 578.
- the vibration of the femur may change.
- a change in continuity of the femur 578, mass of the femur 578, or the like may change the vibration even if the instrument 574 remains constant in vibration.
- a status of the femur 578 may also be determined or implied based upon the vibrations received therefrom.
- One or more conditions may be determined based on, for example, the pitch or spectrum of tones of the vibration and/or a change in one or more of these.
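- As a purely illustrative sketch (the windowing, sampling rate, and tolerance below are assumptions, not the disclosed method), a shift in the pitch or spectrum of the sensed vibration can be flagged by comparing a current recording against a baseline:

```python
import numpy as np

def spectral_summary(signal, fs_hz):
    """Dominant frequency and spectral centroid of a sensed vibration signal."""
    signal = np.asarray(signal, float) - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs_hz)
    dominant = freqs[np.argmax(spectrum)]
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return dominant, centroid

def bone_status_changed(baseline, current, fs_hz, rel_tol=0.05):
    """Flag a possible change in bone continuity or mass when the vibration
    spectrum shifts relative to a baseline recording, assuming the
    instrument's own excitation remains constant."""
    f0_b, c_b = spectral_summary(baseline, fs_hz)
    f0_c, c_c = spectral_summary(current, fs_hz)
    return (abs(f0_c - f0_b) > rel_tol * f0_b) or (abs(c_c - c_b) > rel_tol * c_b)
```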
- the femur 578 is merely exemplary, and any hard tissue or selected tissue may be assessed based upon vibrations that are induced therein. Further, any appropriate instrument, such as a reamer, drill, implant inserter, or the like, may induce vibrations in any appropriate hard tissue.
- the pose of an instrument or portion that causes vibrations within the subject 30 may be determined with the US probe 33 having the transducer therein.
- the pose of the portion, such as a vibrating instrument portion or a portion of the subject having a vibration induced therein, may be sensed at the US probe 33.
- the pose of the instrument, even if not in the image plane of the US probe 33, may be determined and displayed on the display device by sensing or receiving vibrations at the US probe 33.
- the ultrasound probe may emit or transmit ultrasound waves in a selected pattern, such as a plane with a selected or known shape.
- the plane may be a shape as is understood by one skilled in the art.
- the ultrasound probe may be a transducer and/or include both transmission and receiving portions. An echo from the transmitted signal in the plane may be received at the receiving portion. Via the transmitted signal in the plane, image data is acquired in a field of view to generate images, also referred to as sonograms when the images are generated based on ultrasound data. Also, ultrasound signals may be received from other portions or members.
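- For illustration only, a simple geometric test can decide whether a point, such as a tip pose estimated from vibrations, lies within a boundary such as the boundary 584 of the imaging plane; the fan shape, depth, angle, and plane thickness below are assumed values, not those of any particular probe.

```python
import numpy as np

def inside_imaging_boundary(point_xyz, max_depth_m=0.15,
                            half_angle_rad=np.radians(35),
                            plane_half_thickness_m=0.002):
    """Return True if a point (probe frame: x lateral, y elevation, z depth)
    lies within an assumed fan-shaped imaging-plane boundary."""
    x, y, z = point_xyz
    if abs(y) > plane_half_thickness_m:        # outside the thin imaging plane
        return False
    depth = np.hypot(x, z)
    if not (0.0 < depth <= max_depth_m):       # beyond the imaged depth
        return False
    return abs(np.arctan2(x, z)) <= half_angle_rad  # within the sector angle

# e.g. a tip estimated from vibrations could be tested so that the graphical
# representation 68i is drawn inside or outside the boundary 584 accordingly.
```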
- Example 1 - A method of displaying an image during a procedure comprising: acquiring a first image data; determining at least one feature in the first image data; acquiring a second image data; determining the at least one feature in the second image data; registering the first image data and the second image data based at least on the determined at least one feature in both the first image data and the second image data; and generating for display the image based on at least one of the first image data or the second image data.
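- A minimal sketch of one way the registration recited in Example 1 could be computed, assuming the determined feature yields corresponding 3D points (e.g., vessel branch points) in both image data sets; the closed-form Kabsch/SVD solution below is an illustrative choice, not the only registration contemplated.

```python
import numpy as np

def rigid_register(features_first, features_second):
    """Least-squares rigid transform (rotation R, translation t) mapping
    feature points determined in the first image data onto the corresponding
    feature points determined in the second image data (Kabsch/SVD)."""
    P = np.asarray(features_first, float)   # (N, 3) features in the first image data
    Q = np.asarray(features_second, float)  # (N, 3) same features in the second image data
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

# A point p in the first image data maps to R @ p + t in the second image data,
# which is sufficient to generate the registered image for display.
```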
- Example 2 - The method of Example 1, wherein acquiring the second image data includes acquiring ultrasound image data in substantially real time.
- Example 4 - The method of Example 1, further comprising: determining a boundary of the determined at least one feature in the second image data; wherein registering the first image data and the second image data is further based on the determined boundary of the at least one feature in the second image data.
- Example 5 - The method of Example 4, further comprising: receiving doppler information regarding the second image data; and determining a boundary of the feature based at least in part on the received doppler information; wherein the determined feature includes a vasculature in the second image data identified based on the received doppler information.
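- As a hedged illustration of Example 5 (the threshold heuristic, morphology, and minimum-area value are assumptions), a vasculature boundary could be extracted from a Doppler magnitude frame roughly as follows:

```python
import numpy as np
from scipy import ndimage

def vessel_boundary_from_doppler(doppler_magnitude, threshold=None, min_area_px=50):
    """Segment vasculature from a Doppler magnitude frame and return a
    one-pixel-wide boundary mask usable as a registration feature."""
    d = np.asarray(doppler_magnitude, float)
    if threshold is None:
        threshold = d.mean() + 2.0 * d.std()       # assumed flow-vs-background threshold
    mask = ndimage.binary_opening(d > threshold, iterations=1)
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(np.asarray(sizes) >= min_area_px))
    boundary = keep & ~ndimage.binary_erosion(keep)  # mask minus its erosion = boundary
    return boundary
```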
- Example 6 - The method of Example 1, further comprising: wherein determining at least one feature in the first image data includes determining a first feature and a second feature; wherein determining the at least one feature in the second image data includes determining only the first feature; displaying with the second image data the second feature based at least on the registration of the first image data and the second image data.
- Example 7 - The method of Example 1, further comprising: wherein determining at least one feature in the second image data includes determining a first feature and a second feature; wherein determining the at least one feature in the first image data includes determining only the first feature; displaying with the first image data the second feature based at least on the registration of the first image data and the second image data.
- Example 9 - The method of Example 1, further comprising: determining a tissue type in the first image data; determining a speed of an ultrasound signal through the determined tissue type; and determining a position of the tissue type in the second image data based on the registration of the first image data and the second image data.
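- The speed-of-sound use in Example 9 can be illustrated with a simple depth rescaling; the tissue speed table and the scanner's assumed speed below are illustrative values only, not values given in the disclosure.

```python
# assumed nominal speeds of sound [m/s]; actual values would follow from the
# tissue type determined in the first image data
TISSUE_SPEED_M_S = {"soft_tissue": 1540.0, "liver": 1570.0, "fat": 1450.0, "bone": 3500.0}

def corrected_depth(apparent_depth_m, tissue_type, scanner_assumed_m_s=1540.0):
    """Rescale an apparent ultrasound depth using the speed of sound for the
    tissue type identified via the registered first image data."""
    true_speed = TISSUE_SPEED_M_S[tissue_type]
    return apparent_depth_m * true_speed / scanner_assumed_m_s
```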
- Example 10 - A system to provide a display of an image during a procedure, comprising: an imaging acquisition system configured to acquire or recall a first image data; an ultrasound imaging system configured to acquire a second image data; a processor module configured to execute instructions to: determine at least one feature in the first image data; determine the at least one feature in the second image data; register the first image data and the second image data based at least on the determined at least one feature in both the first image data and the second image data; and generate for display at least one of a first image based on the first image data or a second image based on the second image data.
- Example 11 - The system of Example 10, further comprising: a display for displaying the second image as a display over time as the second image data is acquired in substantially real time.
- Example 12 - The system of Example 10, wherein the processor module is further configured to execute instructions to: determine a boundary of the determined at least one feature in the second image data; wherein registering the first image data and the second image data is further based on the determined boundary of the at least one feature in the second image data.
- Example 13 - The system of Example 12, wherein the ultrasound imaging system is configured to receive doppler information regarding the second image data; and wherein the processor module is further configured to execute instructions to determine a boundary of the feature based at least in part on the received doppler information; wherein the determined feature includes a vasculature in the second image data identified based on the received doppler information.
- Example 14 - The system of Example 10, wherein the processor module is further configured to execute instructions to: determine at least a first feature and a second feature as the determined at least one feature in either of the first image data or the second image data; determine only the first feature in the other of the first image data or the second image data; and generate an image for display with the other of the first image data or the second image data with the second feature based at least on the registration of the first image data and the second image data.
- Example 15 - The system of Example 11, wherein the processor module is further configured to execute instructions to: determine a tissue type in the first image data; determine a speed of an ultrasound signal through the determined tissue type; and determine a position of the tissue type in the second image data based on the registration of the first image data and the second image data.
- Example 16 - The system of Example 10, wherein the processor module comprises a plurality of processor modules.
- Example 17 - The system of Example 10, further comprising: a tracking system configured to track the ultrasound imaging system to determine a pose of the second image data relative to a subject; and a navigation system for displaying a pose of an instrument relative to the generated first image or second image.
- Example 18 - The system of Example 14, further comprising: a display device configured to display at least one of the first image or the second image; wherein the processor module is further configured to execute instructions to inpaint into the other of the first image or the second image the second feature, for display with the display device.
- Example 19 - A method of displaying an image during a procedure comprising: acquiring a first image data with a first imaging system; determining a first feature and a second feature in the first image data; acquiring a second image data with an ultrasound imaging system; determining the first feature in the second image data; tracking the ultrasound imaging system while acquiring the second image data; registering the first image data and the second image data based at least on the determined at least one feature in both the first image data and the second image data; and generating for display at least the image based on at least one of the first image data or the second image data.
- Example 20 - The method of Example 19, further comprising: inpainting the determined second feature in the generated image based on the registration.
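- As a sketch of the inpainting referred to in Examples 18 and 20 (the 2D simplification, pixel spacing, and alpha-blend highlight are assumptions for illustration), a feature segmented in one registered image could be painted into the other image for display:

```python
import numpy as np

def inpaint_feature(target_image, feature_mask_source, R, t, spacing_mm=1.0, alpha=0.5):
    """Overlay ("inpaint") a feature segmented in one image into the other,
    using a rigid registration (R, t) expressed in millimeters.

    Simplified to 2D for brevity: feature pixels are mapped through the
    transform into target-image coordinates and alpha-blended as a highlight.
    """
    target = np.asarray(target_image, float).copy()
    src_idx = np.argwhere(feature_mask_source)             # (N, 2) row/col of feature pixels
    src_mm = src_idx * spacing_mm
    tgt_mm = (np.asarray(R)[:2, :2] @ src_mm.T).T + np.asarray(t)[:2]
    tgt_idx = np.round(tgt_mm / spacing_mm).astype(int)
    inside = np.all((tgt_idx >= 0) & (tgt_idx < np.array(target.shape)), axis=1)
    rows, cols = tgt_idx[inside, 0], tgt_idx[inside, 1]
    target[rows, cols] = (1 - alpha) * target[rows, cols] + alpha * target.max()
    return target
```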
- Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects.
- the term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules.
- the term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above.
- the term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules.
- the term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
- the apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs.
- the computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium.
- the computer programs may also include or rely on stored data.
- the computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
- the computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; or (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc.
- source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, Javascript®, HTML5, Ada, ASP (active server pages), Perl, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
- Communications may include wireless communications described in the present disclosure, which can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008.
- IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
- the terms processor, processor module, module, or ‘controller’ may be used interchangeably herein (unless specifically noted otherwise) and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
- processors or processor modules may include one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- processors or processor modules may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- High Energy & Nuclear Physics (AREA)
- Robotics (AREA)
- Optics & Photonics (AREA)
- Vascular Medicine (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Hematology (AREA)
- Ultra Sonic Diagnosis Equipment (AREA)
- Electrically Operated Instructional Devices (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Abstract
A system for assisting in guiding and performing a procedure on a subject is disclosed. The subject may be any appropriate subject, such as an inanimate object and/or an animate object. An imaging system may be used to image the subject. A system may include various manipulable or movable elements, such as robotic systems, and may be used to move and position the imaging system.
Applications Claiming Priority (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463625738P | 2024-01-26 | 2024-01-26 | |
| US202463625623P | 2024-01-26 | 2024-01-26 | |
| US202463625632P | 2024-01-26 | 2024-01-26 | |
| US202463625720P | 2024-01-26 | 2024-01-26 | |
| US202463625713P | 2024-01-26 | 2024-01-26 | |
| US63/625,623 | 2024-01-26 | ||
| US63/625,632 | 2024-01-26 | ||
| US63/625,738 | 2024-01-26 | ||
| US63/625,720 | 2024-01-26 | ||
| US63/625,713 | 2024-01-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025158390A1 true WO2025158390A1 (fr) | 2025-07-31 |
Family
ID=94605757
Family Applications (5)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2025/050838 Pending WO2025158394A1 (fr) | 2024-01-26 | 2025-01-24 | Système et procédé d'imagerie |
| PCT/IB2025/050845 Pending WO2025158401A1 (fr) | 2024-01-26 | 2025-01-24 | Système et procédé d'imagerie et d'alignement pour navigation |
| PCT/IB2025/050834 Pending WO2025158390A1 (fr) | 2024-01-26 | 2025-01-24 | Système et procédé d'imagerie et d'enregistrement pour navigation |
| PCT/IB2025/050849 Pending WO2025158404A1 (fr) | 2024-01-26 | 2025-01-25 | Système et procédé d'imagerie et d'alignement pour navigation |
| PCT/IB2025/050883 Pending WO2025158413A1 (fr) | 2024-01-26 | 2025-01-27 | Système et procédé d'imagerie et d'alignement pour navigation |
Family Applications Before (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2025/050838 Pending WO2025158394A1 (fr) | 2024-01-26 | 2025-01-24 | Système et procédé d'imagerie |
| PCT/IB2025/050845 Pending WO2025158401A1 (fr) | 2024-01-26 | 2025-01-24 | Système et procédé d'imagerie et d'alignement pour navigation |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2025/050849 Pending WO2025158404A1 (fr) | 2024-01-26 | 2025-01-25 | Système et procédé d'imagerie et d'alignement pour navigation |
| PCT/IB2025/050883 Pending WO2025158413A1 (fr) | 2024-01-26 | 2025-01-27 | Système et procédé d'imagerie et d'alignement pour navigation |
Country Status (1)
| Country | Link |
|---|---|
| WO (5) | WO2025158394A1 (fr) |
Citations (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe |
| US5913820A (en) | 1992-08-14 | 1999-06-22 | British Telecommunications Public Limited Company | Position location system |
| US5983126A (en) | 1995-11-22 | 1999-11-09 | Medtronic, Inc. | Catheter location system and method |
| US6474341B1 (en) | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system |
| US6940941B2 (en) | 2002-02-15 | 2005-09-06 | Breakaway Imaging, Llc | Breakable gantry apparatus for multidimensional x-ray based imaging |
| US7001045B2 (en) | 2002-06-11 | 2006-02-21 | Breakaway Imaging, Llc | Cantilevered gantry apparatus for x-ray imaging |
| US7106825B2 (en) | 2002-08-21 | 2006-09-12 | Breakaway Imaging, Llc | Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system |
| US7108421B2 (en) | 2002-03-19 | 2006-09-19 | Breakaway Imaging, Llc | Systems and methods for imaging large field-of-view objects |
| US7188998B2 (en) | 2002-03-13 | 2007-03-13 | Breakaway Imaging, Llc | Systems and methods for quasi-simultaneous multi-planar x-ray imaging |
| US7831082B2 (en) | 2000-06-14 | 2010-11-09 | Medtronic Navigation, Inc. | System and method for image based sensor calibration |
| US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
| US8503745B2 (en) | 2009-05-13 | 2013-08-06 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
| US8737708B2 (en) | 2009-05-13 | 2014-05-27 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
| US20140371774A1 (en) * | 2013-06-18 | 2014-12-18 | Samsung Electronics Co., Ltd. | Method, apparatus, and system for generating ultrasound |
| US9138204B2 (en) | 2011-04-29 | 2015-09-22 | Medtronic Navigation, Inc. | Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker |
| US9737235B2 (en) | 2009-03-09 | 2017-08-22 | Medtronic Navigation, Inc. | System and method for image-guided navigation |
| US20190175059A1 (en) | 2017-12-07 | 2019-06-13 | Medtronic Xomed, Inc. | System and Method for Assisting Visualization During a Procedure |
| US11135025B2 (en) | 2019-01-10 | 2021-10-05 | Medtronic Navigation, Inc. | System and method for registration between coordinate systems and navigation |
| US20220296303A1 (en) * | 2019-08-28 | 2022-09-22 | Intuitive Surgical Operations, Inc. | Systems and methods for registering imaging data from different imaging modalities based on subsurface image scanning |
| US20230181165A1 (en) * | 2021-12-15 | 2023-06-15 | GE Precision Healthcare LLC | System and methods for image fusion |
| US20230240790A1 (en) * | 2022-02-03 | 2023-08-03 | Medtronic Navigation, Inc. | Systems, methods, and devices for providing an augmented display |
| US11839433B2 (en) | 2016-09-22 | 2023-12-12 | Medtronic Navigation, Inc. | System for guided procedures |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7697972B2 (en) * | 2002-11-19 | 2010-04-13 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
| US20090292204A1 (en) * | 2008-05-23 | 2009-11-26 | Oscillon Ltd. | Method and device for recognizing tissue structure using doppler effect |
| US9066681B2 (en) * | 2012-06-26 | 2015-06-30 | Covidien Lp | Methods and systems for enhancing ultrasonic visibility of energy-delivery devices within tissue |
| CN113576679A (zh) * | 2013-12-20 | 2021-11-02 | 皇家飞利浦有限公司 | 用于光子工具和电磁跟踪引导的支气管镜的用户接口 |
| WO2016037969A1 (fr) * | 2014-09-08 | 2016-03-17 | Koninklijke Philips N.V. | Appareil d'imagerie médicale |
| CN110248618B (zh) * | 2016-09-09 | 2024-01-09 | 莫比乌斯成像公司 | 用于在计算机辅助手术中显示患者数据的方法及系统 |
| US10524865B2 (en) * | 2016-12-16 | 2020-01-07 | General Electric Company | Combination of 3D ultrasound and computed tomography for guidance in interventional medical procedures |
| US11911110B2 (en) * | 2019-01-30 | 2024-02-27 | Medtronic Navigation, Inc. | System and method for registration between coordinate systems and navigation of selected members |
| CN116034307A (zh) * | 2020-08-24 | 2023-04-28 | 富士胶片株式会社 | 图像处理装置、方法及程序 |
| DE102020211107A1 (de) * | 2020-09-03 | 2022-03-03 | Siemens Healthcare Gmbh | Erzeugung von kombinierten Bilddaten basierend auf MR-Daten |
- 2025
- 2025-01-24 WO PCT/IB2025/050838 patent/WO2025158394A1/fr active Pending
- 2025-01-24 WO PCT/IB2025/050845 patent/WO2025158401A1/fr active Pending
- 2025-01-24 WO PCT/IB2025/050834 patent/WO2025158390A1/fr active Pending
- 2025-01-25 WO PCT/IB2025/050849 patent/WO2025158404A1/fr active Pending
- 2025-01-27 WO PCT/IB2025/050883 patent/WO2025158413A1/fr active Pending
Patent Citations (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5913820A (en) | 1992-08-14 | 1999-06-22 | British Telecommunications Public Limited Company | Position location system |
| US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe |
| US5983126A (en) | 1995-11-22 | 1999-11-09 | Medtronic, Inc. | Catheter location system and method |
| US6474341B1 (en) | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system |
| US8320653B2 (en) | 2000-06-14 | 2012-11-27 | Medtronic Navigation, Inc. | System and method for image based sensor calibration |
| US7831082B2 (en) | 2000-06-14 | 2010-11-09 | Medtronic Navigation, Inc. | System and method for image based sensor calibration |
| US6940941B2 (en) | 2002-02-15 | 2005-09-06 | Breakaway Imaging, Llc | Breakable gantry apparatus for multidimensional x-ray based imaging |
| US7188998B2 (en) | 2002-03-13 | 2007-03-13 | Breakaway Imaging, Llc | Systems and methods for quasi-simultaneous multi-planar x-ray imaging |
| US7108421B2 (en) | 2002-03-19 | 2006-09-19 | Breakaway Imaging, Llc | Systems and methods for imaging large field-of-view objects |
| US7001045B2 (en) | 2002-06-11 | 2006-02-21 | Breakaway Imaging, Llc | Cantilevered gantry apparatus for x-ray imaging |
| US7106825B2 (en) | 2002-08-21 | 2006-09-12 | Breakaway Imaging, Llc | Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system |
| US8175681B2 (en) | 2008-12-16 | 2012-05-08 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
| US9737235B2 (en) | 2009-03-09 | 2017-08-22 | Medtronic Navigation, Inc. | System and method for image-guided navigation |
| US8503745B2 (en) | 2009-05-13 | 2013-08-06 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
| US8737708B2 (en) | 2009-05-13 | 2014-05-27 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
| US9138204B2 (en) | 2011-04-29 | 2015-09-22 | Medtronic Navigation, Inc. | Method and apparatus for calibrating and re-aligning an ultrasound image plane to a navigation tracker |
| US20140371774A1 (en) * | 2013-06-18 | 2014-12-18 | Samsung Electronics Co., Ltd. | Method, apparatus, and system for generating ultrasound |
| US11839433B2 (en) | 2016-09-22 | 2023-12-12 | Medtronic Navigation, Inc. | System for guided procedures |
| US20190175059A1 (en) | 2017-12-07 | 2019-06-13 | Medtronic Xomed, Inc. | System and Method for Assisting Visualization During a Procedure |
| US11135025B2 (en) | 2019-01-10 | 2021-10-05 | Medtronic Navigation, Inc. | System and method for registration between coordinate systems and navigation |
| US20220296303A1 (en) * | 2019-08-28 | 2022-09-22 | Intuitive Surgical Operations, Inc. | Systems and methods for registering imaging data from different imaging modalities based on subsurface image scanning |
| US20230181165A1 (en) * | 2021-12-15 | 2023-06-15 | GE Precision Healthcare LLC | System and methods for image fusion |
| US20230240790A1 (en) * | 2022-02-03 | 2023-08-03 | Medtronic Navigation, Inc. | Systems, methods, and devices for providing an augmented display |
Non-Patent Citations (3)
| Title |
|---|
| DIXON LUKE ET AL: "Intraoperative ultrasound in brain tumor surgery: A review and implementation guide", NEUROSURGICAL REVIEW, SPRINGER BERLIN HEIDELBERG, BERLIN/HEIDELBERG, vol. 45, no. 4, 30 March 2022 (2022-03-30), pages 2503 - 2515, XP037926103, DOI: 10.1007/S10143-022-01778-4 * |
| GÉRARD MAXIME ET AL: "Geometric modeling of hepatic arteries in 3D ultrasound with unsupervised MRA fusion during liver interventions", INTERNATIONAL JOURNAL OF COMPUTER ASSISTED RADIOLOGY AND SURGERY, SPRINGER, DE, vol. 12, no. 6, 7 March 2017 (2017-03-07), pages 961 - 972, XP036247522, ISSN: 1861-6410, [retrieved on 20170307], DOI: 10.1007/S11548-017-1550-4 * |
| WOO HYUN NAM ET AL: "Automatic registration between 3D intra-operative ultrasound and pre-operative CT images of the liver based on robust edge matching;Automatic registration between 3D ultrasound and CT images of the liver", PHYSICS IN MEDICINE AND BIOLOGY, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL GB, vol. 57, no. 1, 29 November 2011 (2011-11-29), pages 69 - 91, XP020216220, ISSN: 0031-9155, DOI: 10.1088/0031-9155/57/1/69 * |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025158401A1 (fr) | 2025-07-31 |
| WO2025158394A9 (fr) | 2025-10-02 |
| WO2025158404A1 (fr) | 2025-07-31 |
| WO2025158394A1 (fr) | 2025-07-31 |
| WO2025158413A1 (fr) | 2025-07-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109414295B (zh) | 基于图像的导航的方法和装置 | |
| JP5227027B2 (ja) | 直線状の器具を較正する方法および装置 | |
| JP5328137B2 (ja) | 用具又は埋植物の表現を表示するユーザ・インタフェイス・システム | |
| US9579161B2 (en) | Method and apparatus for tracking a patient | |
| JP5121401B2 (ja) | 埋植物距離測定のシステム | |
| US20210378631A1 (en) | Real-Time 3-D Ultrasound Reconstruction of Knee and Its Implications For Patient Specific Implants and 3-D Joint Injections | |
| US20180286287A1 (en) | System and methods for training physicians to perform ablation procedures | |
| US20070073136A1 (en) | Bone milling with image guided surgery | |
| US10869725B2 (en) | Simulated method and system for navigating surgical instrument based on tomography | |
| WO2002000093A2 (fr) | Enregistrement d'images d'objet cible dans des donnees d'images stockees | |
| WO2021030129A1 (fr) | Systèmes, dispositifs et méthodes de navigation chirurgicale avec repérage anatomique | |
| WO2025088616A1 (fr) | Procédé et appareil de navigation pour intervention | |
| WO2025158390A1 (fr) | Système et procédé d'imagerie et d'enregistrement pour navigation | |
| Penney et al. | Cadaver validation of intensity-based ultrasound to CT registration | |
| US20240341882A1 (en) | System And Method For Imaging And Registration For Navigation | |
| Tyryshkin et al. | A navigation system for shoulder arthroscopic surgery | |
| WO2024215880A1 (fr) | Système et procédé d'imagerie et d'enregistrement pour navigation | |
| CN121127181A (zh) | 用于导航的成像和配准的系统和方法 | |
| WO2025191487A1 (fr) | Système de suivi d'un instrument |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25705333 Country of ref document: EP Kind code of ref document: A1 |