AU2018290995B2 - System and method for identifying, marking and navigating to a target using real-time two-dimensional fluoroscopic data - Google Patents
- Publication number
- AU2018290995B2, AU2018290995A
- Authority
- AU
- Australia
- Prior art keywords
- target
- fluoroscopic
- reconstruction
- images
- medical device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G06T11/008—Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T15/08—Volume rendering
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
- G06T7/0012—Biomedical image inspection
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
- G06T2207/10081—Computed x-ray tomography [CT]
- G06T2207/10121—Fluoroscopy
- G06T2207/10124—Digitally reconstructed radiograph [DRR]
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
- G06T2207/30204—Marker
- G06T2207/30241—Trajectory
- G06T2210/41—Medical
- G06T2211/428—Real-time
- G06T2219/004—Annotating, labelling
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
A system for facilitating identification and marking of a target in a fluoroscopic image of a body region of a patient, the system comprising: one or more storage devices having stored thereon instructions for receiving a CT scan and a fluoroscopic 3D reconstruction of the body region of the patient, wherein the CT scan includes a marking of the target, and for generating at least one virtual fluoroscopy image based on the CT scan of the patient, wherein the virtual fluoroscopy image includes the target and the marking of the target; at least one hardware processor configured to execute these instructions; and a display configured to display to a user the virtual fluoroscopy image and the fluoroscopic 3D reconstruction.
Description
Technical Field
[0001] The present disclosure relates to the field of identifying and marking a
target in fluoroscopic images, in general, and to such target identification and marking in
medical procedures involving intra-body navigation, in particular. Furthermore, the
present disclosure relates to a system, apparatus, and method of navigation in medical
procedures.
Description of Related Art
[0002] There are several commonly applied medical methods, such as endoscopic
procedures or minimally invasive procedures, for treating various maladies affecting
organs including the liver, brain, heart, lung, gall bladder, kidney and bones. Often, one
or more imaging modalities, such as magnetic resonance imaging (MRI), ultrasound
imaging, computed tomography (CT), fluoroscopy as well as others are employed by
clinicians to identify and navigate to areas of interest within a patient and ultimately a
target for treatment. In some procedures, pre-operative scans may be utilized for target
identification and intraoperative guidance. However, real-time imaging may often be
required to obtain a more accurate and current image of the target area. Furthermore,
real-time image data displaying the current location of a medical device with respect to the
target and its surroundings may be required in order to navigate the medical device to the target
in a safer and more accurate manner (e.g., with minimal or no damage caused to other
organs or tissue).
[0003] For example, an endoscopic approach has proven useful in navigating to
areas of interest within a patient, and particularly so for areas within luminal networks of the body such as the lungs. To enable the endoscopic, and more particularly the bronchoscopic approach in the lungs, endobronchial navigation systems have been developed that use previously acquired MRI data or CT image data to generate a three-dimensional (3D) rendering, model or volume of the particular body part such as the lungs.
[0004] The resulting volume generated from the MRI scan or CT scan is then
utilized to create a navigation plan to facilitate the advancement of a navigation catheter
(or other suitable medical device) through a bronchoscope and a branch of the bronchus
of a patient to an area of interest. A locating or tracking system, such as an electromagnetic
(EM) tracking system, may be utilized in conjunction with, for example, CT data, to
facilitate guidance of the navigation catheter through the branch of the bronchus to the
area of interest. In certain instances, the navigation catheter may be positioned within one
of the airways of the branched luminal networks adjacent to, or within, the area of interest
to provide access for one or more medical instruments.
[0005] However, a three-dimensional volume of a patient's lungs, generated from
previously acquired scans, such as CT scans, may not provide a basis sufficient for
accurate guiding of medical instruments to a target during a navigation procedure. In
certain instances, the inaccuracy is caused by deformation of the patient's lungs during the
procedure relative to the lungs at the time of the acquisition of the previously acquired CT
data. This deformation (CT-to-body divergence) may be caused by many different factors,
for example: sedation versus no sedation, the bronchoscope changing the patient's pose and
pushing on tissue, different lung volumes (the CT is typically acquired during full inhale
while navigation occurs during tidal breathing), a different bed, a different day, and so on.
[0006] Thus, another imaging modality is necessary to visualize such targets in
real-time and enhance the in-vivo navigation procedure by correcting navigation during the procedure. Furthermore, in order to accurately and safely navigate medical devices to a remote target, for example, for biopsy or treatment, both the medical device and the target should be visible in some sort of a three-dimensional guidance system.
[0007] A fluoroscopic imaging device is commonly located in the operating room
during navigation procedures. The standard fluoroscopic imaging device may be used by
a clinician, for example, to visualize and confirm the placement of a medical device after
it has been navigated to a desired location. However, although standard fluoroscopic
images display highly dense objects such as metal tools and bones as well as large soft
tissue objects such as the heart, the fluoroscopic images may have difficulty resolving
small soft-tissue objects of interest such as lesions. Furthermore, the fluoroscope image
is only a two-dimensional projection. Therefore, an X-ray volumetric reconstruction may
enable identification of such soft tissue objects and navigation to the target.
[0008] Several solutions exist that provide three-dimensional volume
reconstruction such as CT and Cone-beam CT which are extensively used in the medical
world. These machines algorithmically combine multiple X-ray projections from known,
calibrated X-ray source positions into a three-dimensional volume in which, inter alia, soft
tissues are more visible. For example, a CT machine can be used with iterative scans during
a procedure to provide guidance through the body until the tools reach the target. This is a
tedious procedure as it requires several full CT scans, a dedicated CT room and blind
navigation between scans. In addition, each scan requires the staff to leave the room due
to high-levels of ionizing radiation and exposes the patient to such radiation. Another
option is a Cone-beam CT machine, which is available in some operating rooms and is
somewhat easier to operate but is expensive and like the CT only provides blind navigation
between scans, requires multiple iterations for navigation and requires the staff to leave the room. In addition, a CT-based imaging system is extremely costly, and in many cases not available in the same location as the location where a procedure is carried out.
[0009] Hence, an imaging technology that uses standard fluoroscope devices
to reconstruct a local three-dimensional volume, in order to visualize and facilitate
navigation to in-vivo targets, and to small soft-tissue objects in particular, has been
introduced: US Patent Application No. 2017/0035379 to Weingarten et al., entitled
SYSTEM AND METHOD FOR LOCAL THREE DIMENSIONAL VOLUME
RECONSTRUCTION USING A STANDARD FLUOROSCOPE; US Patent Application
No. 2017/0035380 to Barak et al., entitled SYSTEM AND METHOD FOR NAVIGATING
TO TARGET AND PERFORMING PROCEDURE ON TARGET UTILIZING
FLUOROSCOPIC-BASED LOCAL THREE DIMENSIONAL VOLUME
RECONSTRUCTION; and US Patent Application No. 15/892,053 to Weingarten et al.,
entitled SYSTEM AND METHOD FOR LOCAL THREE DIMENSIONAL VOLUME
RECONSTRUCTION USING A STANDARD FLUOROSCOPE; the contents of each of
which are incorporated herein by reference.
[0010] In general, according to the systems and methods disclosed in the above
mentioned patent applications, a standard fluoroscope c-arm can be rotated, e.g., about 30
degrees, around a patient during a medical procedure, and a fluoroscopic 3D reconstruction
of the region of interest is generated by a specialized software algorithm. The user can then
scroll through the image slices of the fluoroscopic 3D reconstruction using the software
interface to identify the target (e.g., a lesion) and mark it.
[0011] Such quick generation of a 3D reconstruction of a region of interest can
provide real-time three-dimensional imaging of the target area. Real-time imaging of the
target and medical devices positioned in its area may benefit numerous interventional procedures, such as biopsy and ablation procedures in various organs, vascular interventions and orthopedic surgeries. For example, in navigational bronchoscopy, the aim may be to obtain accurate information about the position of a biopsy catheter relative to a target lesion.
[0012] As another example, minimally invasive procedures, such as laparoscopy
procedures, including robotic-assisted surgery, may employ intraoperative fluoroscopy to
increase visualization, e.g., for guidance and lesion locating, and to prevent unnecessary injury
and complications. Employing the above-mentioned systems and methods for real-time
reconstruction of fluoroscopic three-dimensional imaging of a target area and for navigation based
on the reconstruction may benefit such procedures as well.
[0013] Still, it may not be an easy task to accurately identify and mark a target in the
fluoroscopic 3D reconstruction, in particular when the target is a small soft-tissue object. Thus,
there is a need for systems and methods that facilitate the identification and marking of a target in
fluoroscopic image data, and in a fluoroscopic 3D reconstruction in particular, in order to
facilitate navigation to the target and improve the yield of the pertinent medical procedures.
Summary
[0013a] According to an aspect of the present invention, there is provided a system
for facilitating identification and marking of a target in a target area in a fluoroscopic image
of a body region of a patient, the system comprising: (i) one or more storage devices
having stored thereon instructions for: receiving a CT scan of the body region of the
patient, wherein the CT scan includes a marking of the target; receiving a sequence of
fluoroscopic images including the target area acquired in real-time about a plurality of
angles relative to the target, while a medical device is positioned in proximity to the target;
generating a fluoroscopic 3D reconstruction based on at least a portion of the sequence
of fluoroscopic images; generating at least one virtual fluoroscopic image based on the CT
scan of the patient, wherein the virtual fluoroscopic image includes the target and the marking of the target, receiving a selection of the target in the fluoroscopic 3D reconstruction via a user input; receiving a selection of the medical device in the 3D reconstruction or the sequence of fluoroscopic images; determining an offset of the medical device with respect to the target based on the selections of the target and the medical device; determining a location of the medical device within the patient based on data provided by a tracking system; displaying the target area and the location of the medical device with respect to the target on a display; and correcting the display of the location of the medical device with respect to the target based on the offset between the medical device and the target; and (ii) at least one hardware processor configured to execute said instructions.
[0014] There is provided, in accordance with the present disclosure, a system for
facilitating identification and marking of a target in a fluoroscopic image of a body region of a
patient, the system comprising: (i) one or more storage devices having stored thereon instructions
for: receiving a CT scan and a fluoroscopic 3D reconstruction of the body region of the patient,
wherein the CT scan includes a marking of the target; generating at least one virtual fluoroscopy
image based on the CT scan of the patient, wherein the virtual fluoroscopy image includes the
target and the marking of the target, (ii) at least one hardware processor configured to
execute said instructions, and (iii) a display configured to display to a user the virtual fluoroscopy image simultaneously with the fluoroscopic 3D reconstruction.
[0015] There is further provided in accordance with the present disclosure, a
system for facilitating identification and marking of a target in a fluoroscopic image of a
body region of a patient, the system comprising: (i) one or more storage devices having
stored thereon instructions for: receiving a CT scan and a fluoroscopic 3D reconstruction
of the body region of the patient, wherein the CT scan includes a marking of the target;
generating at least one virtual fluoroscopy image based on the CT scan of the patient,
wherein the virtual fluoroscopy image includes the target and the marking of the target,
(ii) at least one hardware processor configured to execute said instructions, and (iii) a
display configured to display to a user the virtual fluoroscopy image and the fluoroscopic
3D reconstruction.
[0016] There is further provided in accordance with the present disclosure, a
method for identifying and marking a target in an image of a body region of a patient, the
method comprising using at least one hardware processor for: receiving a CT scan and a
fluoroscopic 3D reconstruction of the body region of the patient, wherein the CT scan
includes a marking of the target; generating at least one virtual fluoroscopy image based
on the CT scan of the patient, wherein the at least one virtual fluoroscopy image includes
the target and the marking of the target; and displaying to a user the at least one virtual
fluoroscopy image simultaneously with the fluoroscopic 3D reconstruction on a display,
thereby facilitating the identification of the target in the fluoroscopic 3D reconstruction by
the user.
[0017] There is further provided in accordance with the present disclosure, a
method for identifying and marking a target in an image of a body region of a patient, the
method comprising using at least one hardware processor for: receiving a CT scan and a fluoroscopic 3D reconstruction of the body region of the patient, wherein the CT scan includes a marking of the target; generating at least one virtual fluoroscopy image based on the CT scan of the patient, wherein the at least one virtual fluoroscopy image includes the target and the marking of the target; and displaying to a user the at least one virtual fluoroscopy image and the fluoroscopic 3D reconstruction on a display, thereby facilitating the identification of the target in the fluoroscopic 3D reconstruction by the user.
[0018] There is further provided in accordance with the present disclosure, a
system for navigating to a target area within a patient's body during a medical procedure
using real-time two-dimensional fluoroscopic images, the system comprising: a medical
device configured to be navigated to the target area; a fluoroscopic imaging device
configured to acquire a sequence of 2D fluoroscopic images of the target area about a
plurality of angles relative to the target area, while the medical device is positioned in the
target area; and a computing device configured to: receive a pre-operative CT scan of the
target area, wherein the pre-operative CT scan includes a marking of the target; generate
at least one virtual fluoroscopy image based on the pre-operative CT scan, wherein the at
least one virtual fluoroscopy image includes the target and the marking of the target;
generate a three-dimensional reconstruction of the target area based on the acquired
sequence of 2D fluoroscopic images; display to a user the at least one virtual fluoroscopy
image and the fluoroscopic 3D reconstruction simultaneously, receive a selection of the
target from the fluoroscopic 3D reconstruction via the user; receive a selection of the
medical device from the three-dimensional reconstruction or the sequence of 2D
fluoroscopic images; and determine an offset of the medical device with respect to the
target based on the selections of the target and the medical device.
[0019] There is further provided in accordance with the present disclosure, a
system for navigating to a target area within a patient's body during a medical procedure using real-time two-dimensional fluoroscopic images, the system comprising: a medical device configured to be navigated to the target area; a fluoroscopic imaging device configured to acquire a sequence of 2D fluoroscopic images of the target area about a plurality of angles relative to the target area, while the medical device is positioned in the target area; and a computing device configured to: receive a pre-operative CT scan of the target area, wherein the pre-operative CT scan includes a marking of the target; generate at least one virtual fluoroscopy image based on the pre-operative CT scan, wherein the at least one virtual fluoroscopy image includes the target and the marking of the target; generate a three-dimensional reconstruction of the target area based on the acquired sequence of 2D fluoroscopic images; display to a user the at least one virtual fluoroscopy image and the fluoroscopic 3D reconstruction, receive a selection of the target from the fluoroscopic 3D reconstruction via the user; receive a selection of the medical device from the three-dimensional reconstruction or the sequence of 2D fluoroscopic images; and determine an offset of the medical device with respect to the target based on the selections of the target and the medical device.
[0020] There is further provided in accordance with the present disclosure, a
method for navigating to a target area within a patient's body during a medical procedure
using real-time two-dimensional fluoroscopic images, the method comprising using at
least one hardware processor for: receiving a pre-operative CT scan of the target area,
wherein the pre-operative CT scan includes a marking of the target; generating at least one
virtual fluoroscopy image based on the pre-operative CT scan, wherein the at least one
virtual fluoroscopy image includes the target and the marking of the target; receiving a
sequence of 2D fluoroscopic images of the target area acquired in real-time about a
plurality of angles relative to the target area, while a medical device is positioned in the
target area; generating a three-dimensional reconstruction of the target area based on the sequence of 2D fluoroscopic images; displaying to a user the at least one virtual fluoroscopy image and the fluoroscopic 3D reconstruction simultaneously, receiving a selection of the target from the fluoroscopic 3D reconstruction via the user; receiving a selection of the medical device from the three-dimensional reconstruction or the sequence of 2D fluoroscopic images; and determining an offset of the medical device with respect to the target based on the selections of the target and the medical device.
[0021] There is further provided in accordance with the present disclosure, a
method for navigating to a target area within a patient's body during a medical procedure
using real-time two-dimensional fluoroscopic images, the method comprising using at
least one hardware processor for: receiving a pre-operative CT scan of the target area,
wherein the pre-operative CT scan includes a marking of the target; generating at least one
virtual fluoroscopy image based on the pre-operative CT scan, wherein the at least one
virtual fluoroscopy image includes the target and the marking of the target; receiving a
sequence of 2D fluoroscopic images of the target area acquired in real-time about a
plurality of angles relative to the target area, while a medical device is positioned in the
target area; generating a three-dimensional reconstruction of the target area based on the
sequence of 2D fluoroscopic images; displaying to a user the at least one virtual
fluoroscopy image and the fluoroscopic 3D reconstruction, receiving a selection of the
target from the fluoroscopic 3D reconstruction via the user; receiving a selection of the
medical device from the three-dimensional reconstruction or the sequence of 2D
fluoroscopic images; and determining an offset of the medical device with respect to the
target based on the selections of the target and the medical device.
[0022] There is further provided in accordance with the present disclosure, a
computer program product comprising a non-transitory computer-readable storage
medium having program code embodied therewith, the program code executable by at least one hardware processor to: receive a pre-operative CT scan of the target area, wherein the pre-operative CT scan includes a marking of the target; generate at least one virtual fluoroscopy image based on the pre-operative CT scan, wherein the at least one virtual fluoroscopy image includes the target and the marking of the target; receive a sequence of
2D fluoroscopic images of the target area acquired in real-time about a plurality of angles
relative to the target area, while a medical device is positioned in the target area; generate
a fluoroscopic three-dimensional reconstruction of the target area based on the sequence
of 2D fluoroscopic images; display to a user the at least one virtual fluoroscopy image and
the fluoroscopic three-dimensional reconstruction simultaneously, receive a selection of
the target from the fluoroscopic three-dimensional reconstruction via the user; receive a
selection of the medical device from the fluoroscopic three-dimensional reconstruction or
the sequence of 2D fluoroscopic images; and determine an offset of the medical device
with respect to the target based on the selections of the target and the medical device.
[0023] There is further provided in accordance with the present disclosure, a
computer program product comprising a non-transitory computer-readable storage
medium having program code embodied therewith, the program code executable by at
least one hardware processor to: receive a pre-operative CT scan of the target area, wherein
the pre-operative CT scan includes a marking of the target; generate at least one virtual
fluoroscopy image based on the pre-operative CT scan, wherein the at least one virtual
fluoroscopy image includes the target and the marking of the target; receive a sequence of
2D fluoroscopic images of the target area acquired in real-time about a plurality of angles
relative to the target area, while a medical device is positioned in the target area; generate
a fluoroscopic three-dimensional reconstruction of the target area based on the sequence
of 2D fluoroscopic images; display to a user the at least one virtual fluoroscopy image and
the fluoroscopic three-dimensional reconstruction, receive a selection of the target from the fluoroscopic three-dimensional reconstruction via the user; receive a selection of the medical device from the fluoroscopic three-dimensional reconstruction or the sequence of
2D fluoroscopic images; and determine an offset of the medical device with respect to the
target based on the selections of the target and the medical device.
[0024] In another aspect of the present disclosure, the one or more storage devices
have stored thereon further instructions for directing the user to identify and mark the
target in the fluoroscopic 3D reconstruction.
[0025] In another aspect of the present disclosure, the one or more storage devices
have stored thereon further instructions for directing the user to identify and mark the
target in the fluoroscopic 3D reconstruction while using the virtual fluoroscopy image as
a reference.
[0026] In another aspect of the present disclosure, the one or more storage devices
have stored thereon further instructions for directing the user to identify and mark the
target in two fluoroscopic slice images of the fluoroscopic 3D reconstruction captured at
two different angles.
[0027] In another aspect of the present disclosure, the generating of the at least one
virtual fluoroscopy image is based on Digitally Reconstructed Radiograph techniques.
[0028] In another aspect of the present disclosure, the generating of the at least one
virtual fluoroscopy image comprises: generating virtual fluoroscope poses around the
target by simulating a fluoroscope trajectory while scanning the target; generating virtual
2D fluoroscopic images by projecting the CT scan volume according to the virtual
fluoroscope poses; generating virtual fluoroscopic 3D reconstruction based on the virtual
2D fluoroscopic images; and selecting a slice image from the virtual fluoroscopic 3D
reconstruction which comprises the marking of the target.
[0029] In another aspect of the present disclosure, the target is a soft-tissue target.
[0030] In another aspect of the present disclosure, the receiving of the fluoroscopic
3D reconstruction of the body region comprises: receiving a sequence of 2D fluoroscopic
images of the body region acquired about a plurality of angles relative to the body region
and generating the fluoroscopic 3D reconstruction of the body region based on the
sequence of 2D fluoroscopic images.
[0031] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for directing the user to identify and mark the
target in the fluoroscopic 3D reconstruction.
[0032] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for directing the user to identify and mark the
target in the fluoroscopic 3D reconstruction while using the at least one virtual fluoroscopy
image as a reference.
[0033] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for instructing the user to identify and mark the
target in two fluoroscopic slice images of the fluoroscopic 3D reconstruction captured at
two different angles.
[0034] In another aspect of the present disclosure, the system further comprises: a
tracking system configured to provide data indicating the location of the medical device
within the patient's body; and a display, wherein the computing device is further
configured to: determine the location of the medical device based on the data provided by
the tracking system; display the target area and the location of the medical device with
respect to the target on the display; and correct the display of the location of the medical
device with respect to the target based on the determined offset between the medical device
and the target.
[0035] In another aspect of the present disclosure, the computing device is further
configured to: generate a 3D rendering of the target area based on the pre-operative CT
scan, wherein the display of the target area comprises displaying the 3D rendering; and
register the tracking system to the 3D rendering, wherein the correction of the location of
the medical device with respect to the target comprises updating the registration between
the tracking system and the 3D rendering.
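[0035a] By way of a non-limiting sketch (an editorial illustration, not the claimed method), the registration update described above can be reduced to a translational correction: the tracking-to-CT transform is shifted so that the displayed tip-to-target relationship matches the offset measured in the fluoroscopic 3D reconstruction. All names are illustrative, as is the assumption that the measured offset has already been expressed in CT axes.

```python
import numpy as np

def correct_registration(T_track2ct, tip_track, target_ct, offset_ct):
    """A minimal sketch, assuming the tip-to-target offset measured in the
    fluoroscopic 3D reconstruction has been expressed in CT axes
    (offset_ct = vector from device tip to target).
    T_track2ct: 4x4 rigid transform registering tracking space to CT space.
    tip_track:  device-tip position reported by the tracking sensor (mm).
    target_ct:  target position marked in the pre-operative CT (mm)."""
    tip_ct = (T_track2ct @ np.append(tip_track, 1.0))[:3]  # tip as displayed
    # Where the tip should be displayed so that the on-screen tip-to-target
    # relationship matches the offset observed in the fluoroscopic volume:
    tip_ct_corrected = target_ct - offset_ct
    correction = np.eye(4)
    correction[:3, 3] = tip_ct_corrected - tip_ct          # translation only
    return correction @ T_track2ct    # updated tracking-to-CT registration
```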
[0036] In another aspect of the present disclosure, the tracking system comprises:
a sensor; and an electromagnetic field generator configured to generate an electromagnetic
field for determining a location of the sensor, wherein the medical device comprises a
catheter guide assembly having the sensor disposed thereon, and the determining of the
location of the medical device comprises determining the location of the sensor based on
the generated electromagnetic field.
[0037] In another aspect of the present disclosure, the target area comprises at least
a portion of the lungs and the medical device is configured to be navigated to the target
area through the luminal network of the airways.
[0038] In another aspect of the present disclosure, the computing device is
configured to receive the selection of the medical device by automatically detecting a
portion of the medical device in the acquired sequence of 2D fluoroscopic images or the
three-dimensional reconstruction, and receiving a user command either accepting or rejecting
the detection.
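[0038a] As a rough, non-limiting sketch of such automatic detection followed by user confirmation, the radiopaque device may be taken to be the largest dark connected structure in a frame. The percentile threshold and the distal-most-pixel tip heuristic below are assumptions, not the claimed detector.

```python
import numpy as np
from scipy import ndimage

def detect_device_candidate(frame):
    """Flags the largest dark connected structure as the device candidate,
    returning a tip position for the user to accept or reject."""
    dark = frame < np.percentile(frame, 2)            # darkest pixels
    labels, n = ndimage.label(dark)                   # connected components
    if n == 0:
        return None
    sizes = ndimage.sum(dark, labels, index=range(1, n + 1))
    device = labels == (np.argmax(sizes) + 1)         # largest component
    rows, cols = np.nonzero(device)
    i = np.argmax(rows)                               # lowest pixel in image
    return int(cols[i]), int(rows[i])                 # candidate tip (x, y)
```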
[0039] In another aspect of the present disclosure, the computing device is further
configured to estimate the pose of the fluoroscopic imaging device, while the fluoroscopic
imaging device acquires each of at least a plurality of images of the sequence of 2D
fluoroscopic images, and wherein the generating of the three-dimensional reconstruction
of the target area is based on the pose estimation of the fluoroscopic imaging device.
[0040] In another aspect of the present disclosure, the system further comprises a
structure of markers, wherein the fluoroscopic imaging device is configured to acquire a
sequence of 2D fluoroscopic images of the target area and of the structure of markers, and
wherein the estimation of the pose of the fluoroscopic imaging device while acquiring each
image of the at least plurality of images is based on detection of a possible and most
probable projection of the structure of markers, as a whole, on each image.
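[0040a] For illustration only, pose estimation from a known structure of markers can be sketched with a generic perspective-n-point solver; the claimed matching of the most probable projection of the structure as a whole is not reproduced, and the fluoroscope intrinsics (f_px, cx, cy) are assumed known. All names are illustrative.

```python
import numpy as np
import cv2

def estimate_c_arm_pose(marker_px, marker_mm, f_px, cx, cy):
    """marker_px: (N, 2) detected marker centroids in the image (N >= 4).
    marker_mm: (N, 3) corresponding marker positions on the physical grid.
    Returns the rotation and translation taking grid coordinates to the
    fluoroscope (camera) frame, from a generic PnP solve."""
    K = np.array([[f_px, 0, cx],
                  [0, f_px, cy],
                  [0, 0, 1]], dtype=np.float64)       # assumed intrinsics
    ok, rvec, tvec = cv2.solvePnP(marker_mm.astype(np.float64),
                                  marker_px.astype(np.float64), K, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)                        # 3x3 rotation matrix
    return R, tvec
```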
[0041] In another aspect of the present disclosure, the computing device is further
configured to direct the user to identify and mark the target in the fluoroscopic 3D
reconstruction while using the at least one virtual fluoroscopy image as a reference.
[0042] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for: determining the location of the medical
device within the patient's body based on data provided by a tracking system; displaying
the target area and the location of the medical device with respect to the target on a display;
and correcting the display of the location of the medical device with respect to the target
based on the determined offset between the medical device and the target.
[0043] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for: generating a 3D rendering of the target area
based on the pre-operative CT scan, wherein the displaying of the target area comprises
displaying the 3D rendering; and registering the tracking system to the 3D rendering,
wherein the correcting of the location of the medical device with respect to the target
comprises updating the registration between the tracking system and the 3D rendering.
[0044] In another aspect of the present disclosure, the receiving of the selection of
the medical device comprises automatically detecting a portion of the medical device in
the sequence of 2D fluoroscopic images or the three-dimensional reconstruction and receiving
a user command either accepting or rejecting the detection.
[0045] In another aspect of the present disclosure, the method further comprises
using said at least one hardware processor for estimating the pose of the fluoroscopic
imaging device while acquiring each of at least a plurality of images of the sequence of
2D fluoroscopic images, wherein the generating of the three-dimensional reconstruction
of the target area is based on the pose estimation of the fluoroscopic imaging device.
[0046] In another aspect of the present disclosure, a structure of markers is placed
with respect to the patient and the fluoroscopic imaging device such that each image of the
at least plurality of images comprises a projection of at least a portion of the structure of
markers, and wherein the estimating of the pose of the fluoroscopic imaging device while
acquiring each image of the at least plurality of images is based on detection of a possible
and most probable projection of the structure of markers as a whole on each image.
[0047] In another aspect of the present disclosure, the non-transitory computer
readable storage medium has further program code executable by the at least one hardware
processor to: determine the location of the medical device within the patient's body based
on data provided by a tracking system; display the target area and the location of the
medical device with respect to the target on a display; and correct the display of the
location of the medical device with respect to the target based on the determined offset
between the medical device and the target.
[0048] Any of the above aspects and embodiments of the present disclosure may
be combined without departing from the scope of the present disclosure.
Brief Description of the Drawings
[0049] Various aspects and embodiments of the present disclosure are described
hereinbelow with reference to the drawings, wherein:
[0050] Fig. 1 is a flow chart of a method for identifying and marking a target in
a fluoroscopic 3D reconstruction in accordance with the present disclosure;
[0051] Fig. 2 is a schematic diagram of a system configured for use with the
method of Fig. 1;
[0052] Fig. 3A is an exemplary screen shot showing a display of slice images of a
fluoroscopic 3D reconstruction in accordance with the present disclosure;
[0053] Fig. 3B is an exemplary screen shot showing a virtual fluoroscopy image
presented simultaneously with slice images of a fluoroscopic 3D reconstruction in
accordance with the present disclosure;
[0054] Fig. 3C is an exemplary screen shot showing a display of a fluoroscopic 3D
reconstruction in accordance with the present disclosure;
[0055] Fig. 4 is a flow chart of a method for navigating to a target using real-time
two-dimensional fluoroscopic images in accordance with the present disclosure; and
[0056] Fig. 5 is a perspective view of one illustrative embodiment of an exemplary
system for navigating to a soft-tissue target via the airways network in accordance with
the method of Fig. 4.
Detailed Description
[0057] The term "target", as referred to herein, may relate to any element,
biological or artificial, or to a region of interest in a patient's body, such as a tissue
(including soft tissue and bone tissue), an organ, an implant or a fiducial marker.
[0058] The term "target area", as referred to herein, may relate to the target and
at least a portion of its surrounding area. The term "target area" and the term "body
region" may be used interchangeably when the term "body region" refers to the body
region in which the target is located. Alternatively or in addition, the term "target area"
may also refer to a portion of the body region in which the target is located, all according
to the context.
[0059] The terms "and", "or" and "and/or" may be used interchangeably, while
each term may incorporate the others, all according to the term's context.
[0060] The term "medical device", as referred to herein, may include, without
limitation, optical systems, ultrasound probes, marker placement tools, biopsy tools,
ablation tools (e.g., microwave ablation devices), laser probes, cryogenic probes, sensor
probes, and aspirating needles.
[0061] The terms "fluoroscopic image" and "fluoroscopic images" may refer to
one or more 2D fluoroscopic images and/or to one or more slice images of a fluoroscopic
3D reconstruction, all in accordance with the term's context.
[0062] The terms "virtual fluoroscopic image" and "virtual fluoroscopic images"
may refer to one or more virtual 2D fluoroscopic images and/or to one or more virtual
fluoroscopy slice images of a virtual fluoroscopic 3D reconstruction or of any other 3D
reconstruction, all in accordance with the term's context.
[0063] The present disclosure is directed to systems, methods and computer
program products for facilitating the identification and marking of a target by a user in
real-time fluoroscopic images of a body region of interest generated via a standard
fluoroscope. Such real-time fluoroscopic images may include two-dimensional images
and/or slice-images of a three-dimensional reconstruction. In particular, the identification
and marking of the target in the real-time fluoroscopic data may be facilitated by using
synthetic or virtual fluoroscopic data, which includes a marking or an indication of the
target, as a reference. The virtual fluoroscopic data may be generated from previously
acquired volumetric data, preferably such that it imitates fluoroscopic data as closely as
possible. Typically, the target is better visualized in the imaging modality of
the previously acquired volumetric data than in the real-time fluoroscopic data.
[0064] The present disclosure is further directed to systems and methods for
facilitating the navigation of a medical device to a target and/or its area using real-time
two-dimensional fluoroscopic images of the target area. The navigation is facilitated by
using local three-dimensional volumetric data, in which small soft-tissue objects are
visible, constructed from a sequence of fluoroscopic images captured by a standard
fluoroscopic imaging device available in most procedure rooms. The fluoroscopic-based
constructed local three-dimensional volumetric data may be used to correct a location of a
medical device with respect to a target or may be locally registered with previously
acquired volumetric data. In general, the location of the medical device may be determined
by a tracking system. The tracking system may be registered with the previously acquired
volumetric data. A local registration of the real-time three-dimensional fluoroscopic data
to the previously acquired volumetric data may be then performed via the tracking system.
Such real-time data may be used, for example, for guidance, navigation planning,
improved navigation accuracy, navigation confirmation, and treatment confirmation.
[0065] Reference is now made to Fig. 1, which is a flow chart of a method for
identifying and marking a target in a 3D fluoroscopic reconstruction in accordance with
the present disclosure. In a step 100, a CT scan and a fluoroscopic 3D reconstruction of a
body region of a patient may be received. The CT scan may include a marking or an
indication of a target located in the patient's body region. Alternatively, a qualified medical
professional may be directed to identify and mark the target in the CT scan. In some
embodiments the target may be a soft-tissue target, such as a lesion. In some embodiments
the imaged body region may include at least a portion of the lungs. In some embodiments,
the 3D reconstruction may be displayed to the user. In some embodiments, the 3D
reconstruction may be displayed such that the user may scroll through its different slice
images. Reference is now made to Fig. 3A, which is an exemplary screen shot 300 showing a display of slice images of a fluoroscopic 3D reconstruction in accordance with the present disclosure. Screen shot 300 includes a slice image 310, a scrolling bar 320 and an indicator 330. Scrolling bar 320 allows a user to scroll through the slice images of the fluoroscopic 3D reconstruction. Indicator 330 indicates the relative location of the slice image currently displayed, e.g., slice image 310, within the slice images constituting the fluoroscopic 3D reconstruction.
[0066] In some embodiments, the receiving of the fluoroscopic 3D reconstruction
of the body region may include receiving a sequence of fluoroscopic images of the body
region and generating the fluoroscopic 3D reconstruction of the body region based on at
least a portion of the fluoroscopic images. In some embodiments, the method may further
include directing a user to acquire the sequence of fluoroscopic images by manually
sweeping the fluoroscope. In some embodiments, the method may further include
automatically acquiring the sequence of fluoroscopic images. The fluoroscopic images
may be acquired by a standard fluoroscope, in a continuous manner and about a plurality
of angles relative to the body region. The fluoroscope may be swept manually, i.e., by a
user, or automatically. For example, the fluoroscope may be swept along an angle of 20 to
45 degrees. In some embodiments, the fluoroscope may be swept along an angle of 30±5
degrees.
[0067] In some embodiments, the fluoroscopic 3D reconstruction may be
generated based on tomosynthesis methods, and/or according to the systems and methods
disclosed in US Patent Application No. 2017/0035379 and US Patent Application No.
15/892,053, mentioned above and incorporated herein by reference. The CT scan may
be generated via methods and systems known in the art. The CT scan is
a pre-operative CT scan, i.e., generated previously (not in real-time) and prior to a
medical procedure for which the identification and marking of the target may be required.
[0068] In a step 110, at least one virtual fluoroscopy image may be generated
based on the CT scan of the patient. The virtual fluoroscopy image can then include the
target and the marking of the target, as indicated in the CT scan. The aim is to generate an
image of the target, which includes a relatively accurate indication of the target, and which
resembles a fluoroscopic image. A user may then use the indication of the target
in the synthetic image to identify and mark the target in the real-time fluoroscopic volume
(e.g., by identifying the target in one or more slice images). In some embodiments, the
virtual fluoroscopy image may be of a type of 2D fluoroscopic image, e.g., a virtual 2D
fluoroscopic image. In some embodiments, the virtual fluoroscopy image may be of a type
of fluoroscopic 3D reconstruction slice image, e.g., a virtual slice image.
[0069] In some embodiments, the virtual 2D fluoroscopic image may be generated
based on Digitally Reconstructed Radiograph techniques.
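[0069a] A minimal DRR sketch follows, assuming parallel projection rather than the cone-beam geometry of a real fluoroscope: the CT volume is rotated to the viewing angle, Hounsfield units are converted to approximate linear attenuation, and intensity is attenuated along the rays per Beer-Lambert. The HU-to-attenuation constants and names are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def simple_drr(ct_hu, view_angle_deg):
    """ct_hu: CT volume in Hounsfield units, axes (z, y, x).
    Rotates the volume to the viewing angle, integrates rough attenuation
    along parallel rays, and applies Beer-Lambert to form a radiograph."""
    mu = np.clip((ct_hu + 1000.0) / 1000.0, 0.0, None) * 0.02  # ~mu per voxel
    tilted = rotate(mu, view_angle_deg, axes=(0, 2), reshape=False, order=1)
    line_integrals = tilted.sum(axis=0)               # parallel-ray sums
    drr = np.exp(-line_integrals)                     # transmitted intensity
    return (drr - drr.min()) / (np.ptp(drr) + 1e-9)   # normalized radiograph
```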
[0070] In some embodiments, the virtual fluoroscopy slice image may be generated
according to the following steps. In a first step, the received CT volume is aligned with
the fluoroscopic 3D reconstruction. In a second step, an estimate of the pose of the
fluoroscopic device at a selected position, e.g., the AP (anteroposterior) position, with
respect to the target or patient, while capturing the set of fluoroscopic images used to
generate the fluoroscopic 3D reconstruction, is received or calculated. In a third step, a
slice or slices of the CT scan volume perpendicular to the selected position and which
include the target are generated. In a fourth step, the CT slice or slices are projected
according to the estimated fluoroscope pose to obtain a virtual fluoroscopy slice image,
as sketched below.
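[0070a] Continuing the illustration, the third and fourth steps may be sketched as extracting a slab of CT slices around the target and projecting it at the estimated fluoroscope angle. The slab indexing convention (first axis as the slice axis), the parallel-beam projection, and all names are assumptions.

```python
import numpy as np
from scipy.ndimage import rotate

def virtual_slice_image(ct_volume, target_z, half_width, pose_angle_deg):
    """Takes a slab of CT slices around the target and projects it as the
    fluoroscope would see it at the estimated pose angle."""
    slab = ct_volume[target_z - half_width : target_z + half_width + 1]
    tilted = rotate(slab, pose_angle_deg, axes=(0, 2), reshape=False, order=1)
    return tilted.sum(axis=0)     # virtual fluoroscopy slice image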
[0071] In some embodiments, generation of a virtual fluoroscopy slice image of
the target area may include the following steps. In a first step, virtual fluoroscope poses
around the target may be obtained. In some embodiments, the virtual fluoroscope poses may be generated by simulating a fluoroscope trajectory while the fluoroscope scans the target. In some embodiments, the method may further include the generation of the 3D fluoroscopic reconstruction, as described with respect to step 430 of Fig. 4. The estimated poses of the fluoroscopic device while capturing the sequence of fluoroscopic images used to generate the fluoroscopic 3D reconstruction may be then utilized. In a second step, virtual fluoroscopic images may be generated by projecting the CT scan volume according to the virtual fluoroscope poses. In a third step, a virtual fluoroscopic 3D reconstruction may be generated based on the virtual fluoroscopic images. In some embodiments, the virtual fluoroscopic 3D reconstruction may be generated while using the method of reconstruction of the 3D fluoroscopic volume with adaptations. The resulting virtual fluoroscopic volume may then look more like the fluoroscopic volume. For example, the methods of fluoroscopic 3D reconstruction disclosed in US Patent Application No.
2017/0035379, US Patent Application No. 2017/0035380, and US Patent Application No.
15/892,053, as detailed above and herein incorporated by reference, may be used. In a
fourth step, a slice image which includes the indication of the target may be selected from
the virtual fluoroscopic 3D reconstruction.
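[0071a] The first step above, simulating a fluoroscope trajectory, may be sketched as generating isocentric C-arm poses along a circular sweep about the target. The circular trajectory, axis conventions, and parameters below are assumptions made for the sketch.

```python
import numpy as np

def simulate_sweep_poses(target_mm, radius_mm, sweep_deg=30.0, n_views=60):
    """Returns one 4x4 pose per simulated view: the virtual source orbits
    the target in the axial plane over the sweep, always looking at it."""
    poses = []
    for ang in np.deg2rad(np.linspace(-sweep_deg / 2, sweep_deg / 2, n_views)):
        src = target_mm + radius_mm * np.array([np.sin(ang), 0.0, np.cos(ang)])
        look = (target_mm - src) / np.linalg.norm(target_mm - src)
        up = np.array([0.0, 1.0, 0.0])                # assumed patient axis
        right = np.cross(look, up)
        right /= np.linalg.norm(right)
        pose = np.eye(4)
        pose[:3, 0], pose[:3, 1] = right, np.cross(right, look)
        pose[:3, 2], pose[:3, 3] = look, src
        poses.append(pose)
    return poses   # each pose is then used to project the CT volume
```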
[0072] In some embodiments, when marking of the target in a slice image of a
fluoroscopic 3D reconstruction is desired, generating and using a virtual slice image as a
reference may be more advantageous. In some embodiments, when marking of the target
in a fluoroscopic 2D image is desired, generating and using a virtual fluoroscopic 2D
image may be more advantageous.
[0073] In a step 120, the virtual fluoroscopy image and the fluoroscopic 3D
reconstruction may be displayed to a user. The indication of the target in the virtual
fluoroscopy image may be then used as a reference for identifying and marking the target
in slice images of the fluoroscopic 3D reconstruction, thus facilitating the identification and marking of the target in the fluoroscopic 3D reconstruction. The identification and marking of the target performed by the user may then be more accurate. A user may consult the virtual fluoroscopy image as a reference prior to identifying and marking the target in the real-time fluoroscopic images and/or after such identification and marking.
[0074] Various workflows and displays may be used to identify and mark the target
while using virtual fluoroscopic data as a reference according to the present disclosure.
Such displays are exemplified in Figs. 3B and 3C. Reference is now made to Fig. 3B,
which is an exemplary screen shot 350 showing a virtual fluoroscopy image 360 displayed
simultaneously with fluoroscopic slice images 310a and 310b of a fluoroscopic 3D
reconstruction in accordance with the present disclosure. Screen shot 350 includes a virtual
fluoroscopy image 360, fluoroscopic slice images 310a and 310b, scroll bar 320 and
indicator 330. Virtual fluoroscopy image 360 includes a circular marking 370 of a target.
Fluoroscopic slice images 310a and 310b include circular markings 380a and 380b of the
target, respectively, as performed by a user. In some embodiments, the user may visually
compare fluoroscopic slice images 310a and 310b and markings 380a and 380b with
virtual fluoroscopic image 360 and marking 370 in order to verify markings 380a and 380b. In
some embodiments, the user may use virtual fluoroscopic image 360 and marking 370 to
mark fluoroscopic slice images 310a and 310b. In this specific example, two fluoroscopic
slice images are displayed simultaneously. However, according to other embodiments,
only one fluoroscopic slice image, or more than two, may be displayed. In this specific
example, the virtual fluoroscopy image is displayed in the center of the screen and the
fluoroscopic slice images are displayed at the bottom of the screen. However, any
other display arrangement may be used.
[0075] Reference is now made to Fig. 3C, which is an exemplary screen shot 355
showing a display of at least a portion of a 3D fluoroscopic reconstruction 365. Screen
shot 355 includes the 3D reconstruction image 365, which includes at least a portion (e.g.,
a slice) of the 3D fluoroscopic reconstruction; delimited areas 315a and 315b, scroll bar
325, indicator 335 and button 375. Delimited areas 315a and 315b are specified areas for
presenting slice images of the portion of the 3D reconstruction presented in 3D
reconstruction image 365 selected by the user (e.g., selected by marking the target in these
slice images). Button 375 is captioned "Planned Target". In some embodiments, once the
user presses or clicks button 375, he or she is presented with at least one virtual fluoroscopic
image showing the target and a marking of it, to be used as a reference. Once button 375 is
pressed, the display may change. In some embodiments, the display presented once button
375 is pressed may include virtual fluoroscopy images only. In some embodiments, the
display presented once button 375 is pressed may include additional images, including
slice images of the 3D reconstruction. Scroll bar 325 and indicator 335 may be used by the
user to scroll through slices of at least the portion of the 3D reconstruction presented in
3D reconstruction image 365.
[0076] In some embodiments, the virtual fluoroscopy image and the fluoroscopic
3D reconstruction (e.g., a selected slice of the fluoroscopic 3D reconstruction) may be
displayed to a user simultaneously. In some embodiments, the virtual fluoroscopy image
and the fluoroscopic 3D reconstruction may be displayed in a non-simultaneous manner.
For example, the virtual fluoroscopy image may be displayed on a separate screen or
in a pop-up window.
[0077] In an optional step 130, the user may be directed to identify and mark the
target in the fluoroscopic 3D reconstruction. In some embodiments, the user may be
specifically directed to use the virtual fluoroscopy image/s as a reference. In some embodiments, the user may be instructed to identify and mark the target in two fluoroscopic slice images of the fluoroscopic 3D reconstruction captured at two different angles. Marking the target in two fluoroscopic slice images may be required when the slice width is relatively thick, and such that additional data would be required to accurately determine the location of the target in the fluoroscopic 3D reconstruction. In some embodiments, the user may need or may be required to only identify the target and may be directed accordingly. In some embodiments, the target may be automatically identified in the fluoroscopic 3D reconstruction by a dedicated algorithm. The user may be then required to confirm and optionally amend the automatic marking using the virtual fluoroscopy image as a reference.
[0078] In some embodiments, the identification and marking of a target may be
performed in one or more two-dimensional fluoroscopic images, i.e., fluoroscopic images
as originally captured. One or more fluoroscopic images may then be received and
displayed to the user instead of the fluoroscopic 3D reconstruction. The identification and
marking of the target by a user may then be performed with respect to the received one or
more fluoroscopic images.
[0079] In some embodiments, the set of two-dimensional fluoroscopic images
(e.g., as originally captured), which was used to construct the fluoroscopic 3D
reconstruction, may be additionally received (e.g., in addition to the 3D fluoroscopic
reconstruction). The fluoroscopic 3D reconstruction, the corresponding set of
two-dimensional fluoroscopic images and the virtual fluoroscopy image may be displayed
to the user. The user may then select whether to identify and mark the target in one or more
slice images of the fluoroscopic 3D reconstruction, in one or more of the two-dimensional
fluoroscopic images, or in both.
[0080] Reference is now made to Fig. 2, which is a schematic diagram of a system
200 configured for use with the method of Fig. 1. System 200 may include a workstation
80, and optionally a fluoroscopic imaging device or fluoroscope 215. In some
embodiments, workstation 80 may be coupled with fluoroscope 215, directly or indirectly,
e.g., by wireless communication. Workstation 80 may include a memory or storage device
202, a processor 204, a display 206 and an input device 210. Processor or hardware
processor 204 may include one or more hardware processors. Workstation 80 may
optionally include an output module 212 and a network interface 208. Memory 202 may
store an application 81 and image data 214. Application 81 may include instructions
executable by processor 204, inter alia, for executing the method steps of Fig. 1.
Application 81 may further include a user interface 216. Image data 214 may include the
CT scan, the fluoroscopic 3D reconstructions of the target area, and/or any other
fluoroscopic image data and/or the generated one or more virtual fluoroscopy images.
Processor 204 may be coupled with memory 202, display 206, input device 210, output
module 212, network interface 208 and fluoroscope 215. Workstation 80 may be a
stationary computing device, such as a personal computer, or a portable computing device
such as a tablet computer. Workstation 80 may comprise a plurality of computing devices.
[0081] Memory 202 may include any non-transitory computer-readable storage
media for storing data and/or software including instructions that are executable by
processor 204 and which control the operation of workstation 80 and in some
embodiments, may also control the operation of fluoroscope 215. Fluoroscope 215 may
be used to capture a sequence of fluoroscopic images based on which the fluoroscopic 3D
reconstruction is generated. In an embodiment, memory or storage device 202 may include
one or more storage devices such as solid-state storage devices such as flash memory
chips. Alternatively, or in addition to the one or more solid-state storage devices, memory
202 may include one or more mass storage devices connected to the processor 204 through
a mass storage controller (not shown) and a communications bus (not shown). Although
the description of computer-readable media contained herein refers to solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media can be any available media that can be accessed by the processor 204. That is, computer-readable storage media may include non-transitory, volatile and non-volatile, removable
and non-removable media implemented in any method or technology for storage of
information such as computer-readable instructions, data structures, program modules or
other data. For example, computer-readable storage media may include RAM, ROM,
EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM,
DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk
storage or other magnetic storage devices, or any other medium which may be used to
store the desired information, and which may be accessed by workstation 80.
[0082] Application 81 may, when executed by processor 204, cause display 206 to
present user interface 216. User interface 216 may be configured to present to the user the
fluoroscopic 3D reconstruction and the generated virtual fluoroscopy image, as shown, for
example, in Figs. 3A and 3B. User interface 216 may be further configured to direct the
user to identify and mark the target in the displayed fluoroscopic 3D reconstruction or any
other fluoroscopic image data in accordance with the present disclosure.
[0083] Network interface 208 may be configured to connect to a network such as
a local area network (LAN) consisting of a wired network and/or a wireless network, a
wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the
internet. Network interface 208 may be used to connect between workstation 80 and
fluoroscope 215. Network interface 208 may also be used to receive image data 214. Input
device 210 may be any device by means of which a user may interact with workstation 80, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
Output module 212 may include any connectivity port or bus, such as, for example, parallel
ports, serial ports, universal serial buses (USB), or any other similar connectivity port
known to those skilled in the art.
[0084] Reference is now made to Fig. 4, which is a flow chart of a method for
navigating to a target using real-time two-dimensional fluoroscopic images in accordance
with the present disclosure. The method facilitates navigating to a target area within a
patient's body during a medical procedure. The method utilizes real-time fluoroscopic-based three-dimensional volumetric data. The fluoroscopic three-dimensional volumetric
data may be generated from two-dimensional fluoroscopic images.
[0085] In a step 400, a pre-operative CT scan of the target area may be received.
The pre-operative CT scan may include a marking or indication of the target. Step 400
may be similar to step 100 of the method of Fig. 1.
[0086] In a step 410, one or more virtual fluoroscopy images may be generated
based on the pre-operative CT scan. The virtual fluoroscopy images may include the target
and the marking or indication of the target. Step 410 may be similar to step 110 of the method of Fig. 1.
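For illustration only, a virtual fluoroscopy image can be approximated as a digitally reconstructed radiograph (DRR) by integrating CT attenuation along rays. The sketch below substitutes a parallel projection along one volume axis for a full perspective fluoroscope model and carries the pre-operative target marking into the resulting 2D image; names and constants are illustrative assumptions:

```python
import numpy as np

def virtual_fluoro_image(ct_hu, target_voxel, axis=1):
    """Render a parallel-projection DRR from a CT volume in Hounsfield units.

    ct_hu:        3D numpy array of HU values.
    target_voxel: (i, j, k) index of the marked target in the CT.
    axis:         projection axis, standing in for the fluoroscope pose.
    Returns the 2D virtual image and the target's projected 2D coordinates.
    """
    mu = np.clip((ct_hu + 1000.0) / 1000.0, 0.0, None)   # HU -> relative attenuation
    line_integrals = mu.sum(axis=axis)                   # parallel-beam ray sums
    drr = np.exp(-0.01 * line_integrals)                 # Beer-Lambert, toy scaling
    target_2d = tuple(c for ax, c in enumerate(target_voxel) if ax != axis)
    return drr, target_2d
```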
[0087] In a step 420, a sequence of fluoroscopic images of the target area acquired
in real time about a plurality of angles relative to the target area may be received. The
sequence of images may be captured while a medical device is positioned in the target
area. In some embodiments, the method may include further steps for directing a user to
acquire the sequence of fluoroscopic images. In some embodiments, the method may
include one or more further steps for automatically acquiring the sequence of fluoroscopic
images.
[0088] In a step 430, a three-dimensional reconstruction of the target area may be
generated based on the sequence of fluoroscopic images.
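As a conceptual sketch only, the reconstruction can be pictured as backprojection of each acquired projection along its acquisition angle. An actual implementation would use the estimated cone-beam poses discussed below together with a filtered algorithm; this toy version is unfiltered, parallel-beam, and restricted to a single 2D slice:

```python
import numpy as np
from scipy.ndimage import rotate

def backproject_slice(projections, angles_deg, size):
    """Unfiltered parallel-beam backprojection of 1D projections into a slice.

    projections: list of 1D arrays of length `size`, one per angle.
    angles_deg:  acquisition angle of each projection.
    """
    recon = np.zeros((size, size))
    for proj, ang in zip(projections, angles_deg):
        smear = np.tile(proj, (size, 1))                   # smear values back along rays
        recon += rotate(smear, ang, reshape=False, order=1)
    return recon / len(angles_deg)
```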
[0089] In some embodiments, the method further comprises one or more steps for
estimating the pose of the fluoroscopic imaging device while acquiring each of the
fluoroscopic images, or at least a plurality of them. The three-dimensional reconstruction of the target area may then be generated based on the pose estimation of the fluoroscopic imaging device.
[0090] In some embodiments, a structure of markers may be placed with respect
to the patient and the fluoroscopic imaging device, such that each fluoroscopic image
includes a projection of at least a portion of the structure of markers. The estimation of the
pose of the fluoroscopic imaging device while acquiring each image may then be facilitated by the projections of the structure of markers on the fluoroscopic images. In some embodiments, the estimation may be based on detection of the most probable projection of the structure of markers as a whole on each image.
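Assuming the correspondences between the known 3D marker positions and their detected 2D projections have been established (which the whole-structure detection above addresses), the per-image pose can be recovered with a standard perspective-n-point (PnP) solver. The following OpenCV sketch is illustrative and not the disclosed method:

```python
import numpy as np
import cv2

def estimate_fluoroscope_pose(marker_xyz, marker_uv, camera_matrix):
    """Pose of the fluoroscope from detected marker projections (PnP).

    marker_xyz:    Nx3 known marker positions in the marker-structure frame.
    marker_uv:     Nx2 detected projections of those markers in one image.
    camera_matrix: 3x3 intrinsic matrix approximating the fluoroscope optics.
    Returns rotation matrix R and translation t (structure -> camera frame).
    """
    ok, rvec, tvec = cv2.solvePnP(marker_xyz.astype(np.float32),
                                  marker_uv.astype(np.float32),
                                  camera_matrix, None)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                            # rotation vector -> matrix
    return R, tvec
```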
[0091] Exemplary systems and methods for constructing such fluoroscopic-based
three-dimensional volumetric data are disclosed in the above commonly owned U.S.
Patent Publication No. 2017/0035379, U.S. Patent Application No. 15/892,053 and U.S.
Provisional Application Serial No. 62/628,017, which are incorporated by reference.
[0092] In some embodiments, once the pose estimation process is complete, the
projection of the structure of markers on the images may be removed by using well-known
methods. One such method is disclosed in commonly-owned U.S. Patent Application No.
62/628,028, entitled: "IMAGE RECONSTRUCTION SYSTEM AND METHOD", filed
on February 8, 2018, to Alexandroni et al., the entire content of which is hereby
incorporated by reference.
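The referenced application discloses its own removal technique; purely as a generic illustration, once the pose is known the marker projections can be masked at their projected locations and filled by image inpainting:

```python
import numpy as np
import cv2

def remove_marker_projections(frame_u8, marker_uv, radius_px=6):
    """Mask known marker projections in an 8-bit frame and inpaint over them.

    marker_uv: projected 2D marker centers, known once the pose is estimated.
    """
    mask = np.zeros(frame_u8.shape[:2], dtype=np.uint8)
    for u, v in marker_uv:
        cv2.circle(mask, (int(u), int(v)), radius_px, 255, -1)  # filled disc
    return cv2.inpaint(frame_u8, mask, radius_px, cv2.INPAINT_TELEA)
```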
[0093] In a step 440, one or more virtual fluoroscopy images and the fluoroscopic
3D reconstruction may be displayed to a user. The display may be according to step 120
of the method of Fig. 1 and as exemplified in Fig. 3B. In some embodiments, one or more
virtual fluoroscopy images and the fluoroscopic 3D reconstruction may be displayed to a
user simultaneously. In some embodiments, the virtual fluoroscopy image and the
fluoroscopic 3D reconstruction may be displayed in a non-simultaneous manner. For
example, the virtual fluoroscopy image may be displayed in a separate screen or may be
displayed, e.g., upon the user's request, instead of the display of the fluoroscopic 3D
reconstruction.
[0094] In a step 450, a selection of the target from the fluoroscopic 3D reconstruction may be received from the user. In some embodiments, the user may be
directed to identify and mark the target in the fluoroscopic 3D reconstruction while using
the one or more virtual fluoroscopy images as a reference.
[0095] In a step 460, a selection of the medical device from the three-dimensional
reconstruction or the sequence of fluoroscopic images may be received. In some
embodiments, the receipt of the selection may include automatically detecting at least a
portion of the medical device in the sequence of fluoroscopic images or three-dimensional
reconstruction. In some embodiments, a user command either accepting or rejecting the
detection may also be received. In some embodiments, the selection may be received from the user. Exemplary automatic detection of a catheter in fluoroscopic images is disclosed
in commonly-owned US Provisional Application No. 62/627,911 to Weingarten et al.,
entitled "System And Method For Catheter Detection In Fluoroscopic Images And
Updating Displayed Position Of Catheter", the contents of which are incorporated herein
by reference.
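The referenced provisional discloses the actual detection method; as a generic, simplified stand-in, a catheter (a thin dark curve in fluoroscopy) can be enhanced with a ridge filter and extracted as the largest connected ridge component:

```python
import numpy as np
from scipy import ndimage
from skimage.filters import frangi

def detect_catheter_tip(frame):
    """Crude catheter detection in a 2D float image scaled to [0, 1].

    Enhances thin dark tubular structures (black_ridges=True), keeps the
    largest connected ridge component, and reports its lowest point as a
    stand-in for the catheter tip.
    """
    ridges = frangi(frame, sigmas=range(1, 6), black_ridges=True)
    binary = ridges > ridges.mean() + 3.0 * ridges.std()
    labels, n = ndimage.label(binary)
    if n == 0:
        return None
    sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
    catheter = labels == (int(np.argmax(sizes)) + 1)
    rows, cols = np.nonzero(catheter)
    i = rows.argmax()                     # crude tip heuristic: deepest pixel
    return rows[i], cols[i]
```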
[0096] In a step 470, an offset of the medical device with respect to the target may
be determined. The determination of the offset may be based on the received selections of
the target and the medical device.
[0097] In some embodiments, the method may further include a step for
determining the location of the medical device within the patient's body based on data
provided by a tracking system, such as an electromagnetic tracking system. In a further
step, the target area and the location of the medical device with respect to the target may
be displayed to the user on a display. In another step, the display of the location of the
medical device with respect to the target may be corrected based on the determined offset
between the medical device and the target.
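Conceptually, the offset is the vector between the device and target selections made in the fluoroscopic frame, and the correction overrides the drifted tracking-based display position near the target. A minimal sketch, assuming all positions are already expressed in one common coordinate frame (names hypothetical):

```python
import numpy as np

def corrected_display_position(device_fluoro, target_fluoro,
                               target_planned, device_tracked):
    """Correct the displayed device position near the target.

    device_fluoro, target_fluoro: selections from the fluoroscopic data,
        assumed already expressed in the tracking coordinate frame.
    target_planned: target position marked in the pre-operative CT plan.
    device_tracked: device position reported by the EM tracking system.
    """
    offset = np.asarray(device_fluoro) - np.asarray(target_fluoro)
    corrected = np.asarray(target_planned) + offset     # observed device-to-target offset
    drift = corrected - np.asarray(device_tracked)      # local tracking error
    return corrected, drift
```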
[0098] In some embodiments, the method may further include a step for generating
a 3D rendering of the target area based on the pre-operative CT scan. A display of the
target area may then include a display of the 3D rendering. In another step, the tracking
system may be registered with the 3D rendering. A correction of the location of the medical
device with respect to the target based on the determined offset may then include the local
updating of the registration between the tracking system and the 3D rendering in the target
area. In some embodiments, the method may further include a step for registering the
fluoroscopic 3D reconstruction to the tracking system. In another step and based on the
above, a local registration between the fluoroscopic 3D reconstruction and the 3D
rendering may be performed in the target area.
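One possible way to realize such a local, rather than global, registration update is to apply the correction as a translation whose influence decays with distance from the target; the radius and weighting below are illustrative assumptions:

```python
import numpy as np

def locally_registered(point, target, drift, radius_mm=30.0):
    """Apply a registration correction only in the vicinity of the target.

    point:  any position in the 3D rendering's frame.
    target: target position; drift: correction vector derived from the
    fluoroscopic offset; radius_mm: falloff distance of the local update.
    """
    dist = np.linalg.norm(np.asarray(point) - np.asarray(target))
    w = np.exp(-dist ** 2 / (2.0 * radius_mm ** 2))   # 1 at the target, ~0 far away
    return np.asarray(point) + w * np.asarray(drift)
```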
[0099] In some embodiments, the target may be a soft tissue target. In some
embodiments, the target area may include at least a portion of the lungs and the medical
device may be configured to be navigated to the target area through the luminal network of the airways.
[00100] In some embodiments, the method may include receiving a selection of the target from one or more images of the sequence of fluoroscopy images, in addition or as an alternative to receiving a selection of the target from the fluoroscopic 3D reconstruction. The sequence of fluoroscopy images may then be displayed to the user in addition to, or instead of, the fluoroscopic 3D reconstruction, and the method of Fig. 4 adapted accordingly.
[00101] A computer program product for navigating to a target using real-time
two-dimensional fluoroscopic images is herein disclosed. The computer program product
may include a non-transitory computer-readable storage medium having program code
embodied therewith. The program code may be executable by at least one hardware
processor to perform the steps of the method of Fig. 1 and/or Fig. 4.
[00102] Fig. 5 is a perspective view of one illustrative embodiment of an exemplary
system for facilitating navigation to a soft-tissue target via the airways network in
accordance with the method of Fig. 4. System 500 may be further configured to construct
fluoroscopic-based three-dimensional volumetric data of the target area from 2D
fluoroscopic images. System 500 may be further configured to facilitate the approach of a medical device to the target area by using Electromagnetic Navigation Bronchoscopy (ENB) and to determine the location of the medical device with respect to the target.
[00103] System 500 may be configured for reviewing CT image data to identify one
or more targets, planning a pathway to an identified target (planning phase), navigating an
extended working channel (EWC) 512 of a catheter assembly to a target (navigation phase)
via a user interface, and confirming placement of EWC 512 relative to the target. One
such EMN system is the ELECTROMAGNETIC NAVIGATION BRONCHOSCOPY®
system currently sold by Medtronic PLC. The target may be tissue of interest identified
by review of the CT image data during the planning phase. Following navigation, a medical device, such as a biopsy tool or other tool, may be inserted into EWC 512 to obtain a tissue sample from the tissue located at, or proximate to, the target.
[00104] As shown in Fig. 5, EWC 512 is part of a catheter guide assembly 540. In
practice, EWC 512 is inserted into a bronchoscope 530 for access to a luminal network of
the patient "P." Specifically, EWC 512 of catheter guide assembly 540 may be inserted
into a working channel of bronchoscope 530 for navigation through a patient's luminal
network. A locatable guide (LG) 532, including a sensor 544, is inserted into EWC 512
and locked into position such that sensor 544 extends a desired distance beyond the distal
tip of EWC 512. The position and orientation of sensor 544 relative to the reference
coordinate system, and thus the distal portion of EWC 512, within an electromagnetic field
can be derived. Catheter guide assemblies 540 are currently marketed and sold by
Medtronic PLC under the brand names SUPERDIMENSION® Procedure Kits or EDGE™ Procedure Kits, and are contemplated as usable with the present disclosure. For
a more detailed description of catheter guide assemblies 540, reference is made to
commonly-owned U.S. Patent Publication No. 2014/0046315, filed on March 15, 2013,
by Ladtkow et al., U.S. Patent No. 7,233,820, and U.S. Patent No. 9,044,254, the entire
contents of each of which are hereby incorporated by reference.
[00105] System 500 generally includes an operating table 520 configured to support
a patient "P," a bronchoscope 530 configured for insertion through the patient's "P's"
mouth into the patient's "P's" airways; monitoring equipment 535 coupled to
bronchoscope 530 (e.g., a video display, for displaying the video images received from the
video imaging system of bronchoscope 530); a locating or tracking system 550 including
a locating module 552, a plurality of reference sensors 554 and a transmitter mat coupled
to a structure of markers 556; and a computing device 525 including software and/or
hardware used to facilitate identification of a target, pathway planning to the target, navigation of a medical device to the target, and/or confirmation and/or determination of placement of EWC 512, or a suitable device therethrough, relative to the target.
Computing device 525 may be similar to workstation 80 of Fig. 2 and may be configured,
inter alia, to execute the methods of Fig. 1 and Fig. 4.
[00106] A fluoroscopic imaging device 510 capable of acquiring fluoroscopic or x-ray images or video of the patient "P" is also included in this particular aspect of system
500. The images, sequence of images, or video captured by fluoroscopic imaging device
510 may be stored within fluoroscopic imaging device 510 or transmitted to computing
device 525 for storage, processing, and display, e.g., as described with respect to Fig. 2.
Additionally, fluoroscopic imaging device 510 may move relative to the patient "P" so
that images may be acquired from different angles or perspectives relative to patient "P"
to create a sequence of fluoroscopic images, such as a fluoroscopic video. The pose of
fluoroscopic imaging device 510 relative to patient "P" while capturing the images
may be estimated via structure of markers 556 and according to the method of Fig. 4. The
structure of markers is positioned under patient "P", between patient "P" and operating
table 520 and may be positioned between patient "P" and a radiation source or a sensing
unit of fluoroscopic imaging device 510. The structure of markers is coupled to the
transmitter mat (both indicated 556) and positioned under patient "P" on operating table
520. The structure of markers and transmitter mat 556 are positioned under the target area within the patient in a stationary manner. The structure of markers and transmitter mat 556 may be two separate elements, which may be coupled in a fixed manner, or alternatively
may be manufactured as a single unit. Fluoroscopic imaging device 510 may include a
single imaging device or more than one imaging device. In embodiments including
multiple imaging devices, each imaging device may be a different type of imaging device or the same type. Further details regarding the imaging device 510 are described in U.S.
Patent No. 8,565,858, which is incorporated by reference in its entirety herein.
[00107] Computing device 525 may be any suitable computing device including a
processor and storage medium, wherein the processor is capable of executing instructions
stored on the storage medium. Computing device 525 may further include a database
configured to store patient data, CT data sets including CT images, fluoroscopic data sets
including fluoroscopic images and video, fluoroscopic 3D reconstructions, navigation plans, and any other such data. Although not explicitly illustrated, computing device 525 may include inputs for, or may otherwise be configured to receive, CT data sets, fluoroscopic
images/video and other data described herein. Additionally, computing device 525
includes a display configured to display graphical user interfaces. Computing device 525
may be connected to one or more networks through which one or more databases may be
accessed.
[00108] With respect to the planning phase, computing device 525 utilizes
previously acquired CT image data for generating and viewing a three-dimensional model
or rendering of the patient's "P's" airways, enables the identification of a target on the
three-dimensional model (automatically, semi-automatically, or manually), and allows for
determining a pathway through the patient's "P's" airways to tissue located at and around
the target. More specifically, CT images acquired from previous CT scans are processed
and assembled into a three-dimensional CT volume, which is then utilized to generate a
three-dimensional model of the patient's "P's" airways. The three-dimensional model may
be displayed on a display associated with computing device 525, or in any other suitable
fashion. Using computing device 525, various views of the three-dimensional model or
enhanced two-dimensional images generated from the three-dimensional model are
presented. The enhanced two-dimensional images may possess some three-dimensional capabilities because they are generated from three-dimensional data. The three-dimensional model may be manipulated to facilitate identification of a target on the three-dimensional model or two-dimensional images, and selection of a suitable pathway through the patient's "P's" airways to access tissue located at the target can be made. Once selected, the pathway plan, three-dimensional model, and images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s). One such planning software is the ILOGIC® planning suite currently sold by Medtronic PLC.
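As a simplified sketch of the airway-model generation, assuming airways appear as low-attenuation voxels connected to a seed placed in the trachea, a three-dimensional airway mask can be extracted by thresholded connected-component analysis:

```python
import numpy as np
from scipy import ndimage

def segment_airways(ct_hu, seed, air_threshold_hu=-950):
    """Seeded airway segmentation by thresholded connected components.

    ct_hu: 3D CT volume in Hounsfield units.
    seed:  (i, j, k) voxel assumed to lie inside the trachea (air).
    """
    air = ct_hu < air_threshold_hu            # candidate air voxels
    labels, _ = ndimage.label(air)            # 3D connected components
    return labels == labels[tuple(seed)]      # keep air connected to the seed
```

A pathway to the target could then be computed, for example, by a graph search over the skeleton of this mask; those steps are omitted from the sketch.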
[00109] With respect to the navigation phase, a six degrees-of-freedom
electromagnetic locating or tracking system 550, e.g., similar to those disclosed in U.S.
Patent Nos. 8,467,589, 6,188,355, and published PCT Application Nos. WO 00/10456 and
WO 01/67035, the entire contents of each of which are incorporated herein by reference,
or other suitable system for determining location, is utilized for performing registration of
the images and the pathway for navigation, although other configurations are also
contemplated. Tracking system 550 includes a locating or tracking module 552, a plurality
of reference sensors 554, and a transmitter mat 556 (coupled with the structure of markers).
Tracking system 550 is configured for use with a locatable guide 532 and particularly
sensor 544. As described above, locatable guide 532 and sensor 544 are configured for
insertion through EWC 512 into a patient's "P's" airways (either with or without
bronchoscope 530) and are selectively lockable relative to one another via a locking
mechanism.
[00110] Transmitter mat 556 is positioned beneath patient "P." Transmitter mat
556 generates an electromagnetic field around at least a portion of the patient "P" within
which the position of a plurality of reference sensors 554 and the sensor element 544 can
be determined with use of a tracking module 552. One or more of reference sensors 554
are attached to the chest of the patient "P." The six degrees of freedom coordinates of reference sensors 554 are sent to computing device 525 (which includes the appropriate software) where they are used to calculate a patient coordinate frame of reference.
Registration is generally performed to coordinate locations of the three-dimensional model
and two-dimensional images from the planning phase, with the patient's "P's" airways as
observed through the bronchoscope 530, and allow for the navigation phase to be
undertaken with precise knowledge of the location of the sensor 544, even in portions of
the airway where the bronchoscope 530 cannot reach. Further details of such a registration
technique and their implementation in luminal navigation can be found in U.S. Patent
Application Pub. No. 2011/0085720, the entire content of which is incorporated herein by
reference, although other suitable techniques are also contemplated.
[00111] Registration of the patient's "P's" location on the transmitter mat 556 is
performed by moving LG 532 through the airways of the patient "P." More specifically,
data pertaining to locations of sensor 544, while locatable guide 532 is moving through
the airways, is recorded using transmitter mat 556, reference sensors 554, and tracking
module 552. A shape resulting from this location data is compared to an interior geometry
of passages of the three-dimensional model generated in the planning phase, and a location
correlation between the shape and the three-dimensional model based on the comparison
is determined, e.g., utilizing the software on computing device 525. In addition, the
software identifies non-tissue space (e.g., air filled cavities) in the three-dimensional
model. The software aligns, or registers, an image representing a location of sensor 544
with the three-dimensional model and/or two-dimensional images generated from the
three-dimension model, which are based on the recorded location data and an assumption
that locatable guide 532 remains located in non-tissue space in the patient's "P's" airways.
Alternatively, a manual registration technique may be employed by navigating the
bronchoscope 530 with the sensor 544 to pre-specified locations in the lungs of the patient
"P", and manually correlating the images from the bronchoscope to the model data of the
three-dimensional model.
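Under the simplifying assumption that correspondences between the recorded sensor locations and model points are already known, the alignment step of such a registration can be illustrated by the closed-form Kabsch solution below; an actual survey registration would iterate correspondence search and alignment (e.g., ICP-style):

```python
import numpy as np

def kabsch(tracked_pts, model_pts):
    """Rigid transform (R, t) mapping tracked sensor points onto model points.

    tracked_pts, model_pts: Nx3 arrays of corresponding points.
    """
    p, q = np.asarray(tracked_pts, float), np.asarray(model_pts, float)
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t
```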
[00112] Following registration of the patient "P" to the image data and pathway
plan, a user interface is displayed in the navigation software which sets forth the pathway that the clinician is to follow to reach the target. One such navigation software is the ILOGIC® navigation suite currently sold by Medtronic PLC.
[00113] Once EWC 512 has been successfully navigated proximate the target as
depicted on the user interface, the locatable guide 532 may be unlocked from EWC 512
and removed, leaving EWC 512 in place as a guide channel for guiding medical devices
including without limitation, optical systems, ultrasound probes, marker placement tools,
biopsy tools, ablation tools (e.g., microwave ablation devices), laser probes, cryogenic
probes, sensor probes, and aspirating needles to the target.
[00114] A medical device may then be inserted through EWC 512 and navigated to the target or to a specific area adjacent to the target. A sequence of fluoroscopic images may then be acquired via fluoroscopic imaging device 510, optionally by a user and according to directions displayed via computing device 525. A fluoroscopic 3D reconstruction may then be generated via computing device 525. The generation of the fluoroscopic 3D reconstruction is based on the sequence of fluoroscopic images and the projections of structure of markers 556 on the sequence of images. One or more virtual fluoroscopic images may then be generated based on the pre-operative CT scan and via computing device 525. The one or more virtual fluoroscopic images and the fluoroscopic 3D reconstruction may then be displayed to the user on a display via computing device 525, optionally simultaneously. The user may then be directed to identify and mark the target while using the virtual fluoroscopic image as a reference. The user may also be directed to identify and mark the medical device in the sequence of two-dimensional fluoroscopic images. An offset between the location of the target and the medical device may then be determined or calculated via computing device 525. The offset may then be utilized, via computing device 525, to correct the location of the medical device on the display with respect to the target, and/or to correct the registration between the three-dimensional model and tracking system 550 in the area of the target, and/or to generate a local registration between the three-dimensional model and the fluoroscopic 3D reconstruction in the target area.
[00115] System 500, or a similar version of it, in conjunction with the method of Fig. 4, may be used in various procedures other than ENB procedures, with the required modifications, such as laparoscopy or robotic-assisted surgery.
[00116] The terms "tracking" and "localization", as used herein, may be used interchangeably. Although the present disclosure specifically describes the use of an EM
tracking system to navigate or determine the location of a medical device, various tracking
systems or localization systems may be used or applied with respect to the methods and
systems disclosed herein. Such tracking, localization or navigation systems may use
various methodologies, including electromagnetic, infrared, echolocation, optical or
imaging-based methodologies. Such systems may be based on pre-operative imaging
and/or real-time imaging.
[00117] In some embodiments, a standard fluoroscope may be employed to facilitate navigation and tracking of the medical device, as disclosed, for example, in U.S.
Patent No. 9,743,896 to Averbuch. For example, such fluoroscopy-based localization or
navigation methodology may be applied in addition to or instead of the above-mentioned
EM tracking methodology, e.g., as described with respect to Fig. 5, to facilitate or enhance
navigation of the medical device.
[00118] From the foregoing and with reference to the various figures, those
skilled in the art will appreciate that certain modifications can also be made to the present
disclosure without departing from the scope of the same.
[00119] Detailed embodiments of the present disclosure are disclosed herein.
However, the disclosed embodiments are merely examples of the disclosure, which may
be embodied in various forms and aspects. Therefore, specific structural and functional
details disclosed herein are not to be interpreted as limiting, but merely as a basis for the
claims and as a representative basis for teaching one skilled in the art to variously employ
the present disclosure in virtually any appropriately detailed structure.
[00120] While several embodiments of the disclosure have been shown in the
drawings, it is not intended that the disclosure be limited thereto, as it is intended that the
disclosure be as broad in scope as the art will allow and that the specification be read
likewise. Therefore, the above description should not be construed as limiting, but merely
as exemplifications of embodiments. Those skilled in the art will envision other
modifications within the scope and spirit of the claims appended hereto.
Claims (5)
1. A system for facilitating identification and marking of a target in a target area in a
fluoroscopic image of a body region of a patient, the system comprising:
(i) one or more storage devices having stored thereon instructions for:
receiving a CT scan of the body region of the patient, wherein the CT scan includes a
marking of the target;
receiving a sequence of fluoroscopic images including the target area acquired in real
time about a plurality of angles relative to the target, while a medical device is positioned in
proximity to the target; and
generating a fluoroscopic 3D reconstruction based on at least a portion of the sequence
of fluoroscopic images;
generating at least one virtual fluoroscopic image based on the CT scan of the patient,
wherein the virtual fluoroscopic image includes the target and the marking of the target,
receiving a selection of the target in the fluoroscopic 3D reconstruction via a user
input;
receiving a selection of the medical device in the 3D reconstruction or the sequence
of fluoroscopic images;
determining an offset of the medical device with respect to the target based on the
selections of the target and the medical device;
determining a location of the medical device within the patient based on data provided
by a tracking system;
displaying the target area and the location of the medical device with respect to the
target on a display; and
correcting the display of the location of the medical device with respect to the target
based on the offset between the medical device and the target; and
(ii) at least one hardware processor configured to execute said instructions.
2. The system of claim 1, wherein the one or more storage devices have stored thereon
further instructions for directing the user to identify and mark the target in the fluoroscopic
3D reconstruction while using the virtual fluoroscopic image as a reference.
3. The system of claim 2, wherein the user is directed to identify and mark the target in
two fluoroscopic slice images of the fluoroscopic 3D reconstruction.
4. The system of any one of the preceding claims, wherein the generating of the at least
one virtual fluoroscopic image comprises:
generating virtual fluoroscope poses around the target by simulating a fluoroscope
trajectory while scanning the target;
generating virtual fluoroscopic images by projecting the CT scan according to the
virtual fluoroscope poses;
generating a virtual fluoroscopic 3D reconstruction based on the virtual fluoroscopic
images; and
selecting a slice image from the virtual fluoroscopic 3D reconstruction which
comprises the marking of the target.
5. The system of any one of the preceding claims, wherein the target is a soft-tissue
target.
Covidien LP
Patent Attorneys for the Applicant/Nominated Person
SPRUSON & FERGUSON
Applications Claiming Priority (11)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762526798P | 2017-06-29 | 2017-06-29 | |
| US62/526,798 | 2017-06-29 | ||
| US201762570431P | 2017-10-10 | 2017-10-10 | |
| US62/570,431 | 2017-10-10 | ||
| US201862628017P | 2018-02-08 | 2018-02-08 | |
| US62/628,017 | 2018-02-08 | ||
| US201862641777P | 2018-03-12 | 2018-03-12 | |
| US62/641,777 | 2018-03-12 | ||
| US16/022,222 US10699448B2 (en) | 2017-06-29 | 2018-06-28 | System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data |
| US16/022,222 | 2018-06-28 | ||
| PCT/US2018/040222 WO2019006258A1 (en) | 2017-06-29 | 2018-06-29 | System and method for identifying, marking and navigating to a target using real-time two-dimensional fluoroscopic data |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU2018290995A1 AU2018290995A1 (en) | 2019-11-28 |
| AU2018290995B2 true AU2018290995B2 (en) | 2022-07-07 |
Family
ID=64734857
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2018290995A Ceased AU2018290995B2 (en) | 2017-06-29 | 2018-06-29 | System and method for identifying, marking and navigating to a target using real-time two-dimensional fluoroscopic data |
Country Status (6)
| Country | Link |
|---|---|
| US (4) | US10699448B2 (en) |
| EP (1) | EP3646289A4 (en) |
| JP (1) | JP7277386B2 (en) |
| CN (2) | CN117252948A (en) |
| AU (1) | AU2018290995B2 (en) |
| WO (1) | WO2019006258A1 (en) |
Families Citing this family (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10674982B2 (en) | 2015-08-06 | 2020-06-09 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
| US10702226B2 (en) | 2015-08-06 | 2020-07-07 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
| EP3429475B1 (en) | 2016-03-13 | 2021-12-15 | Vuze Medical Ltd. | Apparatus for use with skeletal procedures |
| WO2019012520A1 (en) | 2017-07-08 | 2019-01-17 | Vuze Medical Ltd. | Apparatus and methods for use with image-guided skeletal procedures |
| US10893843B2 (en) | 2017-10-10 | 2021-01-19 | Covidien Lp | System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction |
| AU2019200594B2 (en) * | 2018-02-08 | 2020-05-28 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
| US10893842B2 (en) * | 2018-02-08 | 2021-01-19 | Covidien Lp | System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target |
| US11344371B2 (en) * | 2018-10-19 | 2022-05-31 | Canon U.S.A., Inc. | Visualization of three-dimensional image data on a two-dimensional image |
| US11925333B2 (en) | 2019-02-01 | 2024-03-12 | Covidien Lp | System for fluoroscopic tracking of a catheter to update the relative position of a target and the catheter in a 3D model of a luminal network |
| US11564751B2 (en) * | 2019-02-01 | 2023-01-31 | Covidien Lp | Systems and methods for visualizing navigation of medical devices relative to targets |
| JP7311859B2 (en) * | 2019-03-25 | 2023-07-20 | 株式会社日立製作所 | Moving body tracking device, radiotherapy system, program, and moving body tracking method |
| US11627924B2 (en) * | 2019-09-24 | 2023-04-18 | Covidien Lp | Systems and methods for image-guided navigation of percutaneously-inserted devices |
| US20210169583A1 (en) * | 2019-12-04 | 2021-06-10 | Covidien Lp | Method for maintaining localization of distal catheter tip to target during ventilation and/or cardiac cycles |
| US11847730B2 (en) * | 2020-01-24 | 2023-12-19 | Covidien Lp | Orientation detection in fluoroscopic images |
| JP7454435B2 (en) * | 2020-04-15 | 2024-03-22 | キヤノンメディカルシステムズ株式会社 | Medical image processing device and medical image processing method |
| US12347100B2 (en) | 2020-11-19 | 2025-07-01 | Mazor Robotics Ltd. | Systems and methods for generating virtual images |
| CN116887778A (en) * | 2020-12-30 | 2023-10-13 | 直观外科手术操作公司 | System for integrating intraoperative image data with minimally invasive medical techniques |
| US12193759B2 (en) | 2020-12-30 | 2025-01-14 | Canon U.S.A., Inc. | Real-time correction of regional tissue deformation during endoscopy procedure |
| CN113112560B (en) * | 2021-04-14 | 2023-10-03 | 杭州柳叶刀机器人有限公司 | Physiological point region marking method and device |
| US20230145801A1 (en) * | 2021-11-10 | 2023-05-11 | Covidien Lp | Systems and methods of visualizing a medical device relative to a target |
| CN116433874B (en) * | 2021-12-31 | 2024-07-30 | 杭州堃博生物科技有限公司 | Bronchoscope navigation method, device, equipment and storage medium |
| CN114581635B (en) * | 2022-03-03 | 2023-03-24 | 上海涞秋医疗科技有限责任公司 | Positioning method and system based on HoloLens glasses |
| US20240023917A1 (en) * | 2022-07-21 | 2024-01-25 | Covidien Lp | Handling respiration during navigational bronchoscopy |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140046175A1 (en) * | 2012-08-07 | 2014-02-13 | Covidien Lp | Microwave ablation catheter and method of utilizing the same |
| US20170035380A1 (en) * | 2015-08-06 | 2017-02-09 | Covidien Lp | System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction |
Family Cites Families (386)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4686695A (en) | 1979-02-05 | 1987-08-11 | Board Of Trustees Of The Leland Stanford Junior University | Scanned x-ray selective imaging system |
| US5057494A (en) | 1988-08-03 | 1991-10-15 | Ethicon, Inc. | Method for preventing tissue damage after an ischemic episode |
| EP0419729A1 (en) | 1989-09-29 | 1991-04-03 | Siemens Aktiengesellschaft | Position finding of a catheter by means of non-ionising fields |
| US5376795A (en) | 1990-07-09 | 1994-12-27 | Regents Of The University Of California | Emission-transmission imaging system using single energy and dual energy transmission and radionuclide emission data |
| EP0931516B1 (en) | 1990-10-19 | 2008-08-20 | St. Louis University | Surgical probe locating system for head use |
| US6405072B1 (en) | 1991-01-28 | 2002-06-11 | Sherwood Services Ag | Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus |
| US5251635A (en) | 1991-09-03 | 1993-10-12 | General Electric Company | Stereoscopic X-ray fluoroscopy system using radiofrequency fields |
| US5647361A (en) | 1992-09-28 | 1997-07-15 | Fonar Corporation | Magnetic resonance imaging method and apparatus for guiding invasive therapy |
| AU6666894A (en) | 1993-04-22 | 1994-11-08 | Pixsys, Inc. | System for locating relative positions of objects |
| US5321113A (en) | 1993-05-14 | 1994-06-14 | Ethicon, Inc. | Copolymers of an aromatic anhydride and aliphatic ester |
| US5803089A (en) | 1994-09-15 | 1998-09-08 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications |
| US5829444A (en) | 1994-09-15 | 1998-11-03 | Visualization Technology, Inc. | Position tracking and imaging system for use in medical applications |
| DE19512819C2 (en) | 1995-04-05 | 1999-05-27 | Siemens Ag | X-ray computer tomograph |
| US5588033A (en) | 1995-06-06 | 1996-12-24 | St. Jude Children's Research Hospital | Method and apparatus for three dimensional image reconstruction from multiple stereotactic or isocentric backprojections |
| US5638819A (en) | 1995-08-29 | 1997-06-17 | Manwaring; Kim H. | Method and apparatus for guiding an instrument to a target |
| US5772594A (en) | 1995-10-17 | 1998-06-30 | Barrick; Earl F. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
| US6122549A (en) | 1996-08-13 | 2000-09-19 | Oratec Interventions, Inc. | Apparatus for treating intervertebral discs with resistive energy |
| US5744802A (en) | 1995-10-25 | 1998-04-28 | Adac Laboratories | Image generation from limited projections in positron emission tomography using multi-slice rebinning |
| ES2210498T3 (en) | 1996-02-15 | 2004-07-01 | Biosense, Inc. | POSITIONABLE TRANSDUCERS INDEPENDENTLY FOR LOCATION SYSTEM. |
| DE19620371A1 (en) | 1996-05-21 | 1997-12-04 | Philips Patentverwaltung | X-ray procedure |
| US5902239A (en) | 1996-10-30 | 1999-05-11 | U.S. Philips Corporation | Image guided surgery system including a unit for transforming patient positions to image positions |
| DE19703556A1 (en) | 1997-01-31 | 1998-08-06 | Philips Patentverwaltung | Method and arrangement for determining the position in X-ray imaging |
| US6314310B1 (en) | 1997-02-14 | 2001-11-06 | Biosense, Inc. | X-ray guided surgical location system with extended mapping volume |
| US6580938B1 (en) | 1997-02-25 | 2003-06-17 | Biosense, Inc. | Image-guided thoracic therapy and apparatus therefor |
| US6038282A (en) | 1997-04-30 | 2000-03-14 | Siemens Aktiengesellschaft | X-ray imaging system |
| US6055449A (en) | 1997-09-22 | 2000-04-25 | Siemens Corporate Research, Inc. | Method for localization of a biopsy needle or similar surgical tool in a radiographic image |
| US5909476A (en) | 1997-09-22 | 1999-06-01 | University Of Iowa Research Foundation | Iterative process for reconstructing cone-beam tomographic images |
| US5930329A (en) | 1997-09-22 | 1999-07-27 | Siemens Corporate Research, Inc. | Apparatus and method for detection and localization of a biopsy needle or similar surgical tool in a radiographic image |
| US5951475A (en) | 1997-09-25 | 1999-09-14 | International Business Machines Corporation | Methods and apparatus for registering CT-scan data to multiple fluoroscopic images |
| DE19746093C2 (en) | 1997-10-17 | 2002-10-17 | Siemens Ag | C-arm X-ray device |
| DE19746092C2 (en) | 1997-10-17 | 2002-09-05 | Siemens Ag | X-ray imaging device for 3D imaging |
| US6461370B1 (en) | 1998-11-03 | 2002-10-08 | C. R. Bard, Inc. | Temporary vascular filter guide wire |
| PT1028877E (en) | 1997-11-14 | 2003-01-31 | Continental Teves Ag & Co Ohg | BRAKING FORK TRANSMISSION DEVICE FOR AUTOMOTIVE VEHICLES |
| KR100280198B1 (en) | 1997-11-24 | 2001-02-01 | 이민화 | X-ray imaging apparatus and method capable of CT imaging |
| US6149592A (en) | 1997-11-26 | 2000-11-21 | Picker International, Inc. | Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data |
| IL122578A (en) | 1997-12-12 | 2000-08-13 | Super Dimension Ltd | Wireless six-degree-of-freedom locator |
| US6731283B1 (en) | 1997-12-31 | 2004-05-04 | Siemens Corporate Research, Inc. | C-arm calibration method utilizing aplanar transformation for 3D reconstruction in an imaging system |
| US6289235B1 (en) | 1998-03-05 | 2001-09-11 | Wake Forest University | Method and system for creating three-dimensional images using tomosynthetic computed tomography |
| JP3743594B2 (en) | 1998-03-11 | 2006-02-08 | 株式会社モリタ製作所 | CT imaging device |
| US6003517A (en) | 1998-04-30 | 1999-12-21 | Ethicon Endo-Surgery, Inc. | Method for using an electrosurgical device on lung tissue |
| US6118845A (en) | 1998-06-29 | 2000-09-12 | Surgical Navigation Technologies, Inc. | System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers |
| FR2781140B1 (en) | 1998-07-17 | 2000-11-10 | Ge Medical Syst Sa | METHOD FOR POSITIONING A RADIOLOGY APPARATUS |
| US6081577A (en) | 1998-07-24 | 2000-06-27 | Wake Forest University | Method and system for creating task-dependent three-dimensional images |
| WO2000010456A1 (en) | 1998-08-02 | 2000-03-02 | Super Dimension Ltd. | Intrabody navigation system for medical applications |
| US6477400B1 (en) | 1998-08-20 | 2002-11-05 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
| EP1115328A4 (en) | 1998-09-24 | 2004-11-10 | Super Dimension Ltd | System and method for determining the location of a catheter during an intra-body medical procedure |
| US6092928A (en) | 1998-11-12 | 2000-07-25 | Picker International, Inc. | Apparatus and method to determine the relative position of a detector array and an x-ray tube focal spot |
| US7016457B1 (en) | 1998-12-31 | 2006-03-21 | General Electric Company | Multimode imaging system for generating high quality images |
| JP2000201920A (en) | 1999-01-19 | 2000-07-25 | Fuji Photo Film Co Ltd | Photographed image data acquiring method and photographed image data acquiring device |
| JP4473358B2 (en) | 1999-01-21 | 2010-06-02 | 株式会社東芝 | Diagnostic equipment |
| US6285902B1 (en) | 1999-02-10 | 2001-09-04 | Surgical Insights, Inc. | Computer assisted targeting device for use in orthopaedic surgery |
| US6285739B1 (en) | 1999-02-19 | 2001-09-04 | The Research Foundation Of State University Of New York | Radiographic imaging apparatus and method for vascular interventions |
| US6470207B1 (en) | 1999-03-23 | 2002-10-22 | Surgical Navigation Technologies, Inc. | Navigational guidance via computer-assisted fluoroscopic imaging |
| DE19919907C2 (en) | 1999-04-30 | 2003-10-16 | Siemens Ag | Method and device for catheter navigation in three-dimensional vascular tree images |
| US8442618B2 (en) | 1999-05-18 | 2013-05-14 | Mediguide Ltd. | Method and system for delivering a medical device to a selected position within a lumen |
| US7343195B2 (en) | 1999-05-18 | 2008-03-11 | Mediguide Ltd. | Method and apparatus for real time quantitative three-dimensional image reconstruction of a moving organ and intra-body navigation |
| US6139544A (en) | 1999-05-26 | 2000-10-31 | Endocare, Inc. | Computer guided cryosurgery |
| US6236704B1 (en) | 1999-06-30 | 2001-05-22 | Siemens Corporate Research, Inc. | Method and apparatus using a virtual detector for three-dimensional reconstruction from x-ray images |
| DE19936364A1 (en) | 1999-08-03 | 2001-02-15 | Siemens Ag | Identification and localisation of marks in a 3D medical scanning process |
| DE19936408B4 (en) | 1999-08-03 | 2005-09-01 | Siemens Ag | Mobile X-ray machine |
| US6608081B2 (en) | 1999-08-12 | 2003-08-19 | Ortho-Mcneil Pharmaceutical, Inc. | Bicyclic heterocyclic substituted phenyl oxazolidinone antibacterials, and related compositions and methods |
| US6413981B1 (en) | 1999-08-12 | 2002-07-02 | Ortho-Mcneil Pharamceutical, Inc. | Bicyclic heterocyclic substituted phenyl oxazolidinone antibacterials, and related compositions and methods |
| US6307908B1 (en) | 1999-09-20 | 2001-10-23 | General Electric Company | System and method for data interpolation in a multislice x-ray computed tomography system |
| FR2799028B1 (en) | 1999-09-27 | 2002-05-03 | Ge Medical Syst Sa | METHOD FOR RECONSTRUCTING A THREE-DIMENSIONAL IMAGE OF ELEMENTS OF STRONG CONTRAST |
| AU2001224721A1 (en) | 2000-01-10 | 2001-08-07 | Super Dimension Ltd. | Methods and systems for performing medical procedures with reference to projective images and with respect to pre-stored images |
| US7689014B2 (en) | 2000-01-18 | 2010-03-30 | Z-Kat Inc | Apparatus and method for measuring anatomical objects using coordinated fluoroscopy |
| DE10003524B4 (en) | 2000-01-27 | 2006-07-13 | Siemens Ag | Mobile X-ray device and method for the determination of projection geometries |
| AU2001241008A1 (en) | 2000-03-09 | 2001-09-17 | Super Dimension Ltd. | Object tracking using a single sensor or a pair of sensors |
| US6856827B2 (en) | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
| US6856826B2 (en) | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
| US6490475B1 (en) | 2000-04-28 | 2002-12-03 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
| US6782287B2 (en) | 2000-06-27 | 2004-08-24 | The Board Of Trustees Of The Leland Stanford Junior University | Method and apparatus for tracking a medical instrument based on image registration |
| US6351513B1 (en) | 2000-06-30 | 2002-02-26 | Siemens Corporate Research, Inc. | Fluoroscopy based 3-D neural navigation based on co-registration of other modalities with 3-D angiography reconstruction data |
| US6389104B1 (en) | 2000-06-30 | 2002-05-14 | Siemens Corporate Research, Inc. | Fluoroscopy based 3-D neural navigation based on 3-D angiography reconstruction data |
| US6750034B1 (en) | 2000-06-30 | 2004-06-15 | Ortho-Mcneil Pharmaceutical, Inc. | DNA encoding human serine protease D-G |
| DE10033063A1 (en) | 2000-07-07 | 2002-01-24 | Brainlab Ag | Respiration compensated radiation treatment tracks target volume using markers and switches beam |
| US6823207B1 (en) | 2000-08-26 | 2004-11-23 | Ge Medical Systems Global Technology Company, Llc | Integrated fluoroscopic surgical navigation and imaging workstation with command protocol |
| US6714810B2 (en) | 2000-09-07 | 2004-03-30 | Cbyon, Inc. | Fluoroscopic registration system and method |
| WO2002022072A2 (en) | 2000-09-11 | 2002-03-21 | Closure Medical Corporation | Bronchial occlusion method and apparatus |
| DE10047382C2 (en) | 2000-09-25 | 2003-12-18 | Siemens Ag | X-ray calibration phantom, method for markerless registration for navigation-guided interventions using the X-ray calibration phantom and medical system comprising such an X-ray calibration phantom |
| DE10051370A1 (en) | 2000-10-17 | 2002-05-02 | Brainlab Ag | Method and appliance for exact positioning of patient for radiation therapy and radio surgery with which only one camera is used to determine and compensate for positional error |
| US7778685B2 (en) | 2000-10-18 | 2010-08-17 | Paieon Inc. | Method and system for positioning a device in a tubular organ |
| US6472372B1 (en) | 2000-12-06 | 2002-10-29 | Ortho-Mcneil Pharmaceuticals, Inc. | 6-O-Carbamoyl ketolide antibacterials |
| US6666579B2 (en) | 2000-12-28 | 2003-12-23 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system |
| EP1357850A1 (en) | 2001-02-07 | 2003-11-05 | SYNTHES AG Chur | Method for establishing a three-dimensional representation of bone x-ray images |
| FR2820629B1 (en) | 2001-02-12 | 2003-09-05 | Ge Med Sys Global Tech Co Llc | METHOD FOR CALIBRATING AN ASSISTANCE SYSTEM FOR SURGICAL INTERVENTION OF THE MEDICAL IMAGING TYPE |
| US6785571B2 (en) | 2001-03-30 | 2004-08-31 | Neil David Glossop | Device and method for registering a position sensor in an anatomical body |
| FR2823968B1 (en) | 2001-04-27 | 2005-01-14 | Ge Med Sys Global Tech Co Llc | CALIBRATION METHOD OF IMAGING SYSTEM, MEMORY MEDIUM AND ASSOCIATED DEVICE |
| AU2002308732A1 (en) | 2001-05-15 | 2002-11-25 | Ortho-Mcneil Pharmaceutical, Inc. | Ex-vivo priming for generating cytotoxic t lymphocytes specific for non-tumor antigens to treat autoimmune and allergic disease |
| US20030014093A1 (en) | 2001-05-29 | 2003-01-16 | Makin Inder Raj. S. | Excisional and ultrasound medical treatment system |
| US7607440B2 (en) | 2001-06-07 | 2009-10-27 | Intuitive Surgical, Inc. | Methods and apparatus for surgical planning |
| WO2003032837A1 (en) | 2001-10-12 | 2003-04-24 | University Of Florida | Computer controlled guidance of a biopsy needle |
| US6768784B1 (en) | 2001-11-07 | 2004-07-27 | Koninklijke Philips Electronics N.V. | X-ray image enhancement |
| DE10155590A1 (en) | 2001-11-13 | 2003-05-15 | Philips Corp Intellectual Pty | Fluoroscopic computed tomography procedure |
| US7010152B2 (en) | 2002-01-22 | 2006-03-07 | Canon Kabushiki Kaisha | Radiographic image composition and use |
| US6774624B2 (en) | 2002-03-27 | 2004-08-10 | Ge Medical Systems Global Technology Company, Llc | Magnetic tracking system |
| DE10215808B4 (en) | 2002-04-10 | 2005-02-24 | Siemens Ag | Registration procedure for navigational procedures |
| US6707878B2 (en) | 2002-04-15 | 2004-03-16 | General Electric Company | Generalized filtered back-projection reconstruction in digital tomosynthesis |
| US7233820B2 (en) | 2002-04-17 | 2007-06-19 | Superdimension Ltd. | Endoscope structures and techniques for navigating to a target in branched structure |
| EP1501411B1 (en) | 2002-04-22 | 2014-03-12 | Johns Hopkins University | Apparatus for insertion of a medical device during a medical imaging process |
| US7787932B2 (en) | 2002-04-26 | 2010-08-31 | Brainlab Ag | Planning and navigation assistance using two-dimensionally adapted generic and detected patient data |
| US7165362B2 (en) | 2002-07-15 | 2007-01-23 | Apple Computer, Inc. | Glass support member |
| US7356115B2 (en) | 2002-12-04 | 2008-04-08 | Varian Medical Systems Technology, Inc. | Radiation scanning units including a movable platform |
| MXPA03006874A (en) | 2002-07-31 | 2004-09-03 | Johnson & Johnson | Long term oxygen therapy system. |
| WO2004019279A2 (en) | 2002-08-21 | 2004-03-04 | Breakaway Imaging, Llc | Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system |
| US7251522B2 (en) | 2002-09-12 | 2007-07-31 | Brainlab Ag | X-ray image-assisted navigation using original, two-dimensional x-ray images |
| DE10243162B4 (en) | 2002-09-17 | 2005-10-06 | Siemens Ag | Computer-aided display method for a 3D object |
| DE10245669B4 (en) | 2002-09-30 | 2006-08-17 | Siemens Ag | A method for intraoperatively generating an updated volume data set |
| US6928142B2 (en) | 2002-10-18 | 2005-08-09 | Koninklijke Philips Electronics N.V. | Non-invasive plaque detection using combined nuclear medicine and x-ray system |
| US6898263B2 (en) | 2002-11-27 | 2005-05-24 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for soft-tissue volume visualization |
| FR2848806B1 (en) | 2002-12-18 | 2005-11-04 | Ge Med Sys Global Tech Co Llc | METHOD OF CALIBRATING A RADIOLOGICAL IMAGING APPARATUS REQUIRING A LIMITED NUMBER OF ACQUISITIONS |
| US20040120981A1 (en) | 2002-12-20 | 2004-06-24 | Aruna Nathan | Crosslinked alkyd polyesters for medical applications |
| US7048440B2 (en) | 2003-03-12 | 2006-05-23 | Siemens Aktiengesellschaft | C-arm x-ray device |
| EP1606770B1 (en) | 2003-03-14 | 2010-08-11 | Koninklijke Philips Electronics N.V. | Motion-corrected three-dimensional volume imaging method |
| CN100591686C (en) | 2003-04-30 | 2010-02-24 | 森托科尔公司 | CNGH0010-specific polynucleotides, polypeptides, antibodies, compositions, methods and uses |
| JP4200811B2 (en) | 2003-05-16 | 2008-12-24 | 株式会社島津製作所 | Radiation therapy planning device |
| DE10322738A1 (en) | 2003-05-20 | 2004-12-16 | Siemens Ag | Markerless automatic 2D C scan and preoperative 3D image fusion procedure for medical instrument use uses image based registration matrix generation |
| DE10323008A1 (en) | 2003-05-21 | 2004-12-23 | Siemens Ag | Automatic fusion of 2D fluoroscopic C-frame X-ray images with preoperative 3D images using navigation markers, by use of a projection matrix based on a 2D fluoroscopy image and a defined reference navigation system |
| US7099431B2 (en) | 2003-06-09 | 2006-08-29 | Canon Kabushiki Kaisha | Radiation imaging apparatus |
| FR2856170B1 (en) | 2003-06-10 | 2005-08-26 | Biospace Instr | RADIOGRAPHIC IMAGING METHOD FOR THREE-DIMENSIONAL RECONSTRUCTION, DEVICE AND COMPUTER PROGRAM FOR IMPLEMENTING SAID METHOD |
| US7186023B2 (en) | 2003-06-10 | 2007-03-06 | Shimadzu Corporation | Slice image and/or dimensional image creating method |
| WO2004110271A1 (en) | 2003-06-16 | 2004-12-23 | Philips Intellectual Property & Standards Gmbh | Imaging system for interventional radiology |
| US7482376B2 (en) | 2003-07-03 | 2009-01-27 | 3-Dimensional Pharmaceuticals, Inc. | Conjugated complement cascade inhibitors |
| MXPA05014238A (en) | 2003-07-04 | 2006-03-09 | Johnson & Johnson Res Pty Ltd | Method for detection of alkylated cytosine in dna. |
| WO2005015125A1 (en) | 2003-08-08 | 2005-02-17 | University Health Network | Method and system for calibrating a source and detector instrument |
| US6944260B2 (en) | 2003-11-11 | 2005-09-13 | Ge Medical Systems Global Technology Company, Llc | Methods and apparatus for artifact reduction in computed tomography imaging systems |
| US20050143777A1 (en) * | 2003-12-19 | 2005-06-30 | Sra Jasbir S. | Method and system of treatment of heart failure using 4D imaging |
| DE102004004620A1 (en) | 2004-01-29 | 2005-08-25 | Siemens Ag | Medical x-ray imaging method for recording an examination area for use in medical navigational procedures, whereby a spatial position of an examination area is recorded just prior to each shot and images then spatially compensated |
| EP1718202B1 (en) | 2004-02-18 | 2012-08-01 | Philips Intellectual Property & Standards GmbH | Device and method for the determination of the position of a catheter in a vascular system |
| US7035371B2 (en) | 2004-03-22 | 2006-04-25 | Siemens Aktiengesellschaft | Method and device for medical imaging |
| JP2008505852A (en) | 2004-03-29 | 2008-02-28 | ジヤンセン・フアーマシユーチカ・ナームローゼ・フエンノートシヤツプ | PROKINETICIN 2β PEPTIDE AND USE THEREOF |
| DE102004016586A1 (en) | 2004-03-31 | 2005-11-03 | Siemens Ag | Image reconstruction device for an X-ray device and method for local 3D reconstruction of an object region |
| US7142633B2 (en) | 2004-03-31 | 2006-11-28 | General Electric Company | Enhanced X-ray imaging system and method |
| US7620223B2 (en) | 2004-04-22 | 2009-11-17 | Siemens Medical Solutions Usa, Inc. | Method and system for registering pre-procedural images with intra-procedural images using a pre-computed knowledge base |
| US7567834B2 (en) | 2004-05-03 | 2009-07-28 | Medtronic Navigation, Inc. | Method and apparatus for implantation between two vertebral bodies |
| US7097357B2 (en) | 2004-06-02 | 2006-08-29 | General Electric Company | Method and system for improved correction of registration error in a fluoroscopic image |
| DE102004030836A1 (en) | 2004-06-25 | 2006-01-26 | Siemens Ag | Process for the image representation of a medical instrument, in particular a catheter, introduced into a region of examination of a patient that moves rhythmically or arrhythmically |
| EP1781174A4 (en) | 2004-08-16 | 2009-08-05 | Corindus Ltd | IMAGE-GUIDED NAVIGATION FOR INTERVENTIONS INVOLVING THE INSTALLATION OF A CATHETER |
| US7327872B2 (en) | 2004-10-13 | 2008-02-05 | General Electric Company | Method and system for registering 3D models of anatomical regions with projection images of the same |
| US8515527B2 (en) | 2004-10-13 | 2013-08-20 | General Electric Company | Method and apparatus for registering 3D models of anatomical regions of a heart and a tracking system with projection images of an interventional fluoroscopic system |
| BRPI0518437A2 (en) | 2004-11-16 | 2008-11-18 | Brian Cran | lung treatment device and method |
| MX2007006441A (en) | 2004-11-30 | 2007-08-14 | Johnson & Johnson | Lung cancer prognostics. |
| US7720520B2 (en) | 2004-12-01 | 2010-05-18 | Boston Scientific Scimed, Inc. | Method and system for registering an image with a navigation reference catheter |
| WO2006063141A2 (en) | 2004-12-07 | 2006-06-15 | Medical Metrx Solutions, Inc. | Intraoperative c-arm fluoroscope datafusion system |
| JP4649219B2 (en) | 2005-02-01 | 2011-03-09 | キヤノン株式会社 | Stereo image generator |
| US7359477B2 (en) | 2005-02-15 | 2008-04-15 | Siemens Aktiengesellschaft | Method for reconstructing a CT image using an algorithm for a short-scan circle combined with various lines |
| FR2882245B1 (en) | 2005-02-21 | 2007-05-18 | Gen Electric | METHOD FOR DETERMINING THE 3D DISPLACEMENT OF A PATIENT POSITIONED ON A TABLE OF AN IMAGING DEVICE |
| WO2006095324A1 (en) | 2005-03-10 | 2006-09-14 | Koninklijke Philips Electronics N.V. | Image processing system and method for registration of two-dimensional with three-dimensional volume data during interventional procedures |
| EP1869637A1 (en) | 2005-03-31 | 2007-12-26 | Paieon Inc. | Method and apparatus for positioning a device in a tubular organ |
| EP1876988B1 (en) | 2005-04-26 | 2016-06-08 | Koninklijke Philips N.V. | Medical viewing system and method for detecting and enhancing static structures in noisy images using motion of the image acquisition means |
| US7844094B2 (en) | 2005-04-29 | 2010-11-30 | Varian Medical Systems, Inc. | Systems and methods for determining geometric parameters of imaging devices |
| DE102005021068B4 (en) | 2005-05-06 | 2010-09-16 | Siemens Ag | Method for presetting the acquisition parameters when creating two-dimensional transmitted X-ray images |
| DE102005023167B4 (en) | 2005-05-19 | 2008-01-03 | Siemens Ag | Method and device for registering 2D projection images relative to a 3D image data set |
| DE102005023194A1 (en) | 2005-05-19 | 2006-11-23 | Siemens Ag | Method for expanding the display area of 2D image recordings of an object area |
| US7603155B2 (en) | 2005-05-24 | 2009-10-13 | General Electric Company | Method and system of acquiring images with a medical imaging device |
| DE102005030646B4 (en) | 2005-06-30 | 2008-02-07 | Siemens Ag | A method of contour visualization of at least one region of interest in 2D fluoroscopic images |
| DE102005032523B4 (en) | 2005-07-12 | 2009-11-05 | Siemens Ag | Method for the pre-interventional planning of a 2D fluoroscopy projection |
| DE102005032755B4 (en) | 2005-07-13 | 2014-09-04 | Siemens Aktiengesellschaft | System for performing and monitoring minimally invasive procedures |
| DE102005036322A1 (en) | 2005-07-29 | 2007-02-15 | Siemens Ag | Intraoperative registration method for intraoperative image data sets, involves spatial calibration of optical three-dimensional sensor system with intraoperative imaging modality |
| EP1782734B1 (en) | 2005-11-05 | 2018-10-24 | Ziehm Imaging GmbH | Device for improving volume reconstruction |
| US7950849B2 (en) | 2005-11-29 | 2011-05-31 | General Electric Company | Method and device for geometry analysis and calibration of volumetric imaging systems |
| DE102005059804A1 (en) | 2005-12-14 | 2007-07-05 | Siemens Ag | Navigation of inserted medical instrument in a patient, e.g. a catheter, uses initial three dimensional image of the target zone to give a number of two-dimensional images for comparison with fluoroscopic images taken during the operation |
| CN101325912B (en) | 2005-12-15 | 2011-01-12 | 皇家飞利浦电子股份有限公司 | Systems and methods for visualizing cardiac morphology during electrophysiological mapping and therapy |
| DE102006008042A1 (en) | 2006-02-21 | 2007-07-19 | Siemens Ag | Medical device e.g. c-arc x-ray device, for receiving image of area of body of patient, has position determining devices determining position of medical instrument within operation areas, where devices are arranged in device |
| US8526688B2 (en) | 2006-03-09 | 2013-09-03 | General Electric Company | Methods and systems for registration of surgical navigation data and image data |
| DE102006011242B4 (en) | 2006-03-10 | 2012-03-29 | Siemens Ag | Method for reconstructing a 3D representation |
| US8208708B2 (en) | 2006-03-30 | 2012-06-26 | Koninklijke Philips Electronics N.V. | Targeting method, targeting device, computer readable medium and program element |
| US7467007B2 (en) | 2006-05-16 | 2008-12-16 | Siemens Medical Solutions Usa, Inc. | Respiratory gated image fusion of computed tomography 3D images and live fluoroscopy images |
| DE102006024425A1 (en) | 2006-05-24 | 2007-11-29 | Siemens Ag | Medical instrument e.g. catheter, localizing method during electrophysiological procedure, involves obtaining position information of instrument using electromagnetic localization system, and recording two-dimensional X-ray images |
| US9055906B2 (en) | 2006-06-14 | 2015-06-16 | Intuitive Surgical Operations, Inc. | In-vivo visualization systems |
| US11389235B2 (en) | 2006-07-14 | 2022-07-19 | Neuwave Medical, Inc. | Energy delivery systems and uses thereof |
| FR2904750B1 (en) | 2006-08-03 | 2008-10-17 | Gen Electric | METHOD FOR THREE-DIMENSIONAL RECONSTRUCTION OF AN OUTER ENVELOPE OF A BODY OF AN X-RAY APPARATUS |
| DE102006041033B4 (en) | 2006-09-01 | 2017-01-19 | Siemens Healthcare Gmbh | Method for reconstructing a three-dimensional image volume |
| US8248413B2 (en) | 2006-09-18 | 2012-08-21 | Stryker Corporation | Visual navigation system for endoscopic surgery |
| EP2074383B1 (en) | 2006-09-25 | 2016-05-11 | Mazor Robotics Ltd. | C-arm computerized tomography |
| DE102006046735A1 (en) | 2006-09-29 | 2008-04-10 | Siemens Ag | Images e.g. two dimensional-radioscopy image and virtual endoscopy image, fusing device for C-arm device, has image fusion unit for generating image representation from segmented surface with projected image point |
| US7711409B2 (en) | 2006-10-04 | 2010-05-04 | Hampton University | Opposed view and dual head detector apparatus for diagnosis and biopsy with image processing methods |
| US8320992B2 (en) | 2006-10-05 | 2012-11-27 | Visionsense Ltd. | Method and system for superimposing three dimensional medical information on a three dimensional image |
| EP2083691A2 (en) | 2006-11-16 | 2009-08-05 | Koninklijke Philips Electronics N.V. | Computer tomography (ct) c-arm system and method for examination of an object |
| US7831096B2 (en) | 2006-11-17 | 2010-11-09 | General Electric Company | Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use |
| US7671887B2 (en) | 2006-11-20 | 2010-03-02 | General Electric Company | System and method of navigating a medical instrument |
| US8111895B2 (en) | 2006-12-06 | 2012-02-07 | Siemens Medical Solutions Usa, Inc. | Locally adaptive image enhancement for digital subtraction X-ray imaging |
| IL188569A (en) | 2007-01-17 | 2014-05-28 | Mediguide Ltd | Method and system for registering a 3d pre-acquired image coordinate system with a medical positioning system coordinate system and with a 2d image coordinate system |
| US7655004B2 (en) | 2007-02-15 | 2010-02-02 | Ethicon Endo-Surgery, Inc. | Electroporation ablation apparatus, system, and method |
| EP2358269B1 (en) | 2007-03-08 | 2019-04-10 | Sync-RX, Ltd. | Image processing and tool actuation for medical procedures |
| EP2117436A4 (en) * | 2007-03-12 | 2011-03-02 | David Tolkowsky | Devices and methods for performing medical procedures in tree-like luminal structures |
| DE102007013807B4 (en) | 2007-03-22 | 2014-11-13 | Siemens Aktiengesellschaft | Method for assisting the navigation of interventional tools when performing CT- and MRI-guided interventions at a given intervention level |
| US9278203B2 (en) | 2007-03-26 | 2016-03-08 | Covidien Lp | CT-enhanced fluoroscopy |
| US7899226B2 (en) | 2007-04-03 | 2011-03-01 | General Electric Company | System and method of navigating an object in an imaged subject |
| US7853061B2 (en) | 2007-04-26 | 2010-12-14 | General Electric Company | System and method to improve visibility of an object in an imaged subject |
| US8798339B2 (en) | 2007-05-10 | 2014-08-05 | Koninklijke Philips N.V. | Targeting method, targeting device, computer readable medium and program element |
| DE102007026115B4 (en) | 2007-06-05 | 2017-10-12 | Siemens Healthcare Gmbh | Method for generating a 3D reconstruction of a body |
| US7991450B2 (en) | 2007-07-02 | 2011-08-02 | General Electric Company | Methods and systems for volume fusion in diagnostic imaging |
| FR2919096A1 (en) | 2007-07-19 | 2009-01-23 | Gen Electric | METHOD OF CORRECTING RADIOGRAPHIC IMAGE RECOVERY |
| US8335359B2 (en) | 2007-07-20 | 2012-12-18 | General Electric Company | Systems, apparatus and processes for automated medical image segmentation |
| DE102007042333A1 (en) | 2007-09-06 | 2009-03-12 | Siemens Ag | Method for determining a mapping rule and method for generating a 3D reconstruction |
| US8126226B2 (en) * | 2007-09-20 | 2012-02-28 | General Electric Company | System and method to generate a selected visualization of a radiological image of an imaged subject |
| US8271068B2 (en) | 2007-10-02 | 2012-09-18 | Siemens Aktiengesellschaft | Method for dynamic road mapping |
| US8270691B2 (en) | 2007-10-09 | 2012-09-18 | Siemens Aktiengesellschaft | Method for fusing images acquired from a plurality of different image acquiring modalities |
| US8090168B2 (en) | 2007-10-15 | 2012-01-03 | General Electric Company | Method and system for visualizing registered images |
| JP5106978B2 (en) | 2007-10-15 | 2012-12-26 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | X-ray CT system |
| US8195271B2 (en) | 2007-11-06 | 2012-06-05 | Siemens Aktiengesellschaft | Method and system for performing ablation to treat ventricular tachycardia |
| US9001121B2 (en) | 2007-11-15 | 2015-04-07 | The Boeing Company | Method and apparatus for generating data for three-dimensional models from x-rays |
| JP5229865B2 (en) | 2007-11-30 | 2013-07-03 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | X-ray CT system |
| CN101903908A (en) | 2007-12-18 | 2010-12-01 | 皇家飞利浦电子股份有限公司 | Feature-based 2D/3D image registration |
| DE102007061935A1 (en) | 2007-12-21 | 2009-06-25 | Siemens Aktiengesellschaft | Method for improving the quality of computed tomographic image series by image processing and CT system with arithmetic unit |
| US9445772B2 (en) | 2007-12-31 | 2016-09-20 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Reduced radiation fluoroscopic system |
| DE102008003173B4 (en) | 2008-01-04 | 2016-05-19 | Siemens Aktiengesellschaft | Method and device for computed tomography for |
| EP2082686B1 (en) | 2008-01-22 | 2015-01-21 | Brainlab AG | Method to display orientated (overlap) images |
| US20090192385A1 (en) * | 2008-01-25 | 2009-07-30 | Oliver Meissner | Method and system for virtual roadmap imaging |
| US8340379B2 (en) | 2008-03-07 | 2012-12-25 | Inneroptic Technology, Inc. | Systems and methods for displaying guidance data based on updated deformable imaging data |
| DE102008018269A1 (en) | 2008-04-10 | 2009-10-29 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Apparatus and method for rotation-free computed tomography |
| US8532259B2 (en) | 2008-04-17 | 2013-09-10 | University Of Florida Research Foundation, Inc. | Method and apparatus for computed imaging backscatter radiography |
| DE102008020948A1 (en) | 2008-04-25 | 2009-11-26 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | X-ray computer tomograph and method for examining a component by means of X-ray computer tomography |
| US9072905B2 (en) | 2008-05-15 | 2015-07-07 | Intelect Medical, Inc. | Clinician programmer system and method for steering volumes of activation |
| US8218847B2 (en) | 2008-06-06 | 2012-07-10 | Superdimension, Ltd. | Hybrid registration method |
| US20100013812A1 (en) * | 2008-07-18 | 2010-01-21 | Wei Gu | Systems for Controlling Computers and Devices |
| EP2156790B1 (en) | 2008-08-22 | 2012-03-28 | BrainLAB AG | Allocation of x-ray markers to picture markers depicted in an x-ray picture |
| JP2011239796A (en) * | 2008-09-03 | 2011-12-01 | Konica Minolta Medical & Graphic Inc | Medical image processor, and medical image processing method |
| DE102008050844B4 (en) | 2008-10-08 | 2016-04-21 | Siemens Aktiengesellschaft | Device and method for localizing a tissue change for a biopsy |
| JP5739812B2 (en) | 2008-10-10 | 2015-06-24 | コーニンクレッカ フィリップス エヌ ヴェ | Method of operating angiographic image acquisition device, collimator control unit, angiographic image acquisition device, and computer software |
| US8361066B2 (en) | 2009-01-12 | 2013-01-29 | Ethicon Endo-Surgery, Inc. | Electrical ablation devices |
| EP2586374B1 (en) | 2009-01-21 | 2015-03-18 | Koninklijke Philips N.V. | Method and apparatus for large field of view imaging and detection and compensation of motion artifacts |
| US8641621B2 (en) * | 2009-02-17 | 2014-02-04 | Inneroptic Technology, Inc. | Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures |
| US7912180B2 (en) | 2009-02-19 | 2011-03-22 | Kabushiki Kaisha Toshiba | Scattered radiation correction method and scattered radiation correction apparatus |
| US20180009767A9 (en) | 2009-03-19 | 2018-01-11 | The Johns Hopkins University | PSMA targeted fluorescent agents for image guided surgery |
| US10004387B2 (en) | 2009-03-26 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Method and system for assisting an operator in endoscopic navigation |
| EP3427687A1 (en) | 2009-05-14 | 2019-01-16 | Covidien LP | Automatic registration technique |
| US8423117B2 (en) | 2009-06-22 | 2013-04-16 | General Electric Company | System and method to process an acquired image of a subject anatomy to differentiate a portion of subject anatomy to protect relative to a portion to receive treatment |
| US8675996B2 (en) | 2009-07-29 | 2014-03-18 | Siemens Aktiengesellschaft | Catheter RF ablation using segmentation-based 2D-3D registration |
| RU2568321C2 (en) | 2009-08-20 | 2015-11-20 | Конинклейке Филипс Электроникс Н.В. | Reconstruction of region-of-interest image |
| US8666137B2 (en) | 2009-09-07 | 2014-03-04 | Koninklijke Philips N.V. | Apparatus and method for processing projection data |
| US8706184B2 (en) | 2009-10-07 | 2014-04-22 | Intuitive Surgical Operations, Inc. | Methods and apparatus for displaying enhanced imaging data on a clinical image |
| DE102009049818A1 (en) | 2009-10-19 | 2011-04-21 | Siemens Aktiengesellschaft | Method for determining the projection geometry of an X-ray system |
| US8694075B2 (en) | 2009-12-21 | 2014-04-08 | General Electric Company | Intra-operative registration for navigated surgical procedures |
| JP5677738B2 (en) | 2009-12-24 | 2015-02-25 | 株式会社東芝 | X-ray computed tomography system |
| JP5795599B2 (en) | 2010-01-13 | 2015-10-14 | コーニンクレッカ フィリップス エヌ ヴェ | Image integration based registration and navigation for endoscopic surgery |
| JP4956635B2 (en) * | 2010-02-24 | 2012-06-20 | 財団法人仙台市医療センター | Percutaneous puncture support system |
| EP2557998B1 (en) | 2010-04-15 | 2020-12-23 | Koninklijke Philips N.V. | Instrument-based image registration for fusing images with tubular structures |
| US9401047B2 (en) | 2010-04-15 | 2016-07-26 | Siemens Medical Solutions, Usa, Inc. | Enhanced visualization of medical image data |
| JP6153865B2 (en) | 2010-05-03 | 2017-06-28 | Neuwave Medical, Inc. | Energy delivery system |
| DE102010019632A1 (en) | 2010-05-06 | 2011-11-10 | Siemens Aktiengesellschaft | Method for recording and reconstructing a three-dimensional image data set and x-ray device |
| US8625869B2 (en) | 2010-05-21 | 2014-01-07 | Siemens Medical Solutions Usa, Inc. | Visualization of medical image data with localized enhancement |
| JP5899208B2 (en) | 2010-05-27 | 2016-04-06 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Improved reconstruction for cone-beam computed tomography imaging with an eccentric flat panel detector |
| CA2743937A1 (en) | 2010-06-22 | 2011-12-22 | Queen's University At Kingston | C-arm pose estimation using intensity-based registration of imaging modalities |
| US8718346B2 (en) | 2011-10-05 | 2014-05-06 | Saferay Spine Llc | Imaging system and method for use in surgical and interventional medical procedures |
| US8526700B2 (en) | 2010-10-06 | 2013-09-03 | Robert E. Isaacs | Imaging system and method for surgical and interventional medical procedures |
| DE102011005119A1 (en) | 2011-03-04 | 2012-09-06 | Siemens Aktiengesellschaft | A method of providing a 3D image data set of a physiological object having metal objects therein |
| RU2586448C2 (en) | 2011-03-04 | 2016-06-10 | Конинклейке Филипс Н.В. | Combination of two-dimensional/three-dimensional images |
| WO2012131610A1 (en) | 2011-04-01 | 2012-10-04 | Koninklijke Philips Electronics N.V. | X-ray pose recovery |
| DE102011006991B4 (en) | 2011-04-07 | 2018-04-05 | Siemens Healthcare Gmbh | X-ray method and X-ray device for assembling X-ray images and determining three-dimensional volume data |
| US9265468B2 (en) | 2011-05-11 | 2016-02-23 | Broncus Medical, Inc. | Fluoroscopy-based surgical device tracking method |
| US8900131B2 (en) | 2011-05-13 | 2014-12-02 | Intuitive Surgical Operations, Inc. | Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery |
| US8827934B2 (en) | 2011-05-13 | 2014-09-09 | Intuitive Surgical Operations, Inc. | Method and system for determining information of extrema during expansion and contraction cycles of an object |
| JP2012249960A (en) | 2011-06-06 | 2012-12-20 | Toshiba Corp | Medical image processor |
| US10734116B2 (en) * | 2011-10-04 | 2020-08-04 | Quantant Technology, Inc. | Remote cloud based medical image sharing and rendering semi-automated or fully automated network and/or web-based, 3D and/or 4D imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard X-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data |
| US20130303944A1 (en) | 2012-05-14 | 2013-11-14 | Intuitive Surgical Operations, Inc. | Off-axis electromagnetic sensor |
| KR101146833B1 (en) | 2011-12-09 | 2012-05-21 | 전남대학교산학협력단 | The non-rotation ct system |
| US9155470B2 (en) | 2012-01-24 | 2015-10-13 | Siemens Aktiengesellschaft | Method and system for model based fusion on pre-operative computed tomography and intra-operative fluoroscopy using transesophageal echocardiography |
| EP3488803B1 (en) | 2012-02-03 | 2023-09-27 | Intuitive Surgical Operations, Inc. | Steerable flexible needle with embedded shape sensing |
| US9031188B2 (en) | 2012-02-08 | 2015-05-12 | Georgetown Rail Equipment Company | Internal imaging system |
| DE102012204019B4 (en) | 2012-03-14 | 2018-02-08 | Siemens Healthcare Gmbh | Method for reducing motion artifacts |
| CN104302241B (en) | 2012-05-14 | 2018-10-23 | 直观外科手术操作公司 | Registration system and method for medical devices using reduced search space |
| EP3470003B1 (en) | 2012-05-14 | 2024-09-11 | Intuitive Surgical Operations, Inc. | Systems for deformation compensation using shape sensing |
| EP2849668B1 (en) | 2012-05-14 | 2018-11-14 | Intuitive Surgical Operations Inc. | Systems and methods for registration of a medical device using rapid pose search |
| US20130303945A1 (en) | 2012-05-14 | 2013-11-14 | Intuitive Surgical Operations, Inc. | Electromagnetic tip sensor |
| US10039473B2 (en) | 2012-05-14 | 2018-08-07 | Intuitive Surgical Operations, Inc. | Systems and methods for navigation based on ordered sensor records |
| US11399900B2 (en) * | 2012-06-21 | 2022-08-02 | Globus Medical, Inc. | Robotic systems providing co-registration using natural fiducials and related methods |
| US9429696B2 (en) | 2012-06-25 | 2016-08-30 | Intuitive Surgical Operations, Inc. | Systems and methods for reducing measurement error in optical fiber shape sensors |
| US20150227679A1 (en) | 2012-07-12 | 2015-08-13 | Ao Technology Ag | Method for generating a graphical 3d computer model of at least one anatomical structure in a selectable pre-, intra-, or postoperative status |
| US9801551B2 (en) | 2012-07-20 | 2017-10-31 | Intuitive Surgical Operations, Inc. | Annular vision system |
| US20140037049A1 (en) * | 2012-07-31 | 2014-02-06 | General Electric Company | Systems and methods for interventional imaging |
| US9490650B2 (en) * | 2012-08-02 | 2016-11-08 | Sandisk Technologies Llc | Wireless power transfer |
| JP6074587B2 (en) | 2012-08-06 | 2017-02-08 | 株式会社Joled | Display panel, display device and electronic device |
| WO2014028394A1 (en) | 2012-08-14 | 2014-02-20 | Intuitive Surgical Operations, Inc. | Systems and methods for registration of multiple vision systems |
| KR102196291B1 (en) | 2012-10-12 | 2020-12-30 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Determining position of medical device in branched anatomical structure |
| US9001962B2 (en) | 2012-12-20 | 2015-04-07 | Triple Ring Technologies, Inc. | Method and apparatus for multiple X-ray imaging applications |
| KR20140092437A (en) | 2012-12-27 | 2014-07-24 | 삼성전자주식회사 | Medical image system and method for controlling the medical image system |
| US20140188440A1 (en) | 2012-12-31 | 2014-07-03 | Intuitive Surgical Operations, Inc. | Systems And Methods For Interventional Procedure Planning |
| WO2014110169A1 (en) * | 2013-01-08 | 2014-07-17 | Biocardia, Inc. | Target site selection, entry and update with automatic remote image annotation |
| CN108784702B (en) | 2013-03-15 | 2021-11-12 | 直观外科手术操作公司 | Shape sensor system for tracking interventional instruments and method of use |
| JP5902878B1 (en) | 2013-03-15 | 2016-04-13 | メディガイド リミテッド | Medical device guidance system |
| CN114343608B (en) | 2013-07-29 | 2024-07-26 | 直观外科手术操作公司 | Shape sensor system with redundant sensing |
| JP6562919B2 (en) | 2013-08-15 | 2019-08-21 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | System and method for medical treatment confirmation |
| KR102356881B1 (en) | 2013-08-15 | 2022-02-03 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Graphical user interface for catheter positioning and insertion |
| US9763741B2 (en) | 2013-10-24 | 2017-09-19 | Auris Surgical Robotics, Inc. | System for robotic-assisted endolumenal surgery and related methods |
| US9171365B2 (en) | 2013-11-29 | 2015-10-27 | Kabushiki Kaisha Toshiba | Distance driven computation balancing |
| CN105979899B (en) | 2013-12-09 | 2019-10-01 | 直观外科手术操作公司 | System and method for device-aware compliant tool registration |
| WO2015101948A2 (en) | 2014-01-06 | 2015-07-09 | Body Vision Medical Ltd. | Surgical devices and methods of use thereof |
| KR20160118295A (en) | 2014-02-04 | 2016-10-11 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Systems and methods for non-rigid deformation of tissue for virtual navigation of interventional tools |
| US20150223765A1 (en) | 2014-02-07 | 2015-08-13 | Intuitive Surgical Operations, Inc. | Systems and methods for using x-ray field emission to determine instrument position and orientation |
| US9280825B2 (en) * | 2014-03-10 | 2016-03-08 | Sony Corporation | Image processing system with registration mechanism and method of operation thereof |
| KR102372763B1 (en) | 2014-03-17 | 2022-03-10 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Surgical system including a non-white light general illuminator |
| US10912523B2 (en) | 2014-03-24 | 2021-02-09 | Intuitive Surgical Operations, Inc. | Systems and methods for anatomic motion compensation |
| CN106535812B (en) | 2014-03-28 | 2020-01-21 | 直观外科手术操作公司 | Surgical system with haptic feedback based on quantitative three-dimensional imaging |
| JP6707036B2 (en) | 2014-07-02 | 2020-06-10 | コヴィディエン リミテッド パートナーシップ | alignment |
| EP3169250B1 (en) | 2014-07-18 | 2019-08-28 | Ethicon, Inc. | Devices for controlling the size of emphysematous bullae |
| EP3169247B1 (en) | 2014-07-18 | 2020-05-13 | Ethicon, Inc. | Mechanical retraction via tethering for lung volume reduction |
| CN106714724B (en) | 2014-07-28 | 2019-10-25 | 直观外科手术操作公司 | System and method for planning multiple interventional procedures |
| JP6722652B2 (en) | 2014-07-28 | 2020-07-15 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | System and method for intraoperative segmentation |
| US9754372B2 (en) * | 2014-08-15 | 2017-09-05 | Biosense Webster (Israel) Ltd. | Marking of fluoroscope field-of-view |
| WO2016032846A1 (en) | 2014-08-23 | 2016-03-03 | Intuitive Surgical Operations, Inc. | Systems and methods for display of pathological data in an image guided procedure |
| US10373719B2 (en) | 2014-09-10 | 2019-08-06 | Intuitive Surgical Operations, Inc. | Systems and methods for pre-operative modeling |
| US10314513B2 (en) | 2014-10-10 | 2019-06-11 | Intuitive Surgical Operations, Inc. | Systems and methods for reducing measurement error using optical fiber shape sensors |
| US9721379B2 (en) * | 2014-10-14 | 2017-08-01 | Biosense Webster (Israel) Ltd. | Real-time simulation of fluoroscopic images |
| CN110811488B (en) | 2014-10-17 | 2023-07-14 | 直观外科手术操作公司 | System and method for reducing measurement errors using fiber optic shape sensors |
| AU2015338824A1 (en) | 2014-10-20 | 2017-04-27 | Body Vision Medical Ltd. | Surgical devices and methods of use thereof |
| KR102328266B1 (en) * | 2014-10-29 | 2021-11-19 | 삼성전자주식회사 | Image processing apparatus and image processing method, and ultrasound apparatus |
| US9974525B2 (en) * | 2014-10-31 | 2018-05-22 | Covidien Lp | Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same |
| EP3217911B1 (en) | 2014-11-13 | 2023-01-04 | Intuitive Surgical Operations, Inc. | Systems for filtering localization data |
| WO2016106114A1 (en) | 2014-12-22 | 2016-06-30 | Intuitive Surgical Operations, Inc. | Flexible electromagnetic sensor |
| US10163204B2 (en) * | 2015-02-13 | 2018-12-25 | St. Jude Medical International Holding S.À R.L. | Tracking-based 3D model enhancement |
| KR102542190B1 (en) | 2015-04-06 | 2023-06-12 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | System and method of registration compensation in image-guided surgery |
| EP3297529B1 (en) | 2015-05-22 | 2025-03-12 | Intuitive Surgical Operations, Inc. | Systems for registration for image guided surgery |
| US10674982B2 (en) * | 2015-08-06 | 2020-06-09 | Covidien Lp | System and method for local three dimensional volume reconstruction using a standard fluoroscope |
| CN108024698B (en) | 2015-08-14 | 2020-09-15 | 直观外科手术操作公司 | Registration system and method for image-guided surgery |
| CN108024699B (en) | 2015-08-14 | 2020-11-03 | 直观外科手术操作公司 | Registration system and method for image-guided surgery |
| US10245034B2 (en) | 2015-08-31 | 2019-04-02 | Ethicon Llc | Inducing tissue adhesions using surgical adjuncts and medicants |
| US10569071B2 (en) | 2015-08-31 | 2020-02-25 | Ethicon Llc | Medicant eluting adjuncts and methods of using medicant eluting adjuncts |
| WO2017044874A1 (en) | 2015-09-10 | 2017-03-16 | Intuitive Surgical Operations, Inc. | Systems and methods for using tracking in image-guided medical procedure |
| KR102661990B1 (en) | 2015-09-18 | 2024-05-02 | 아우리스 헬스, 인크. | Exploration of tubular networks |
| KR102601297B1 (en) | 2015-10-26 | 2023-11-14 | 뉴웨이브 메디컬, 인코포레이티드 | Energy transfer systems and their uses |
| US10405753B2 (en) | 2015-11-10 | 2019-09-10 | Intuitive Surgical Operations, Inc. | Pharmaceutical compositions of near IR closed chain, sulfo-cyanine dyes |
| WO2017106003A1 (en) | 2015-12-14 | 2017-06-22 | Intuitive Surgical Operations, Inc. | Apparatus and method for generating 3-d data for an anatomical target using optical fiber shape sensing |
| US9996361B2 (en) | 2015-12-23 | 2018-06-12 | Intel Corporation | Byte and nibble sort instructions that produce sorted destination register and destination index mapping |
| CN108024838B (en) | 2016-02-12 | 2021-10-01 | 直观外科手术操作公司 | System and method for using registered fluoroscopic images in image-guided surgery |
| JP6828047B2 (en) | 2016-02-12 | 2021-02-10 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Posture estimation and calibration system and method for fluoroscopic imaging system in image-guided surgery |
| JP6976266B2 (en) | 2016-03-10 | 2021-12-08 | ボディ・ビジョン・メディカル・リミテッドBody Vision Medical Ltd. | Methods and systems for using multi-view pose estimation |
| US10702137B2 (en) | 2016-03-14 | 2020-07-07 | Intuitive Surgical Operations, Inc. | Endoscopic instrument with compliant thermal interface |
| US20170296679A1 (en) | 2016-04-18 | 2017-10-19 | Intuitive Surgical Operations, Inc. | Compositions of Near IR Closed Chain, Sulfo-Cyanine Dyes and Prostate Specific Membrane Antigen Ligands |
| CN105968157B (en) * | 2016-04-29 | 2019-04-09 | 武汉大学 | An aptamer probe with photoactivation properties and a method for detecting cancer sites |
| US11266387B2 (en) | 2016-06-15 | 2022-03-08 | Intuitive Surgical Operations, Inc. | Systems and methods of integrated real-time visualization |
| US11819284B2 (en) | 2016-06-30 | 2023-11-21 | Intuitive Surgical Operations, Inc. | Graphical user interface for displaying guidance information during an image-guided procedure |
| CN114027987B (en) | 2016-06-30 | 2024-09-13 | 直观外科手术操作公司 | A graphical user interface that displays guidance information in multiple modes during image-guided procedures |
| US11324393B2 (en) | 2016-08-16 | 2022-05-10 | Intuitive Surgical Operations, Inc. | Augmented accuracy using large diameter shape fiber |
| WO2018038999A1 (en) | 2016-08-23 | 2018-03-01 | Intuitive Surgical Operations, Inc. | Systems and methods for monitoring patient motion during a medical procedure |
| KR102401263B1 (en) | 2016-09-21 | 2022-05-24 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Systems and methods for instrument buckling detection |
| US11219490B2 (en) | 2016-09-30 | 2022-01-11 | Intuitive Surgical Operations, Inc. | Systems and methods for entry point localization |
| WO2018075911A1 (en) | 2016-10-21 | 2018-04-26 | Intuitive Surgical Operations, Inc. | Shape sensing with multi-core fiber sensor |
| JP7229155B2 (en) | 2016-11-02 | 2023-02-27 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Sequential registration system and method for image-guided surgery |
| US20180144092A1 (en) | 2016-11-21 | 2018-05-24 | Johnson & Johnson Vision Care, Inc. | Biomedical sensing methods and apparatus for the detection and prevention of lung cancer states |
| US11547490B2 (en) | 2016-12-08 | 2023-01-10 | Intuitive Surgical Operations, Inc. | Systems and methods for navigation in image-guided medical procedures |
| EP4104746B1 (en) | 2016-12-09 | 2023-12-06 | Intuitive Surgical Operations, Inc. | System and method for distributed heat flux sensing of body tissue |
| WO2018129532A1 (en) | 2017-01-09 | 2018-07-12 | Intuitive Surgical Operations, Inc. | Systems and methods for registering elongate devices to three dimensional images in image-guided procedures |
| WO2018144726A1 (en) | 2017-02-01 | 2018-08-09 | Intuitive Surgical Operations, Inc. | Systems and methods for data filtering of passageway sensor data |
| WO2018144698A1 (en) | 2017-02-01 | 2018-08-09 | Intuitive Surgical Operations, Inc. | Systems and methods of registration for image-guided procedures |
| EP3576598B1 (en) | 2017-02-01 | 2024-04-24 | Intuitive Surgical Operations, Inc. | System of registration for image-guided procedures |
| JP7282685B2 (en) | 2017-03-31 | 2023-05-29 | オーリス ヘルス インコーポレイテッド | A robotic system for navigation of luminal networks with compensation for physiological noise |
| WO2018195216A1 (en) | 2017-04-18 | 2018-10-25 | Intuitive Surgical Operations, Inc. | Graphical user interface for monitoring an image-guided procedure |
| CN119069085A (en) | 2017-04-18 | 2024-12-03 | 直观外科手术操作公司 | Graphical user interface for planning programs |
| CN111246791A (en) | 2017-05-24 | 2020-06-05 | 博迪维仁医疗有限公司 | Method for three-dimensional reconstruction of images and improved target localization using a radial endobronchial ultrasound probe |
| US20180360342A1 (en) * | 2017-06-16 | 2018-12-20 | Biosense Webster (Israel) Ltd. | Renal ablation and visualization system and method with composite anatomical display image |
| US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
| EP3641686B1 (en) | 2017-06-23 | 2024-12-04 | Intuitive Surgical Operations, Inc. | System for navigating to a target location during a medical procedure |
| CN118121324A (en) | 2017-06-28 | 2024-06-04 | 奥瑞斯健康公司 | System for detecting electromagnetic distortion |
| EP3644885B1 (en) | 2017-06-28 | 2023-10-11 | Auris Health, Inc. | Electromagnetic field generator alignment |
| EP3668582B1 (en) | 2017-08-16 | 2023-10-04 | Intuitive Surgical Operations, Inc. | Systems for monitoring patient motion during a medical procedure |
| US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
| US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
| CN111386086A (en) | 2017-11-14 | 2020-07-07 | 直观外科手术操作公司 | System and method for cleaning endoscopic instruments |
| EP3684281B1 (en) | 2017-12-08 | 2025-03-12 | Auris Health, Inc. | System for medical instrument navigation and targeting |
| WO2019113389A1 (en) | 2017-12-08 | 2019-06-13 | Auris Health, Inc. | Directed fluidics |
| CN110869173B (en) | 2017-12-14 | 2023-11-17 | 奥瑞斯健康公司 | System and method for estimating instrument positioning |
| EP3684283A4 (en) | 2017-12-18 | 2021-07-14 | Auris Health, Inc. | METHODS AND SYSTEMS FOR MONITORING AND NAVIGATION OF INSTRUMENTS IN LUMINAL NETWORKS |
| EP3749239B1 (en) | 2018-02-05 | 2024-08-07 | Broncus Medical Inc. | Image-guided lung tumor planning and ablation system |
| US10885630B2 (en) | 2018-03-01 | 2021-01-05 | Intuitive Surgical Operations, Inc | Systems and methods for segmentation of anatomical structures for image-guided surgery |
| US10980913B2 (en) | 2018-03-05 | 2021-04-20 | Ethicon Llc | Sealant foam compositions for lung applications |
| US20190269819A1 (en) | 2018-03-05 | 2019-09-05 | Ethicon Llc | Sealant foam compositions for lung applications |
| US20190298451A1 (en) | 2018-03-27 | 2019-10-03 | Intuitive Surgical Operations, Inc. | Systems and methods for delivering targeted therapy |
| MX2020010112A (en) | 2018-03-28 | 2020-11-06 | Auris Health Inc | Systems and methods for registration of location sensors. |
| CN110913791B (en) | 2018-03-28 | 2021-10-08 | 奥瑞斯健康公司 | System and method for displaying estimated instrument positioning |
| US10905499B2 (en) | 2018-05-30 | 2021-02-02 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
| WO2019232236A1 (en) | 2018-05-31 | 2019-12-05 | Auris Health, Inc. | Image-based airway analysis and mapping |
| WO2019231891A1 (en) | 2018-05-31 | 2019-12-05 | Auris Health, Inc. | Path-based navigation of tubular networks |
| JP7512253B2 (en) | 2018-08-01 | 2024-07-08 | ソニー・インタラクティブエンタテインメント エルエルシー | Method for controlling a character in a virtual environment, computer-readable medium storing a computer program, and computer system |
| US11080902B2 (en) | 2018-08-03 | 2021-08-03 | Intuitive Surgical Operations, Inc. | Systems and methods for generating anatomical tree structures |
| CN113164149A (en) | 2018-08-13 | 2021-07-23 | 博迪维仁医疗有限公司 | Method and system for multi-view pose estimation using digital computer tomography |
| CN112566584A (en) | 2018-08-15 | 2021-03-26 | 奥瑞斯健康公司 | Medical instrument for tissue cauterization |
| US10639114B2 (en) | 2018-08-17 | 2020-05-05 | Auris Health, Inc. | Bipolar medical instrument |
| US11896316B2 (en) | 2018-08-23 | 2024-02-13 | Intuitive Surgical Operations, Inc. | Systems and methods for generating anatomic tree structures using backward pathway growth |
| US11737823B2 (en) | 2018-10-31 | 2023-08-29 | Intuitive Surgical Operations, Inc. | Antenna systems and methods of use |
| US11637378B2 (en) | 2018-11-02 | 2023-04-25 | Intuitive Surgical Operations, Inc. | Coiled dipole antenna |
| US20200138514A1 (en) | 2018-11-02 | 2020-05-07 | Intuitive Surgical Operations, Inc. | Tissue penetrating device tips |
| US11280863B2 (en) | 2018-11-02 | 2022-03-22 | Intuitive Surgical Operations, Inc. | Coiled antenna with fluid cooling |
| US11730537B2 (en) | 2018-11-13 | 2023-08-22 | Intuitive Surgical Operations, Inc. | Cooled chokes for ablation systems and methods of use |
| US11633623B2 (en) | 2019-04-19 | 2023-04-25 | University Of Maryland, Baltimore | System and method for radiation therapy using spatial-functional mapping and dose sensitivity of branching structures and functional sub-volumes |
2018
- 2018-06-28 US US16/022,222 published as US10699448B2 (status: Active)
- 2018-06-29 CN CN202311308688.5A published as CN117252948A (status: Pending)
- 2018-06-29 AU AU2018290995A published as AU2018290995B2 (status: Ceased)
- 2018-06-29 CN CN201880038911.XA published as CN110741414B (status: Active)
- 2018-06-29 JP JP2019566871A published as JP7277386B2 (status: Active)
- 2018-06-29 WO PCT/US2018/040222 published as WO2019006258A1 (status: Ceased)
- 2018-06-29 EP EP18823602.0A published as EP3646289A4 (status: Pending)

2020
- 2020-05-27 US US16/885,188 published as US10846893B2 (status: Active)
- 2020-11-04 US US17/089,151 published as US11341692B2 (status: Active)

2022
- 2022-02-28 US US17/683,115 published as US20220189082A1 (status: Abandoned)
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140046175A1 (en) * | 2012-08-07 | 2014-02-13 | Covidien Lp | Microwave ablation catheter and method of utilizing the same |
| US20170035380A1 (en) * | 2015-08-06 | 2017-02-09 | Covidien Lp | System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction |
Non-Patent Citations (1)
| Title |
|---|
| SUNTHAROS, P. et al., "Real-time three dimensional CT and MRI to guide interventions for congenital heart disease and acquired pulmonary vein stenosis", The International Journal of Cardiovascular Imaging, vol. 33, no. 10 (28 April 2017), pp. 1619-1626 * |
Also Published As
| Publication number | Publication date |
|---|---|
| US20210049796A1 (en) | 2021-02-18 |
| EP3646289A1 (en) | 2020-05-06 |
| JP7277386B2 (en) | 2023-05-18 |
| JP2020526241A (en) | 2020-08-31 |
| US20220189082A1 (en) | 2022-06-16 |
| AU2018290995A1 (en) | 2019-11-28 |
| US20200286267A1 (en) | 2020-09-10 |
| US10846893B2 (en) | 2020-11-24 |
| US20190005687A1 (en) | 2019-01-03 |
| US11341692B2 (en) | 2022-05-24 |
| US10699448B2 (en) | 2020-06-30 |
| CN110741414B (en) | 2023-10-03 |
| CN110741414A (en) | 2020-01-31 |
| EP3646289A4 (en) | 2021-04-07 |
| CN117252948A (en) | 2023-12-19 |
| WO2019006258A1 (en) | 2019-01-03 |
Similar Documents
| Publication | Title |
|---|---|
| US11341692B2 (en) | System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data |
| US11547377B2 (en) | System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction |
| US20230172670A1 (en) | Systems and methods for visualizing navigation of medical devices relative to targets |
| US12059281B2 (en) | Systems and methods of fluoro-CT imaging for initial registration |
| US12064280B2 (en) | System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction |
| EP3689244B1 (en) | Method for displaying tumor location within endoscopic images |
| CN115843232A (en) | Zoom detection and fluoroscopic movement detection for target coverage |
| EP4434483A1 (en) | Systems and methods for active tracking of electromagnetic navigation bronchoscopy tools with single guide sheaths |
| WO2024079639A1 (en) | Systems and methods for confirming position or orientation of medical device relative to target |
| EP4601574A1 (en) | Systems and methods of moving a medical tool with a target in a visualization or robotic system for higher yields |
Legal Events
| Code | Title |
|---|---|
| FGA | Letters patent sealed or granted (standard patent) |
| MK14 | Patent ceased section 143(a) (annual fees not paid) or expired |